spark-sql job fails with java.io.IOException: org.apache.parquet.io.ParquetDecodingException

Copyright notice: this is an original post by the author, licensed under CC 4.0 BY-SA. Please include the original link and this notice when reposting.
Original link: https://blog.csdn.net/sinat_30316741/article/details/88574242

Error message:

Failed with exception java.io.IOException:org.apache.parquet.io.ParquetDecodingException: Can not read value at 0 in block -1 in file oss:/xxxxxxxxxx.snappy.parquet

Fix:

Add the following line before running the spark-sql job:

spark.conf.set("spark.sql.parquet.writeLegacyFormat","true")

With this setting in place, the job runs successfully.
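For background, this ParquetDecodingException usually indicates that the Parquet files being read were written with a different physical layout than the reader expects; decimal columns are a common trigger, because the legacy layout used by Hive, Impala, and older Spark encodes them differently from newer Spark. Setting spark.sql.parquet.writeLegacyFormat=true makes Spark write Parquet in the legacy, Hive/Impala-compatible layout, so newly written files match what downstream readers (and any older files in the same table) expect. As a minimal sketch, here are equivalent ways to apply the setting when using the spark-sql CLI rather than spark.conf.set; the file name my_query.sql and the table names are hypothetical:

# pass the option when submitting a script through the spark-sql CLI
spark-sql --conf spark.sql.parquet.writeLegacyFormat=true -f my_query.sql

-- or set it inside an interactive spark-sql session, before the statement that writes the table
SET spark.sql.parquet.writeLegacyFormat=true;
INSERT OVERWRITE TABLE target_table SELECT * FROM source_table;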
