Running a MapReduce program from Eclipse on Win7 fails with "WordCount$TokenizerMapper not found"

1. Error Description

    In my Eclipse project I had placed the CDH cluster's core-site.xml, mapred-site.xml, and hdfs-site.xml under the src directory. With those files on the classpath, the program picks up the server-side configuration at run time and submits the job to the cluster, which means the job's class files would have to be uploaded to the cluster (as a job jar) for it to run. Code launched from Eclipse, however, only exists locally, so the task JVMs on the cluster cannot find the classes and fail with the error below. The fix follows in Section 2.

log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.DailyRollingFileAppender.
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/F:/CDH4/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/F:/workspace1/recommendation/lib/mahout-examples-0.7-cdh4.1.2-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
[15:31:52,984][ WARN][main][org.apache.hadoop.mapred.JobClient:704] - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
[15:31:53,028][ WARN][main][org.apache.hadoop.mapred.JobClient:830] - No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
[15:31:53,049][ INFO][main][org.apache.hadoop.mapreduce.lib.input.FileInputFormat:233] - Total input paths to process : 1
[15:31:53,094][ WARN][main][org.apache.hadoop.util.NativeCodeLoader:62] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[15:32:02,053][ INFO][main][org.apache.hadoop.mapred.JobClient:1386] - Running job: job_201407171020_0067
[15:32:03,056][ INFO][main][org.apache.hadoop.mapred.JobClient:1399] -  map 0% reduce 0%
[15:32:13,082][ INFO][main][org.apache.hadoop.mapred.JobClient:1428] - Task Id : attempt_201407171020_0067_m_000000_0, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.panguoyuan.mapreduce.WordCount$TokenizerMapper not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1571)
	at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:191)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:605)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.ClassNotFoundException: Class com.panguoyuan.mapreduce.WordCount$TokenizerMapper not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1477)
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1569)
	... 8 more
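The WARN at 15:31:53 ("No job jar file set. User classes may not be found.") points at the root cause: the job was submitted to the remote JobTracker without a job jar, so the task JVMs could not load `WordCount$TokenizerMapper`. If you do want to submit to the cluster from Eclipse, one option is to export the project as a jar first and point the job at it explicitly. The sketch below follows the package name seen in the stack trace; the jar path is an assumption, and the mapper/reducer bodies are omitted (they are the stock WordCount example). It requires the Hadoop client libraries (CDH4/MR1 era) on the classpath.

```java
package com.panguoyuan.mapreduce;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the job at a pre-built jar so the task JVMs on the cluster
        // can load TokenizerMapper. The path is a placeholder -- export the
        // project as a jar from Eclipse before running.
        conf.set("mapred.jar", "F:/workspace1/wordcount.jar");

        Job job = new Job(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);  // inner class from the error
        job.setReducerClass(IntSumReducer.class);   // assumed, as in the stock example
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Note that `setJarByClass` alone does not help here: in Eclipse the classes are loose .class files in the bin directory, not inside a jar, so it has nothing to resolve; setting `mapred.jar` to an exported jar bypasses that.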

2. Solution

      Copy the three configuration files core-site.xml, mapred-site.xml, and hdfs-site.xml from src to a backup directory, then remove them from the Eclipse project. The job then runs in LocalJobRunner mode. After deleting the three files, the error disappears and the MapReduce job runs normally.
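Instead of deleting the files, the same effect can be had in code by overriding the cluster settings after the XMLs on the classpath have been loaded. A minimal sketch, using the MR1/CDH4-era property names:

```java
import org.apache.hadoop.conf.Configuration;

// Override whatever core-site.xml / mapred-site.xml put on the classpath,
// forcing the LocalJobRunner and the local filesystem for this run only.
Configuration conf = new Configuration();
conf.set("mapred.job.tracker", "local");
conf.set("fs.default.name", "file:///");
```

This keeps the cluster configs in the project for runs that do need them, at the cost of a code change when switching modes.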

log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.DailyRollingFileAppender.
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/F:/CDH4/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/F:/workspace1/recommendation/lib/mahout-examples-0.7-cdh4.1.2-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
[15:33:03,707][ WARN][main][org.apache.hadoop.conf.Configuration:808] - session.id is deprecated. Instead, use dfs.metrics.session-id
[15:33:03,710][ INFO][main][org.apache.hadoop.metrics.jvm.JvmMetrics:76] - Initializing JVM Metrics with processName=JobTracker, sessionId=
[15:33:03,759][ WARN][main][org.apache.hadoop.util.NativeCodeLoader:62] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[15:33:03,910][ WARN][main][org.apache.hadoop.mapred.JobClient:704] - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
[15:33:04,032][ WARN][main][org.apache.hadoop.mapred.JobClient:830] - No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
[15:33:04,138][ INFO][main][org.apache.hadoop.mapreduce.lib.input.FileInputFormat:233] - Total input paths to process : 1
[15:33:05,214][ INFO][Thread-18][org.apache.hadoop.mapred.LocalJobRunner:192] - OutputCommitter set in config null
[15:33:05,215][ INFO][main][org.apache.hadoop.mapred.JobClient:1386] - Running job: job_local_0001
[15:33:05,220][ INFO][Thread-18][org.apache.hadoop.mapred.LocalJobRunner:210] - OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
[15:33:05,290][ WARN][Thread-18][mapreduce.Counters:224] - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
[15:33:05,308][ INFO][Thread-18][org.apache.hadoop.mapred.Task:533] -  Using ResourceCalculatorPlugin : null
[15:33:05,321][ INFO][Thread-18][org.apache.hadoop.mapred.MapTask:792] - io.sort.mb = 100
[15:33:05,375][ INFO][Thread-18][org.apache.hadoop.mapred.MapTask:804] - data buffer = 79691776/99614720
[15:33:05,375][ INFO][Thread-18][org.apache.hadoop.mapred.MapTask:805] - record buffer = 262144/327680
[15:33:05,504][ INFO][Thread-18][org.apache.hadoop.mapred.LocalJobRunner:370] - 
[15:33:05,526][ INFO][Thread-18][org.apache.hadoop.mapred.MapTask:1131] - Starting flush of map output
[15:33:05,601][ INFO][Thread-18][org.apache.hadoop.mapred.MapTask:1311] - Finished spill 0
[15:33:05,607][ INFO][Thread-18][org.apache.hadoop.mapred.Task:846] - Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
[15:33:05,662][ INFO][Thread-18][org.apache.hadoop.mapred.LocalJobRunner:370] - 
[15:33:05,664][ INFO][Thread-18][org.apache.hadoop.mapred.Task:958] - Task 'attempt_local_0001_m_000000_0' done.
[15:33:05,668][ WARN][Thread-18][mapreduce.Counters:224] - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
[15:33:05,683][ INFO][Thread-18][org.apache.hadoop.mapred.Task:533] -  Using ResourceCalculatorPlugin : null
[15:33:05,684][ INFO][Thread-18][org.apache.hadoop.mapred.LocalJobRunner:370] - 
[15:33:05,689][ INFO][Thread-18][org.apache.hadoop.mapred.Merger:390] - Merging 1 sorted segments
[15:33:05,703][ INFO][Thread-18][org.apache.hadoop.mapred.Merger:473] - Down to the last merge-pass, with 1 segments left of total size: 1546 bytes
[15:33:05,703][ INFO][Thread-18][org.apache.hadoop.mapred.LocalJobRunner:370] - 
[15:33:05,789][ INFO][Thread-18][org.apache.hadoop.mapred.Task:846] - Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
[15:33:05,790][ INFO][Thread-18][org.apache.hadoop.mapred.LocalJobRunner:370] - 
[15:33:05,792][ INFO][Thread-18][org.apache.hadoop.mapred.Task:999] - Task attempt_local_0001_r_000000_0 is allowed to commit now
[15:33:05,814][ INFO][Thread-18][org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter:173] - Saved output of task 'attempt_local_0001_r_000000_0' to hdfs://10.71.197.94:8020/op
[15:33:05,814][ INFO][Thread-18][org.apache.hadoop.mapred.LocalJobRunner:370] - reduce > reduce
[15:33:05,816][ INFO][Thread-18][org.apache.hadoop.mapred.Task:958] - Task 'attempt_local_0001_r_000000_0' done.
[15:33:06,218][ INFO][main][org.apache.hadoop.mapred.JobClient:1399] -  map 100% reduce 100%
[15:33:06,219][ INFO][main][org.apache.hadoop.mapred.JobClient:1454] - Job complete: job_local_0001
[15:33:06,220][ INFO][main][org.apache.hadoop.mapred.JobClient:566] - Counters: 22
[15:33:06,220][ INFO][main][org.apache.hadoop.mapred.JobClient:568] -   File System Counters
[15:33:06,220][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     FILE: Number of bytes read=1898
[15:33:06,220][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     FILE: Number of bytes written=172868
[15:33:06,220][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     FILE: Number of read operations=0
[15:33:06,220][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     FILE: Number of large read operations=0
[15:33:06,221][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     FILE: Number of write operations=0
[15:33:06,221][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     HDFS: Number of bytes read=2188
[15:33:06,221][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     HDFS: Number of bytes written=1244
[15:33:06,221][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     HDFS: Number of read operations=11
[15:33:06,221][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     HDFS: Number of large read operations=0
[15:33:06,221][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     HDFS: Number of write operations=3
[15:33:06,221][ INFO][main][org.apache.hadoop.mapred.JobClient:568] -   Map-Reduce Framework
[15:33:06,222][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Map input records=75
[15:33:06,222][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Map output records=75
[15:33:06,222][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Map output bytes=1394
[15:33:06,222][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Input split bytes=101
[15:33:06,222][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Combine input records=75
[15:33:06,222][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Combine output records=75
[15:33:06,222][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Reduce input groups=75
[15:33:06,223][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Reduce shuffle bytes=0
[15:33:06,223][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Reduce input records=75
[15:33:06,223][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Reduce output records=75
[15:33:06,223][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Spilled Records=150
[15:33:06,223][ INFO][main][org.apache.hadoop.mapred.JobClient:570] -     Total committed heap usage (bytes)=1065484288
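For a genuine cluster run rather than local mode, the usual route is to package the classes into a jar and submit it with the hadoop CLI on a machine with the cluster configuration. A sketch, where the jar name and the HDFS input path are assumptions (the output path /op matches the one in the log above):

```shell
# Export the project as a jar from Eclipse (or with the jar tool), then:
hadoop jar wordcount.jar com.panguoyuan.mapreduce.WordCount \
    hdfs://10.71.197.94:8020/input hdfs://10.71.197.94:8020/op
```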



