hadoop 0.20.2: java.io.IOException: Error opening job jar: hadoop-0.20.2-examples.jar

Today, while running the example tests bundled with Hadoop:

$ bin/hadoop jar hadoop-0.20.2-examples.jar wordcount input output

Exception in thread "main" java.io.IOException: Error opening job jar: hadoop-0.20.2-examples.jar

at org.apache.hadoop.util.RunJar.main(RunJar.java:90)

Caused by: java.util.zip.ZipException: error in opening zip file

at java.util.zip.ZipFile.open(Native Method)

at java.util.zip.ZipFile.&lt;init&gt;(ZipFile.java:114)

at java.util.jar.JarFile.&lt;init&gt;(JarFile.java:135)

at java.util.jar.JarFile.&lt;init&gt;(JarFile.java:72)

at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
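
A ZipException at this point generally means the jar the command points at is missing, has a different name than expected, or is corrupt. Before changing anything else it is worth confirming which examples jar actually ships with this install and that it can be read as an archive; the commands below are only a quick sanity check (the 0.20.2 jar name matches the error message above and may differ on your installation):

# list the examples jar bundled with this Hadoop release
$ ls hadoop-*-examples.jar

# verify the jar is a readable archive; a corrupt copy will fail here as well
$ jar tf hadoop-0.20.2-examples.jar | head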

Solution: re-create the input directory in HDFS and run the examples jar again:

bin/hadoop dfs -rmr input

bin/hadoop dfs -put conf input

$ bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+'

11/01/18 19:21:40 INFO mapred.FileInputFormat: Total input paths to process : 13

11/01/18 19:21:40 INFO mapred.JobClient: Running job: job_201101181725_0011

11/01/18 19:21:41 INFO mapred.JobClient:  map 0% reduce 0%

11/01/18 19:21:49 INFO mapred.JobClient:  map 15% reduce 0%

11/01/18 19:21:50 INFO mapred.JobClient:  map 30% reduce 0%

11/01/18 19:21:52 INFO mapred.JobClient:  map 46% reduce 0%

11/01/18 19:21:53 INFO mapred.JobClient:  map 61% reduce 0%

11/01/18 19:21:55 INFO mapred.JobClient:  map 76% reduce 0%

11/01/18 19:21:56 INFO mapred.JobClient:  map 92% reduce 0%

11/01/18 19:21:58 INFO mapred.JobClient:  map 100% reduce 30%

11/01/18 19:22:07 INFO mapred.JobClient:  map 100% reduce 100%

11/01/18 19:22:09 INFO mapred.JobClient: Job complete: job_201101181725_0011

11/01/18 19:22:09 INFO mapred.JobClient: Counters: 18

11/01/18 19:22:09 INFO mapred.JobClient:   Job Counters

11/01/18 19:22:09 INFO mapred.JobClient:     Launched reduce tasks=1

11/01/18 19:22:09 INFO mapred.JobClient:     Launched map tasks=13

11/01/18 19:22:09 INFO mapred.JobClient:     Data-local map tasks=13

11/01/18 19:22:09 INFO mapred.JobClient:   FileSystemCounters

11/01/18 19:22:09 INFO mapred.JobClient:     FILE_BYTES_READ=132

11/01/18 19:22:09 INFO mapred.JobClient:     HDFS_BYTES_READ=17950

11/01/18 19:22:09 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=752

11/01/18 19:22:09 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=248

11/01/18 19:22:09 INFO mapred.JobClient:   Map-Reduce Framework

11/01/18 19:22:09 INFO mapred.JobClient:     Reduce input groups=6

11/01/18 19:22:09 INFO mapred.JobClient:     Combine output records=6

11/01/18 19:22:09 INFO mapred.JobClient:     Map input records=545

11/01/18 19:22:09 INFO mapred.JobClient:     Reduce shuffle bytes=198

11/01/18 19:22:09 INFO mapred.JobClient:     Reduce output records=6

11/01/18 19:22:09 INFO mapred.JobClient:     Spilled Records=12

11/01/18 19:22:09 INFO mapred.JobClient:     Map output bytes=169

11/01/18 19:22:09 INFO mapred.JobClient:     Map input bytes=17950

11/01/18 19:22:09 INFO mapred.JobClient:     Combine input records=9

11/01/18 19:22:09 INFO mapred.JobClient:     Map output records=9

11/01/18 19:22:09 INFO mapred.JobClient:     Reduce input records=6

11/01/18 19:22:09 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.

11/01/18 19:22:09 INFO mapred.FileInputFormat: Total input paths to process : 1

11/01/18 19:22:10 INFO mapred.JobClient: Running job: job_201101181725_0012

11/01/18 19:22:11 INFO mapred.JobClient:  map 0% reduce 0%

11/01/18 19:22:20 INFO mapred.JobClient:  map 100% reduce 0%

11/01/18 19:22:32 INFO mapred.JobClient:  map 100% reduce 100%

11/01/18 19:22:34 INFO mapred.JobClient: Job complete: job_201101181725_0012

11/01/18 19:22:34 INFO mapred.JobClient: Counters: 18

11/01/18 19:22:34 INFO mapred.JobClient:   Job Counters

11/01/18 19:22:34 INFO mapred.JobClient:     Launched reduce tasks=1

11/01/18 19:22:34 INFO mapred.JobClient:     Launched map tasks=1

11/01/18 19:22:34 INFO mapred.JobClient:     Data-local map tasks=1

11/01/18 19:22:34 INFO mapred.JobClient:   FileSystemCounters

11/01/18 19:22:34 INFO mapred.JobClient:     FILE_BYTES_READ=132

11/01/18 19:22:34 INFO mapred.JobClient:     HDFS_BYTES_READ=248

11/01/18 19:22:34 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=296

11/01/18 19:22:34 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=78

11/01/18 19:22:34 INFO mapred.JobClient:   Map-Reduce Framework

11/01/18 19:22:34 INFO mapred.JobClient:     Reduce input groups=3

11/01/18 19:22:34 INFO mapred.JobClient:     Combine output records=0

11/01/18 19:22:34 INFO mapred.JobClient:     Map input records=6

11/01/18 19:22:34 INFO mapred.JobClient:     Reduce shuffle bytes=132

11/01/18 19:22:34 INFO mapred.JobClient:     Reduce output records=6

11/01/18 19:22:34 INFO mapred.JobClient:     Spilled Records=12

11/01/18 19:22:34 INFO mapred.JobClient:     Map output bytes=114

11/01/18 19:22:34 INFO mapred.JobClient:     Map input bytes=162

11/01/18 19:22:34 INFO mapred.JobClient:     Combine input records=0

11/01/18 19:22:34 INFO mapred.JobClient:     Map output records=6

11/01/18 19:22:34 INFO mapred.JobClient:     Reduce input records=6
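
To inspect the result of the grep job, the matched patterns and their counts can be read straight out of HDFS, using the same output path as above (or copied to the local filesystem with bin/hadoop dfs -get output output):

$ bin/hadoop dfs -cat output/*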
