Fix: Hadoop's job.waitForCompletion(true) returns false or throws a NullPointerException (running jobs from Windows)

Step 1: Download hadoop-2.6.0-cdh5.15.1.tar from the Linux machine to your local Windows machine.

Step 2: Extract hadoop-2.6.0-cdh5.15.1.tar, running the extraction as administrator.

Step 3: Copy the hadoop.dll that matches your Hadoop version into C:\Windows\System32. Get it from https://github.com/steveloughran/winutils, in the bin directory of the matching version folder; in my case that is the 2.6 version. You will usually also want winutils.exe from the same bin directory placed in %HADOOP_HOME%\bin, since on Windows the NullPointerException typically comes from Hadoop failing to locate winutils.exe.
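As a quick sanity check (a minimal sketch; it assumes the Hadoop jars are already on your classpath), you can ask Hadoop's NativeCodeLoader whether hadoop.dll was actually picked up:

```java
import org.apache.hadoop.util.NativeCodeLoader;

public class NativeCheck {
    public static void main(String[] args) {
        // Prints true once hadoop.dll is found on java.library.path;
        // C:\Windows\System32 is on the default Windows search path.
        System.out.println("native code loaded: "
                + NativeCodeLoader.isNativeCodeLoaded());
    }
}
```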

Step 4: Set the environment variables.

Variable name: HADOOP_HOME

Variable value: C:\KuGou\hadoop-2.6.0-cdh5.15.1\hadoop-2.6.0-cdh5.15.1

Then add a new entry %HADOOP_HOME%\bin to Path.
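If changing the system environment is inconvenient (for example, your IDE was started before the variable existed and does not see it), Hadoop also honors the hadoop.home.dir JVM system property as an equivalent of HADOOP_HOME. A minimal sketch, assuming the install path used in this article (substitute your own):

```java
public class Driver {
    public static void main(String[] args) throws Exception {
        // Must run before the first Hadoop class that inspects HADOOP_HOME
        // (org.apache.hadoop.util.Shell) is loaded.
        System.setProperty("hadoop.home.dir",
                "C:\\KuGou\\hadoop-2.6.0-cdh5.15.1\\hadoop-2.6.0-cdh5.15.1");
        // ... regular job setup continues here ...
    }
}
```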

Step 5: Open C:\KuGou\hadoop-2.6.0-cdh5.15.1\hadoop-2.6.0-cdh5.15.1\etc\hadoop (the etc\hadoop directory under wherever you extracted the archive).

Find hadoop-env.cmd, open it in Notepad, locate the set JAVA_HOME line, and change it to your own JDK directory:

set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_25

Note the short path name. PROGRA~1 is the DOS 8.3 short name for C:\Program Files; it is needed here because the full name contains a space, which hadoop-env.cmd cannot handle. Names longer than 8 characters are truncated to their first 6 valid characters plus ~1, and clashing names get ~2, ~3, and so on. You can list a directory's short names with dir /x in a Command Prompt.

Appendix: a reader asked for an explanation of the following code (the stock Hadoop WordCount example, which ends in exactly the job.waitForCompletion(true) call this article is about):

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

  // Mapper: splits each input line into tokens and emits (word, 1).
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer (also used as the combiner): sums the counts for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
    if (otherArgs.length != 2) {
      System.err.println("Usage: wordcount <in> <out>");
      System.exit(2);
    }
    // Job.getInstance replaces the deprecated new Job(conf, name) constructor.
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
    FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
    // Returns false (or throws) when the job fails; this is the symptom
    // the Windows setup steps above fix.
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```
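Briefly: TokenizerMapper receives each line of input as a Text value, splits it on whitespace with StringTokenizer, and emits a (word, 1) pair per token. IntSumReducer, which is also registered as the combiner, sums the counts arriving for each word and writes the total. In main(), GenericOptionsParser strips Hadoop's own options from the command line, the two remaining arguments become the input and output paths, and job.waitForCompletion(true) submits the job, prints progress, and returns true only if the job succeeds; that is the call that fails when the Windows setup above is missing.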