Hadoop startup java.lang.NullPointerException: running the WordCount example from Eclipse against Hadoop 2.2.0 throws java.lang.NullPointerException

On Windows 7, Eclipse is connected to a Hadoop 2.2.0 cluster, and running the WordCount example throws a NullPointerException. HDFS file-system operations work fine, but WordCount fails during job submission, while Hadoop is launching a local process. The problem likely involves the configuration, the Eclipse plugin, or the Hadoop environment setup. The code below shows the Job configuration and the exception stack trace.

Has anyone run into this problem, and how did you solve it? I'm new to Hadoop. I installed Eclipse on Windows 7, set up the plugin, and connected to a pseudo-distributed cluster deployed on another machine. I can now upload and delete files and directories on HDFS, but running the WordCount example throws a NullPointerException. Thanks.

// Code
Configuration conf = new Configuration();
//String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
String[] otherArgs = new String[] { "/input/README.txt", "/output" };
if (otherArgs.length != 2) {
    System.err.println("Usage: wordcount <in> <out>");
    System.exit(2);
}
Job job = new Job(conf, "WCDemo");
//JobClient job = new JobClient();
job.setJarByClass(WordCount.class);
job.setMapperClass(TokenizerMapper.class);
job.setCombinerClass(IntSumReducer.class);
job.setReducerClass(IntSumReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
System.exit(job.waitForCompletion(true) ? 0 : 1);

// Console output

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

Exception in thread "main" java.lang.NullPointerException
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:441)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:404)
    at org.apache.hadoop.util.Shell.run(Shell.java:379)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:435)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:277)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:344)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
    at com.hadoop.example.WordCount.main(WordCount.java:113)
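For context: the trace shows the client-side job submitter calling RawLocalFileSystem.setPermission, which on Windows shells out via ProcessBuilder. With Hadoop 2.x on Windows, this NullPointerException at ProcessBuilder.start typically means Hadoop's Shell class could not locate winutils.exe, because neither the HADOOP_HOME environment variable nor the hadoop.home.dir system property points at a Hadoop home directory containing bin\winutils.exe. A minimal sketch of the usual client-side fix, assuming winutils.exe has been placed under the hypothetical path C:\hadoop\bin:

```java
// Sketch: point Hadoop at a local home directory before any Hadoop class
// loads. The path C:\hadoop is an assumption -- substitute a directory that
// actually contains bin\winutils.exe matching your Hadoop version.
public class HadoopHomeFix {
    public static void main(String[] args) {
        // Hadoop's Shell class reads this property (falling back to the
        // HADOOP_HOME environment variable) to find bin\winutils.exe.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Setting the property must happen before the first Hadoop class (e.g. Configuration) is touched; setting HADOOP_HOME in the OS environment achieves the same thing without code changes.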
