Cross-platform MapReduce job submitted from Eclipse on Windows fails to connect

Error message: call from 10.32.6.150:8020 failed on connection
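Before rewriting anything, it can help to rule out plain TCP reachability to the NameNode's RPC port. A minimal JDK-only sketch (the host and port below are the ones used in this post; adjust them for your own cluster):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean portOpen(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // NameNode address and RPC port from this post; replace with yours.
        System.out.println(portOpen("10.32.6.150", 9000, 3000));
    }
}
```

If this prints `false`, the problem really is network-level (firewall, wrong address, service not listening) and no amount of job-code changes will help.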

If you run into this problem and the network itself is fine, change your code as follows:

 

System.setProperty("HADOOP_USER_NAME", "root");
Configuration conf = new Configuration();
conf.set("fs.defaultFS", "hdfs://10.32.6.150:9000");
conf.set("mapreduce.jobtracker.address", "hdfs://10.32.6.150:9001");
conf.set("mapreduce.framework.name", "yarn");
conf.set("mapreduce.app-submission.cross-platform", "true"); // needed for cross-platform submission
Job wordCountJob = Job.getInstance(conf, "mr");
// Important: point the job at its jar file
wordCountJob.setJar("E:\\Desktop\\WordCount1.jar");
wordCountJob.setJarByClass(WordCount.class);

// Mapper class for this job
wordCountJob.setMapperClass(WordCountMapper.class);
// Reducer class for this job
wordCountJob.setReducerClass(WordCountReducer.class);

// Key/value types of the map output
wordCountJob.setMapOutputKeyClass(Text.class);
wordCountJob.setMapOutputValueClass(IntWritable.class);

// Key/value types of the final output
wordCountJob.setOutputKeyClass(Text.class);
wordCountJob.setOutputValueClass(IntWritable.class);

// Path of the text data to process
FileInputFormat.setInputPaths(wordCountJob, "hdfs://10.32.6.150:9000/1.txt");
FileOutputFormat.setOutputPath(wordCountJob, new Path("hdfs://10.32.6.150/outputsq111/"));

// Submit the job to the Hadoop cluster
wordCountJob.waitForCompletion(true);
System.setProperty("HADOOP_USER_NAME", "root");
Configuration conf = new Configuration();
conf.set("fs.defaultFS", "hdfs://10.32.6.150:9000");
conf.set("mapreduce.jobtracker.address", "hdfs://10.32.6.150:9001");
conf.set("mapreduce.framework.name", "yarn");
conf.set("mapreduce.app-submission.cross-platform", "true");
// Create the job object
Job job = Job.getInstance(conf, "wc");
// Path to the job jar
job.setJar("E:\\Desktop\\WordCount.jar");
job.setJarByClass(WordCountOther.class);

// Mapper and reducer classes
job.setMapperClass(WordCountMapper.class);
job.setReducerClass(WordCountReducer.class);

// Key/value types of the map output
job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(IntWritable.class);

// Key/value types of the final output
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);

// Input and output directories for the job
FileInputFormat.setInputPaths(job, new Path("hdfs://10.32.6.150:9000/1.txt"));
FileOutputFormat.setOutputPath(job, new Path("hdfs://10.32.6.150:9000/output321/"));

boolean result = job.waitForCompletion(true);
System.exit(result ? 0 : 1);

The first block is the code that failed; the second is the code that runs. I could not spot any difference between them, so I kept assuming it was a network issue (we are on an internal network, so I suspected a firewall or something similar). Then I ran a classmate's version and it succeeded.

Resolution: I never found the cause. It is not a network problem, and probably not an Eclipse problem either; if you hit this, just rewrite the job code from scratch :)
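One difference does stand out in hindsight: the failing block's output path, hdfs://10.32.6.150/outputsq111/, carries no port, while every URI in the working block pins :9000. When an hdfs:// URI omits the port, the HDFS client falls back to the default NameNode RPC port, 8020, which is exactly the port named in the error message, so this may well be the real cause rather than the network. A quick JDK-only sketch of how the two URIs parse:

```java
import java.net.URI;

public class PortDefault {
    public static void main(String[] args) {
        // The failing job's output URI: no explicit port.
        URI noPort = URI.create("hdfs://10.32.6.150/outputsq111/");
        // The working job's URIs all pin the port to 9000.
        URI withPort = URI.create("hdfs://10.32.6.150:9000/output321/");

        // getPort() returns -1 when the URI omits the port; the HDFS
        // client then substitutes its default NameNode RPC port (8020).
        System.out.println(noPort.getPort());   // -1
        System.out.println(withPort.getPort()); // 9000
    }
}
```

If this explanation is right, adding :9000 to the output path in the first block (or leaving the paths scheme-less so they resolve against fs.defaultFS) should fix it without rewriting the whole job.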
