Remote debugging of Hadoop from Eclipse

Connecting remotely to a distributed Hadoop environment
1. Make sure the version of the distributed environment matches the version of the Eclipse plugin (0.20.205.0); otherwise the connection fails with the following prompt:

[img]http://dl2.iteye.com/upload/attachment/0109/3746/545fb9e5-e871-36b1-9e4f-26ea77aa1dd0.png[/img]
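If in doubt about which version the client side is actually using, a minimal sketch like the following (not part of the original setup; it only assumes hadoop-core is on the classpath) prints the hadoop-core version so it can be compared with the cluster's 0.20.205.0:

import org.apache.hadoop.util.VersionInfo;

public class VersionCheck {
    public static void main(String[] args) {
        // Print the version of the hadoop-core jar on the classpath;
        // it should match the cluster and the Eclipse plugin, e.g. 0.20.205.0.
        System.out.println("Client hadoop-core version: " + VersionInfo.getVersion());
    }
}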

2. Repackage the plugin; the following jars need to be bundled into it:
lib/jackson-core-asl-1.8.8.jar, lib/jackson-mapper-asl-1.8.8.jar, lib/commons-configuration-1.6.jar, lib/commons-lang-2.4.jar, lib/commons-httpclient-3.0.1.jar, lib/commons-cli-1.2.jar, as shown below:

[img]http://dl2.iteye.com/upload/attachment/0109/3748/42843005-eedc-3776-8b30-e0c6af20f30f.png[/img]
Modify MANIFEST.MF:
Bundle-ClassPath: classes/,lib/hadoop-core.jar[color=red],lib/jackson-core-asl-1.8.8.jar,lib/jackson-mapper-asl-1.8.8.jar,lib/commons-configuration-1.6.jar,lib/commons-lang-2.4.jar,lib/commons-httpclient-3.0.1.jar,lib/commons-cli-1.2.jar[/color]
Otherwise Eclipse fails to connect and reports:
"Map/Reduce location status updater". org/codehaus/jackson/map/JsonMappingException
On investigation, this is caused by those jars being missing from the Hadoop Eclipse plugin.
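To double-check that the repackaged plugin really carries the missing classes, a small throwaway check along these lines can be run against the plugin's lib/ jars (the class names are taken from the error above and the bundled jars; this is only a sketch, not part of the original fix):

public class PluginClasspathCheck {
    public static void main(String[] args) {
        // Classes that must be loadable once the extra jars are bundled into the plugin.
        String[] required = {
            "org.codehaus.jackson.map.JsonMappingException",
            "org.apache.commons.configuration.Configuration",
            "org.apache.commons.httpclient.HttpClient"
        };
        for (String name : required) {
            try {
                Class.forName(name);
                System.out.println("OK      : " + name);
            } catch (ClassNotFoundException e) {
                System.out.println("MISSING : " + name);
            }
        }
    }
}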

3. Prepare a test class:

package com.hadoop.learn.test;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;
import org.apache.log4j.Logger;

public class WordCountTest {
    private static final Logger log = Logger.getLogger(WordCountTest.class);

    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
            log.info("Map key : " + key);
            log.info("Map value : " + value);
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                String wordStr = itr.nextToken();
                word.set(wordStr);
                log.info("Map word : " + wordStr);
                context.write(word, one);
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
            log.info("Reduce key : " + key);
            log.info("Reduce value : " + values);
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            log.info("Reduce sum : " + sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration config = new Configuration();
        String[] otherArgs = new GenericOptionsParser(config, args).getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: WordCountTest <in> <out>");
            System.exit(2);
        }

        Job job = new Job(config, "word count test");
        job.setJarByClass(WordCountTest.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
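When running from Eclipse, the two program arguments (input path, output path) go into Run Configurations > Arguments. The exact values depend on your cluster; purely as an illustration, something like hdfs://hadoop-namenode:8020/user/hadoop/test.txt for the input and hdfs://hadoop-namenode:8020/user/hadoop/output for a not-yet-existing output directory.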



4. Configure the hosts file:
192.168.197.131 hadoop-namenode
Otherwise, passing the address as a raw IP in the program arguments fails with:

[img]http://dl2.iteye.com/upload/attachment/0109/3750/0b55c547-4718-3fd2-83d9-5afb388a8118.jpg[/img]

java.lang.IllegalArgumentException: Wrong FS: hdfs://192.186.54.1:8020/user/hadoop/test.txt, expected: hdfs://hadoop1
With the hosts entry in place, it works correctly:

[img]http://dl2.iteye.com/upload/attachment/0109/3752/c896d10c-ae5e-39fd-a8c7-42f3c509ecbd.jpg[/img]
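If the "Wrong FS" error persists, a hedged alternative (not from the original article, just a common workaround) is to point the client Configuration at the same filesystem URI the NameNode uses, with the hostname rather than a raw IP. The URI below is only illustrative and must match your cluster's fs.default.name:

// In WordCountTest.main(), right after "Configuration config = new Configuration();":
// Illustrative value: must be exactly the cluster's fs.default.name (hostname, not a raw IP).
config.set("fs.default.name", "hdfs://hadoop-namenode:8020");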

5. Recompile hadoop-core-0.20.205.0.jar
At run time, the following error may appear:
12/04/24 15:32:44 ERROR security.UserGroupInformation: PriviledgedActionException as:Administrator cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:682)
This is a file-permission problem specific to Windows; the same code runs fine on Linux, where the problem does not occur.
The fix is to edit F:\编程开发\hadoop\older\hadoop-0.20.203.0rc1\hadoop-0.20.203.0\src\core\org\apache\hadoop\fs\RawLocalFileSystem.java and comment out checkReturnValue (somewhat crude, but on Windows the check can be skipped):

[img]http://dl2.iteye.com/upload/attachment/0109/3754/f0d50383-fbaa-3294-90c9-de043b0935d9.png[/img]
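For orientation, the check being disabled has roughly the following shape (an approximation, not a verbatim copy of the 0.20.x sources; the stack trace above shows it being reached via FileUtil.checkReturnValue). The crude fix amounts to no longer throwing when the chmod fails:

// Approximate shape of the permission check (not verbatim from the source tree).
private static void checkReturnValue(boolean rv, File p, FsPermission permission)
        throws IOException {
    // Crude Windows-only workaround: the body is commented out so a failed chmod is ignored.
    // if (!rv) {
    //     throw new IOException("Failed to set permissions of path: " + p
    //             + " to " + String.format("%04o", permission.toShort()));
    // }
}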
Then recompile. The build itself may fail in two places:
(1) ant starts downloading dependencies and compiling. In my case the build failed; on inspection, the package-info.java generated by $hadoop_home/src/saveVersion.sh was malformed and broke the compilation. Modify saveVersion.sh as follows:

[img]http://dl2.iteye.com/upload/attachment/0109/3756/d9c5458d-bc91-3879-913b-d7cb233209ce.png[/img]
(2) .../hadoop/mapred/gridmix/Gridmix.java:396: error: type argument ? extends T is not within the bounds of type-variable E
This requires editing /src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java:
- private <T> String getEnumValues(Enum<? extends T>[] e) {
+ private String getEnumValues(Enum<?>[] e) {
      StringBuilder sb = new StringBuilder();
      String sep = "";
-     for (Enum<? extends T> v : e) {
+     for (Enum<?> v : e) {
          sb.append(sep); sb.append(v.name()); sep = "|";

With all of the above in place, the job runs successfully:

[img]http://dl2.iteye.com/upload/attachment/0109/3758/aafc57ce-40f1-3888-87a8-f3ea7a06a04f.png[/img]