Building the Hadoop Eclipse plugin on Ubuntu

Preparation before building:
I'm using Ubuntu 12, which does not ship with Ant installed by default.
Step 1: Install Ant
1. Download apache-ant-1.8.4-bin.tar.gz from http://www.apache.org/dist/ant
2. Extract it to /usr/local/ant
3. Set the environment variables in /etc/profile.d/ant.sh:
ANT_HOME="/usr/local/ant"
export PATH=${ANT_HOME}/bin:$PATH
4. Make ant.sh executable:
$ sudo chmod a+x ant.sh
5. Reload the changes:
$ source ant.sh
6. Check the Ant version:
hduser@localhost:/etc/profile.d$ ant -version
Apache Ant(TM) version 1.8.4 compiled on May 22 2012
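
For reference, here is Step 1 as a single shell sketch. It assumes apache-ant-1.8.4-bin.tar.gz has already been downloaded to the current directory and that the environment script lives in /etc/profile.d, as the prompt in step 6 suggests:

tar -xzf apache-ant-1.8.4-bin.tar.gz          # unpacks to apache-ant-1.8.4/
sudo mv apache-ant-1.8.4 /usr/local/ant
sudo tee /etc/profile.d/ant.sh > /dev/null <<'EOF'
export ANT_HOME="/usr/local/ant"
export PATH=${ANT_HOME}/bin:$PATH
EOF
sudo chmod a+x /etc/profile.d/ant.sh
source /etc/profile.d/ant.sh
ant -version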

Step 2: Install autoconf
sudo apt-get install automake autoconf

Source: <http://stackoverflow.com/questions/11126612/ubuntu-issue-setting-up-autoconf-and-automake>
After installing, run ant from the Hadoop root directory:
hduser@localhost:/usr/local/hadoop$ ant
It fails with:
     [exec] configure.ac:48: error: possibly undefined macro: AC_PROG_LIBTOOL
     [exec]       If this token and others are legitimate, please use m4_pattern_allow.
     [exec]       See the Autoconf documentation.
Install libtool and try again:
sudo apt-get install libtool

Source: <http://www.cnblogs.com/lexus/archive/2013/02/01/2888521.html>
This time the build ends with BUILD SUCCESSFUL.
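
A quick sanity check (just a sketch) to confirm the autotools are now on the PATH:

autoconf --version | head -1
automake --version | head -1
libtool --version | head -1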

 

Step 3: Build the plugin

1. Open hadoop-version/src/contrib/eclipse-plugin/build.xml and change
   <fileset dir="${eclipse.home}/plugins/"> to <fileset dir="/usr/local/eclipse/plugins/">.
2. Build the jar with ant.

Edit $hadoop_home/src/contrib/build-contrib.xml (a quick check of these edits is sketched after this list):

  Add    <property name="eclipse.home" location="/usr/local/eclipse"/>
  Add    <property name="version" location="1.1.2-SNAPSHOT"/>      // must match the version in ${hadoop_home}/build.xml
  Change <property name="hadoop.root" location="/usr/local/hadoop"/>
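
A quick check (just a sketch) that the three properties above made it into build-contrib.xml:

grep -n 'eclipse.home\|hadoop.root\|name="version"' /usr/local/hadoop/src/contrib/build-contrib.xml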

 

Compile: hduser@localhost:/usr/local/hadoop/src/contrib/eclipse-plugin$ ant jar

    [javac] /home/hduser/hadoop/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DFSFile.java:74: error: cannot find symbol
    [javac]       FileStatus fs = getDFS().getFileStatus(path);
    [javac]       ^
    [javac]   symbol:   class FileStatus
    [javac]   location: class DFSFile
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 100 errors

BUILD FAILED

/home/hduser/hadoop/src/contrib/eclipse-plugin/build.xml:61: Compile failed; see the compiler error output for details.

The missing FileStatus class lives in hadoop-core, which has not been built yet, so switch to the Hadoop root directory and build the libraries first: hduser@localhost:hadoop$ ant

BUILD SUCCESSFUL
Total time: 11 minutes 4 seconds
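
After the top-level build finishes, the core jar should now exist under build/; a quick check (a sketch) before retrying the plugin build:

ls /usr/local/hadoop/build/hadoop-core-*.jar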

 

Build again: hduser@localhost:/usr/local/hadoop/src/contrib/eclipse-plugin$ ant jar


BUILD FAILED
/usr/local/hadoop/src/contrib/eclipse-plugin/build.xml:70: Warning: Could not find file /usr/local/hadoop/build/hadoop-core- /usr/local/hadoop/src/contrib/eclipse-plugin/1.1.2-SNAPSHOT.jar to copy.

Source: <http://blog.csdn.net/martin_liang/article/details/8452931>

Comment out the <copy file="${hadoop.root}/build/hadoop-core-${version}.jar" tofile="${build.dir}/lib" verbose="true"/> element in $hadoop_home/src/contrib/eclipse-plugin/build.xml.

Then copy $hadoop_home/build/hadoop-core-1.1.2-SNAPSHOT.jar into $hadoop_home/build/contrib/eclipse-plugin/lib:
$ mv /home/hduser/hadoop/build/hadoop-core-1.1.2-SNAPSHOT.jar /home/hduser/hadoop/build/contrib/eclipse-plugin/lib
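
The same workaround as one shell sketch, using the /usr/local/hadoop paths shown at the prompts above (cp is used instead of mv so the jar also stays in build/):

mkdir -p /usr/local/hadoop/build/contrib/eclipse-plugin/lib
cp /usr/local/hadoop/build/hadoop-core-1.1.2-SNAPSHOT.jar \
   /usr/local/hadoop/build/contrib/eclipse-plugin/lib/
cd /usr/local/hadoop/src/contrib/eclipse-plugin
ant jar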


jar:
      [jar] Building jar: /usr/local/hadoop/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-/usr/local/hadoop/src/contrib/eclipse-plugin/1.1.2-SNAPSHOT.jar

BUILD SUCCESSFUL
The build produces 1.1.2-SNAPSHOT.jar (with the long path above) rather than the expected hadoop-eclipse-plugin-1.1.2.jar. This appears to be because the version property was declared with location= in build-contrib.xml: Ant's location attribute expands its value into an absolute path, so ${version} mangles the jar name. Declaring it with value="1.1.2-SNAPSHOT" instead should avoid this.

Rename 1.1.2-SNAPSHOT.jar to hadoop-eclipse-plugin-1.1.2-SNAPSHOT.jar and copy it into ${eclipse.home}/plugins.
Start Eclipse and open Window --> Preferences; a Hadoop Map/Reduce entry now appears. In it, set Hadoop installation directory to ${hadoop.home}.
Source: <http://www.cnblogs.com/flyoung2008/archive/2011/12/09/2281400.html>
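
The rename-and-deploy step as a sketch. The build output above suggests the jar actually lands under src/contrib/eclipse-plugin, and Eclipse is assumed to live in /usr/local/eclipse as configured earlier:

cd /usr/local/hadoop/src/contrib/eclipse-plugin
mv 1.1.2-SNAPSHOT.jar hadoop-eclipse-plugin-1.1.2-SNAPSHOT.jar
sudo cp hadoop-eclipse-plugin-1.1.2-SNAPSHOT.jar /usr/local/eclipse/plugins/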


Test: run WordCount.
1. File -> New -> Project -> Map/Reduce Project
2. Create WordCount.java
3. Run it: open Run Configurations and set the program arguments (a sketch for preparing the HDFS input follows this list):
   hdfs://localhost:9000/user/hduser/input  hdfs://localhost:9000/user/hduser/output
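
The input directory must already exist in HDFS before the job runs. A minimal sketch using the hadoop CLI; the choice of conf/*.xml as sample input is only an illustration:

cd /usr/local/hadoop
bin/hadoop fs -mkdir /user/hduser/input            # create the input directory in HDFS
bin/hadoop fs -put conf/*.xml /user/hduser/input   # upload a few text files to count
bin/hadoop fs -ls /user/hduser/input               # verify the upload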
Exception: Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="hduser":hduser:supergroup:rwxr-xr-x
This shows Eclipse is running as root, which has no write permission on HDFS.
Add the following to hdfs-site.xml:
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
Source: <http://stackoverflow.com/questions/11593374/permission-denied-at-hdfs>

Then restart Hadoop:
hduser@localhost:/usr/local/hadoop$ bin/stop-all.sh
hduser@localhost:/usr/local/hadoop$ bin/start-all.sh
The job now runs successfully.
