First, download and install the following code and software:
1. Download the Hadoop 1.0.1 source code from http://www.apache.org/dyn/closer.cgi/hadoop/core/ and extract it to F:\my\hadoop.
2. Download and install JDK 1.6; a download link is easy to find with a web search.
3. Download the Apache Ant build tool from http://ant.apache.org/ivy/download.cgi and extract it to D:\XXX\hadoop\apache-ant-1.8.3.
4. Download Cygwin from http://cygwin.com/install.html and install it.
5. Use Eclipse 3.3 or later; I chose 3.6.
With the above done, set the following environment variables in Windows:
ANT_HOME=D:\XXX\hadoop\apache-ant-1.8.3
JAVA_HOME=C:\Program Files\Java\jdk1.6.0_23
CLASSPATH=append C:\Program Files\Java\jdk1.6.0_23\jre\lib\rt.jar
PATH=C:\cygwin\bin;D:\oracle;%ANT_HOME%\bin
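These are normally set through the Windows System Properties dialog; as a sketch, the equivalent in a temporary cmd session looks like this (same paths as above; note that CLASSPATH and PATH should append to any existing values rather than replace them):

```bat
set ANT_HOME=D:\XXX\hadoop\apache-ant-1.8.3
set JAVA_HOME=C:\Program Files\Java\jdk1.6.0_23
set CLASSPATH=%CLASSPATH%;C:\Program Files\Java\jdk1.6.0_23\jre\lib\rt.jar
set PATH=C:\cygwin\bin;D:\oracle;%ANT_HOME%\bin;%PATH%
rem verify the tools resolve
java -version
ant -version
```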
Log out and back in so the environment variables take effect, then open Eclipse and compile Hadoop. The configuration and build steps are as follows:
1. New --> "Java Project"
Project name: commontest
Location: F:\my\hadoop (the Hadoop extraction path)
2. In the Project Explorer tree, right-click commontest and choose Properties, then select Builders. The default is Java Builder; change it to an Ant builder as follows:
a) Click "New", choose "Ant Builder"; the "Edit launch configuration properties" dialog opens.
b) Name: common_builder
c) On the "Main" tab, set Buildfile to F:\my\hadoop\build.xml.
d) On the "Targets" tab, set "Manual Build" to jar.
e) On the "JRE" tab, set "Execution environment" to JavaSE-1.6 (jre).
Once created, select common_builder as the builder.
Select "Java Build Path", open the "Libraries" tab, click "Add External JARs", and add the JARs from the following paths:
F:\my\hadoop\lib
D:\xxxx\eclipse\plugins
F:\my\hadoop\bin\lib
F:\my\hadoop
F:\my\hadoop\contrib\datajoin
F:\my\hadoop\contrib\failmon
F:\my\hadoop\contrib\gridmix
F:\my\hadoop\contrib\index
F:\my\hadoop\contrib\streaming
F:\my\hadoop\contrib\vaidya
F:\my\hadoop\contrib\hdfsproxy
F:\my\hadoop\ivy
3. Edit F:\my\hadoop\build.xml: comment out the ivy-download target around line 2384, since ivy.jar is already included locally:
<!--target name="ivy-download" description="To download ivy" unless="offline">
...
-->
Then, around line 2392, remove the dependency on ivy-download from the target that declares it, leaving the rest of that target unchanged.
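For reference, the two build.xml edits would look roughly like the sketch below. The target body and the name of the dependent target (ivy-init-antlib) follow the usual Ivy bootstrap in Hadoop 1.0.x build files; verify them against your own copy, since exact line numbers and contents can differ:

```xml
<!-- edit 1: comment out the download target; ivy.jar is already present locally
<target name="ivy-download" description="To download ivy" unless="offline">
  <get src="${ivy_repo_url}" dest="${ivy.jar}" usetimestamp="true"/>
</target>
-->

<!-- edit 2: drop ivy-download from the depends list; it previously read
     depends="ivy-download,ivy-probe-antlib" -->
<target name="ivy-init-antlib" depends="ivy-probe-antlib" unless="ivy.found">
  <!-- target body unchanged -->
</target>
```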
4. Choose Project => Build Project to compile; this produces hadoop-core-1.0.1.jar.
5. Build the plugin JAR from src\contrib\eclipse-plugin\src\java; from Hadoop 0.20 onward, hadoop-eclipse-plugin-1.0.1.jar has to be built yourself.
a) As in step 2 c), change the Buildfile to F:\my\hadoop\src\contrib\eclipse-plugin\build.xml.
b) Add an eclipse.home property to that build.xml:
<property name="eclipse.home" value="D:/xxx/eclipse" />
<import file="../build-contrib.xml" />
.........................
Also edit the jar-copy tasks directly; originally they read:
<copy file="${hadoop.root}/build/hadoop-core-1.0.1.jar" tofile="${build.dir}/lib/hadoop-core.jar" verbose="true" />
<copy file="${hadoop.root}/build/ivy/lib/Hadoop/common/commons-cli-1.2.jar" todir="${build.dir}/lib" verbose="true" />
c) In the parent directory (where build-contrib.xml lives), comment out the ivy-download target at line 388 of build-contrib.xml, so ivy.jar is not downloaded from the network:
<!--target name="ivy-download" description="To download ivy" unless="offline">
...
-->
d) Choose Project => Build Project to generate hadoop-eclipse-plugin-1.0.1.jar, copy that jar into the Eclipse plugins directory, and restart Eclipse. The Hadoop Eclipse build environment is now complete.
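If you prefer the command line to the Eclipse Ant builder, the same two builds can be driven with Ant directly. This is a sketch assuming ant.bat is on PATH; eclipse.home matches the property added in step 5 b), and jar is the target configured in the builder:

```bat
cd /d F:\my\hadoop
rem build hadoop-core-1.0.1.jar (step 4)
ant jar
rem build the eclipse plugin jar (step 5)
ant -f src\contrib\eclipse-plugin\build.xml -Declipse.home=D:/xxx/eclipse jar
```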
From the ITPUB blog: http://blog.itpub.net/354732/viewspace-719766/. Please credit the source when reposting; otherwise legal liability may be pursued.