References:
http://blog.csdn.net/buxinchun/article/details/71335937
http://blog.csdn.net/antgan/article/details/52067441
building.txt in the source tree
Part 1: Building from Source
Note: building hdp2.6.5 requires JDK 7.
1. Install the JDK and set JAVA_HOME:
JAVA_HOME=C:\Program Files\Java\jdk1.8.0_151
The space in "Program Files" can break the build. One workaround is to use the DOS 8.3 short path instead:
C:\PROGRA~1\Java\jdk1.8.0_91
PROGRA~1 is the 8.3 (DOS) short form of the C:\Program Files directory:
Progra~1 = 'Program Files'
Progra~2 = 'Program Files (x86)'
File and folder names longer than 8 characters are shortened to their first 6 significant characters plus ~1; name collisions get ~2, ~3, and so on.
CLASSPATH=.;%JAVA_HOME%\lib;%JAVA_HOME%\lib\dt.jar;%JAVA_HOME%\lib\tools.jar;
PATH += %JAVA_HOME%\bin;%JAVA_HOME%\jre\bin;
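The 8.3 shortening rule can be sketched in Python. This is a simplified model for illustration only; real NTFS short-name generation also strips other illegal characters, handles extensions, and depends on what already exists in the directory:

```python
def dos_short_name(long_name, taken=()):
    """Approximate the DOS 8.3 short form of a file or folder name.

    Simplified sketch of the rule above: drop spaces, uppercase, and
    if the result exceeds 8 characters keep the first 6 and append
    ~1 (or ~2, ~3, ... on collisions with names in `taken`).
    """
    cleaned = long_name.replace(" ", "").upper()
    if len(cleaned) <= 8:
        return cleaned
    stem = cleaned[:6]
    n = 1
    while f"{stem}~{n}" in taken:
        n += 1
    return f"{stem}~{n}"
```

For example, `dos_short_name("Program Files")` gives `PROGRA~1`, and with that name already taken, `dos_short_name("Program Files (x86)", taken={"PROGRA~1"})` gives `PROGRA~2`.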
2. maven
MAVEN_HOME=C:\home\dev\apache-maven-3.3.9
PATH +=%MAVEN_HOME%\bin;
3. protoc2.5
PROTOC_HOME=C:\home\dev\protoc-2.5.0-win32
PATH +=%PROTOC_HOME%;
4. cmake3.6.2
CMAKE_HOME=C:\home\dev\cmake-3.6.2-win64-x64
PATH +=%CMAKE_HOME%\bin;
5. PortableGit
GIT_HOME=C:\home\dev\PortableGit
PATH +=%GIT_HOME%\bin;
6. findbugs-3.0.1
FINDBUGS_HOME=C:\home\dev\findbugs-3.0.1
PATH +=%FINDBUGS_HOME%\bin;
7. zlib, http://zlib.net/zlib128-dll.zip
ZLIB_HOME=C:\home\dev\zlib128-dll
PATH +=%ZLIB_HOME%;
8. Windows SDK 7.1
9. (possibly not essential) Windows SDK 8.1:
uninstall .NET 4.5, reboot;
reinstall .NET 4.0 (via the daemon lite installer), reboot.
10. Set the Platform environment variable (the value is case-sensitive):
set Platform=x64 (when building on a 64-bit system)
set Platform=Win32 (when building on a 32-bit system)
11. Run builds from a Windows SDK 7.1 Command Prompt:
mvn package -Pdist,native-win -DskipTests -Dtar
mvn package -Pdist,native-win -Pdocs -DskipTests -Dtar
In particular, when building hdp2.6.5:
mvn clean package -Pdist,native-win -DskipTests -Dtar -Dmaven.javadoc.skip=true
Part 2: Pseudo-Distributed Configuration
1. Set the Hadoop environment variables:
HADOOP_HOME=C:\home\dev\hadoop-2.7.2
PATH += %HADOOP_HOME%\bin;
PATH += %HADOOP_HOME%\sbin;
2. Create the working directories referenced by the configs below: mkdir tmp, mkdir hdfs/data, mkdir hdfs/name
3. In the masters and slaves files, set both to:
masters: localhost
slaves: localhost
4. Edit the configuration files:
core-site.xml
------------------------
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/C:/home/dev/hdp-dir/tmp</value>
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
hdfs-site.xml
------------------
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/C:/home/dev/hdp-dir/hdfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/C:/home/dev/hdp-dir/hdfs/data</value>
  </property>
</configuration>
mapred-site.xml
------------------
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
yarn-site.xml
---------------------
<configuration>
  <!-- Site specific YARN configuration properties -->
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>localhost</value>
  </property>
</configuration>
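As a quick sanity check before starting the daemons, the *-site.xml files above can be parsed with Python's standard library to confirm the key properties landed where expected. A minimal sketch, shown here against an inline copy of the core-site.xml snippet:

```python
import xml.etree.ElementTree as ET

def site_properties(xml_text):
    """Parse a Hadoop *-site.xml document into a {name: value} dict."""
    root = ET.fromstring(xml_text)
    return {
        prop.findtext("name"): prop.findtext("value")
        for prop in root.iter("property")
        if prop.findtext("name") is not None
    }

# Example: verify core-site.xml points HDFS at localhost:9000
core_site = """
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
"""
assert site_properties(core_site)["fs.default.name"] == "hdfs://localhost:9000"
```

In practice, pass the contents of each file under %HADOOP_HOME%\etc\hadoop to `site_properties` and check the values against the settings above.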
5. Verify:
Open http://127.0.0.1:50070 (NameNode web UI)
and http://127.0.0.1:8088 (ResourceManager web UI)
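The verification step can also be scripted. A small reachability probe for the two web UIs; it only checks for an HTTP 200 response, and the ports are the defaults shown above:

```python
import urllib.request

def ui_reachable(url, timeout=2.0):
    """Return True if the web UI at `url` answers an HTTP GET with 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # connection refused, timeout, HTTP error, ...
        return False

# Probe the NameNode and ResourceManager UIs from the verification step
for url in ("http://127.0.0.1:50070", "http://127.0.0.1:8088"):
    print(url, "up" if ui_reachable(url) else "down")
```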
Part 3: Building the Eclipse Plugin
1. ANT_HOME=C:\home\dev\apache-ant-1.9.9
PATH += %ANT_HOME%\bin;
2. Edit build.xml in the directory where the build command is run (i.e. ..\hadoop2x-eclipse-plugin-master\src\contrib\eclipse-plugin), replacing:
<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">
with:
<target name="compile" unless="skip.contrib">
3. Run:
ant jar -Dversion=2.7.2 -Dhadoop.version=2.7.2 -Declipse.home=C:\home\dev\eclipse\eclipse-java-mars-2-win32-x86_64 -Dhadoop.home=C:\home\dev\hadoop-2.7.2
4. If the build reports errors about jars that cannot be copied:
edit ..\hadoop2x-eclipse-plugin-master\ivy\libraries.properties,
changing the version numbers of the failing jars to match the jars under the Hadoop install directory's share\hadoop\common\lib.
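Step 4 can be automated: read the jar versions actually shipped under share\hadoop\common\lib and rewrite the matching entries in libraries.properties. A hedged sketch; the `<artifact>.version` key convention is an assumption based on typical hadoop2x-eclipse-plugin property files:

```python
import os
import re

# Matches e.g. "commons-lang-2.6.jar" -> ("commons-lang", "2.6")
JAR_RE = re.compile(r"^(?P<artifact>[A-Za-z0-9_.-]+?)-(?P<version>\d[\w.-]*)\.jar$")

def jar_versions(lib_dir):
    """Map artifact name -> version for every versioned jar in lib_dir."""
    versions = {}
    for fname in sorted(os.listdir(lib_dir)):
        m = JAR_RE.match(fname)
        if m:
            versions[m.group("artifact")] = m.group("version")
    return versions

def sync_versions(properties_text, versions):
    """Rewrite '<artifact>.version=...' lines to match the jars found."""
    out = []
    for line in properties_text.splitlines():
        m = re.match(r"^([A-Za-z0-9_.-]+)\.version=", line)
        if m and m.group(1) in versions:
            line = f"{m.group(1)}.version={versions[m.group(1)]}"
        out.append(line)
    return "\n".join(out)
```

Running `sync_versions` over the libraries.properties text with `jar_versions(r"C:\home\dev\hadoop-2.7.2\share\hadoop\common\lib")` rewrites only entries whose artifacts are present, leaving comments and unrelated keys untouched.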
5. The generated plugin is placed under ..\hadoop2x-eclipse-plugin-master\build\contrib\eclipse-plugin;
copy it into eclipse\plugins.
Part 4: Eclipse Development Configuration
Verified: as widely reported, Eclipse often fails to connect to hdp2.7.2; switching to 2.6.5 resolves it.
1. Open Window --> Preferences; a Hadoop Map/Reduce entry appears.
2. Window --> Show View --> MapReduce Tools
3. Click the Map/Reduce Locations tab at the bottom, then the elephant icon on the right, to open the Hadoop Location configuration dialog:
location name:
Map/Reduce Master: the host/port from mapred-site.xml
DFS Master: the host/port from core-site.xml
4.