Reference: http://www.cnblogs.com/duking1991/p/6104304.html
The tools used in this article can be downloaded from: http://download.csdn.net/detail/wzq6578702/9817101
1. Tool preparation. The most reliable list of prerequisites is the one in the Hadoop build documentation itself.
Go to the official Hadoop site, click "source", and download hadoop-2.7.3-src.tar.gz.
Extract it:
tar -zxvf hadoop-2.7.3-src.tar.gz
This produces the hadoop-2.7.3-src directory.
Enter hadoop-2.7.3-src and view BUILDING.txt:
cd hadoop-2.7.3-src
vim BUILDING.txt
It lists the libraries and tools required for the build, including:
JDK1.7+
maven 3.0 or later
findbugs 1.3.9
protocolBuffer 2.5.0
cmake 2.6
zlib-devel
openssl-devel
Besides the above, in practice you also need autoconf, automake, gcc, and so on to avoid errors.
Now prepare these tools.
First run su to obtain root privileges, so the following steps are not blocked by permission issues.
2. Download jdk-7u102-linux-x64.tar.gz, extract it, and move it to /opt:
tar -zxvf jdk-7u102-linux-x64.tar.gz
mv jdk1.7.0_102 /opt
Then open /etc/profile to configure the JDK environment variables:
vim /etc/profile
Press i to enter insert mode and append the following at the end of the file:
export JAVA_HOME=/opt/jdk1.7.0_102
export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin
export JRE_HOME=/opt/jdk1.7.0_102/jre
export PATH=$PATH:$JRE_HOME/bin
Press Esc, then Shift+:, type wq, and press Enter to save and quit.
Run source /etc/profile to apply the changes.
Run javac -version to check the result:
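A quick sanity check (the expected version string assumes the jdk1.7.0_102 install above):
java -version
javac -version
# expected output: javac 1.7.0_102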
3. Next, install the required libraries and tools:
yum -y install svn ncurses-devel gcc*
yum -y install lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel
4. Install protobuf-2.5.0.tar.gz (note: the version must be exactly 2.5.0).
protobuf-2.5.0.tar.gz download address:
—————————————— Divider ——————————————
Free download address: http://linux.linuxidc.com/
Both the username and the password are www.linuxidc.com
The download directory is /2015年资料/10月/10日/CentOS7下用JDK1.7编译Hadoop-2.7.1全过程详解/
See http://www.linuxidc.com/Linux/2013-07/87684.htm for download instructions.
—————————————— Divider ——————————————
tar -zxvf protobuf-2.5.0.tar.gz
Enter protobuf-2.5.0 and run the following in order:
cd protobuf-2.5.0
Assuming you want the build installed under /home/work/protobuf/, run these two commands:
./configure --prefix=/home/work/protobuf/
make && make install
After the build succeeds, add export PATH=/home/work/protobuf/bin:$PATH to your environment variables.
Finally, run protoc --version; if it prints libprotoc 2.5.0, the installation succeeded.
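If protoc instead fails to start because it cannot find libprotoc.so (common when installing to a nonstandard prefix like the one above), a minimal fix, assuming the /home/work/protobuf prefix from the previous step, is to expose the library directory as well:
export LD_LIBRARY_PATH=/home/work/protobuf/lib:$LD_LIBRARY_PATH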
5. Install Maven. (Note: if the downloads are slow or frequently fail, switch Maven to a different download mirror; the method is described in another post, and a minimal example is sketched after the version check below.)
Download apache-maven-3.3.3-bin.tar.gz.
Extract it and configure the environment variables.
Extract:
tar -zxvf apache-maven-3.3.3-bin.tar.gz
Move it to /opt:
mv apache-maven-3.3.3 /opt
Configure the environment variables:
vim /etc/profile
Append at the end of the file:
export MAVEN_HOME=/opt/apache-maven-3.3.3
export MAVEN_OPTS="-Xms256m -Xmx512m"
export PATH=$PATH:$MAVEN_HOME/bin
Press Esc, then Shift+:, type wq, and press Enter to save and quit.
Run source /etc/profile to apply the changes.
Check the installation with mvn -version; you should see the Maven version information.
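As a sketch of the mirror change (the Aliyun mirror URL here is one commonly used public option, not something from the original post), add a mirror entry inside the <mirrors> section of /opt/apache-maven-3.3.3/conf/settings.xml or ~/.m2/settings.xml:
<mirror>
  <id>aliyun</id>
  <mirrorOf>central</mirrorOf>
  <name>Aliyun public mirror</name>
  <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
</mirror>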
6. Install Ant.
Download apache-ant-1.9.4-bin.tar.gz.
Extract it and configure the environment variables.
Extract:
tar -zxvf apache-ant-1.9.4-bin.tar.gz
Move it to /opt:
mv apache-ant-1.9.4 /opt
Configure the environment variables:
As above, append at the end of /etc/profile:
export ANT_HOME=/opt/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin
Then save, quit, and apply the changes.
Check the installation:
ant -version
7. Install FindBugs.
Download findbugs-3.0.1.tar.gz (the standard version is sufficient).
Extract it and configure the environment variables.
Extract:
tar -zxvf findbugs-3.0.1.tar.gz
Move it to /opt:
mv findbugs-3.0.1 /opt
Configure the environment variables:
Append at the end of /etc/profile:
export FINDBUGS_HOME=/opt/findbugs-3.0.1
export PATH=$PATH:$FINDBUGS_HOME/bin
Save, quit, and apply the changes.
Check the installation:
findbugs -version
8. Preparation complete; now build Hadoop.
Enter the hadoop-2.7.3-src directory and edit the following two files:
vim hadoop-common-project/hadoop-auth/pom.xml
vim hadoop-common-project/hadoop-common/pom.xml
In each, change the scope of the hadoop-annotations dependency from provided to compile, so the dependency reads:
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-annotations</artifactId>
  <scope>compile</scope>
</dependency>
Then start the build with:
mvn clean package -Pdist,native -DskipTests -Dtar
or:
mvn package -Pdist,native -DskipTests -Dtar
Keep the network connection stable; the build takes a long time (mine was fairly slow at 2 hours 57 minutes)!
The built distribution is written to hadoop-dist/target/hadoop-2.7.3.tar.gz under the source tree.
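To confirm that the native libraries were actually built, one quick check (a sketch; the paths assume you extract the freshly built tarball to /tmp) is Hadoop's checknative command:
tar -zxvf hadoop-dist/target/hadoop-2.7.3.tar.gz -C /tmp
/tmp/hadoop-2.7.3/bin/hadoop checknative -a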
Notes:
1. Type all commands by hand rather than copy-pasting them, to avoid errors caused by stray whitespace from the web page.
2. Keep the network connection stable. If the build complains about a missing file, clean Maven first (mvn clean) and then rebuild (see the sketch after this list for resuming a failed build without starting over).
3. If the build keeps failing with the same error, a library or tool is probably missing; check that all the tools above were installed successfully and are the correct versions.
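As a sketch of resuming rather than rebuilding everything, Maven's -rf (resume-from) flag restarts the build from a given module; the module name here is only an example of the syntax:
mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-pipes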
Success: the Maven output ends with BUILD SUCCESS.
For more details, see the build documentation shipped with the source (BUILDING.txt), reproduced below:
Build instructions for Hadoop
----------------------------------------------------------------------------------
Requirements:
* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel ( if compiling native hadoop-pipes and to get the best HDFS encryption performance )
* Linux FUSE (Filesystem in Userspace) version 2.6 or above ( if compiling fuse_dfs )
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
----------------------------------------------------------------------------------
Installing required packages for clean install of Ubuntu 14.04 LTS Desktop:
* Oracle JDK 1.7 (preferred)
$ sudo apt-get purge openjdk*
$ sudo apt-get install software-properties-common
$ sudo add-apt-repository ppa:webupd8team/java
$ sudo apt-get update
$ sudo apt-get install oracle-java7-installer
* Maven
$ sudo apt-get -y install maven
* Native libraries
$ sudo apt-get -y install build-essential autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev
* ProtocolBuffer 2.5.0 (required)
$ sudo apt-get -y install libprotobuf-dev protobuf-compiler
Optional packages:
* Snappy compression
$ sudo apt-get install snappy libsnappy-dev
* Bzip2
$ sudo apt-get install bzip2 libbz2-dev
* Jansson (C Library for JSON)
$ sudo apt-get install libjansson-dev
* Linux FUSE
$ sudo apt-get install fuse libfuse-dev
----------------------------------------------------------------------------------
Maven main modules:
hadoop (Main Hadoop project)
- hadoop-project (Parent POM for all Hadoop Maven modules. )
(All plugins & dependencies versions are defined here.)
- hadoop-project-dist (Parent POM for modules that generate distributions.)
- hadoop-annotations (Generates the Hadoop doclet used to generate the Javadocs)
- hadoop-assemblies (Maven assemblies used by the different modules)
- hadoop-common-project (Hadoop Common)
- hadoop-hdfs-project (Hadoop HDFS)
- hadoop-mapreduce-project (Hadoop MapReduce)
- hadoop-tools (Hadoop tools like Streaming, Distcp, etc.)
- hadoop-dist (Hadoop distribution assembler)
----------------------------------------------------------------------------------
Where to run Maven from?
It can be run from any module. The only catch is that if not run from trunk
all modules that are not part of the build run must be installed in the local
Maven cache or available in a Maven repository.
----------------------------------------------------------------------------------
Maven build goals:
* Clean : mvn clean
* Compile : mvn compile [-Pnative]
* Run tests : mvn test [-Pnative]
* Create JAR : mvn package
* Run findbugs : mvn compile findbugs:findbugs
* Run checkstyle : mvn compile checkstyle:checkstyle
* Install JAR in M2 cache : mvn install
* Deploy JAR to Maven repo : mvn deploy
* Run clover : mvn test -Pclover [-DcloverLicenseLocation=${user.name}/.clover.license]
* Run Rat : mvn apache-rat:check
* Build javadocs : mvn javadoc:javadoc
* Build distribution : mvn package [-Pdist][-Pdocs][-Psrc][-Pnative][-Dtar]
* Change Hadoop version : mvn versions:set -DnewVersion=NEWVERSION
Build options:
* Use -Pnative to compile/bundle native code
* Use -Pdocs to generate & bundle the documentation in the distribution (using -Pdist)
* Use -Psrc to create a project source TAR.GZ
* Use -Dtar to create a TAR with the distribution (using -Pdist)
Snappy build options:
Snappy is a compression library that can be utilized by the native code.
It is currently an optional component, meaning that Hadoop can be built with
or without this dependency.
* Use -Drequire.snappy to fail the build if libsnappy.so is not found.
If this option is not specified and the snappy library is missing,
we silently build a version of libhadoop.so that cannot make use of snappy.
This option is recommended if you plan on making use of snappy and want
to get more repeatable builds.
* Use -Dsnappy.prefix to specify a nonstandard location for the libsnappy
header files and library files. You do not need this option if you have
installed snappy using a package manager.
* Use -Dsnappy.lib to specify a nonstandard location for the libsnappy library
files. Similarly to snappy.prefix, you do not need this option if you have
installed snappy using a package manager.
* Use -Dbundle.snappy to copy the contents of the snappy.lib directory into
the final tar file. This option requires that -Dsnappy.lib is also given,
and it ignores the -Dsnappy.prefix option.
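For example, an illustrative combination of these flags (the library path is an assumption for the sketch):
$ mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy -Dbundle.snappy -Dsnappy.lib=/usr/local/lib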
OpenSSL build options:
OpenSSL includes a crypto library that can be utilized by the native code.
It is currently an optional component, meaning that Hadoop can be built with
or without this dependency.
* Use -Drequire.openssl to fail the build if libcrypto.so is not found.
If this option is not specified and the openssl library is missing,
we silently build a version of libhadoop.so that cannot make use of
openssl. This option is recommended if you plan on making use of openssl
and want to get more repeatable builds.
* Use -Dopenssl.prefix to specify a nonstandard location for the libcrypto
header files and library files. You do not need this option if you have
installed openssl using a package manager.
* Use -Dopenssl.lib to specify a nonstandard location for the libcrypto library
files. Similarly to openssl.prefix, you do not need this option if you have
installed openssl using a package manager.
* Use -Dbundle.openssl to copy the contents of the openssl.lib directory into
the final tar file. This option requires that -Dopenssl.lib is also given,
and it ignores the -Dopenssl.prefix option.
Tests options:
* Use -DskipTests to skip tests when running the following Maven goals:
'package', 'install', 'deploy' or 'verify'
* -Dtest=<TESTCLASSNAME>,<TESTCLASSNAME#METHODNAME>,....
* -Dtest.exclude=<TESTCLASSNAME>
* -Dtest.exclude.pattern=**/<TESTCLASSNAME1>.java,**/<TESTCLASSNAME2>.java
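For example, to run a single test class (the class name here is only an illustration):
$ mvn test -Dtest=TestDFSShell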
----------------------------------------------------------------------------------
Building components separately
If you are building a submodule directory, all the Hadoop dependencies this
submodule has will be resolved like all other 3rd party dependencies: that is,
from the Maven cache or from a Maven repository (if not available in the cache
or the SNAPSHOT 'timed out').
An alternative is to run 'mvn install -DskipTests' from the Hadoop source top
level once, and then work from the submodule. Keep in mind that SNAPSHOTs
time out after a while; using the Maven '-nsu' option will stop Maven from trying
to update SNAPSHOTs from external repos.
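A sketch of that workflow (the submodule chosen is just an example):
$ mvn install -DskipTests
$ cd hadoop-hdfs-project/hadoop-hdfs
$ mvn compile -nsu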
----------------------------------------------------------------------------------
Protocol Buffer compiler
The version of Protocol Buffer compiler, protoc, must match the version of the
protobuf JAR.
If you have multiple versions of protoc in your system, you can set in your
build shell the HADOOP_PROTOC_PATH environment variable to point to the one you
want to use for the Hadoop build. If you don't define this environment variable,
protoc is looked up in the PATH.
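For example, to point the build at the protoc installed in step 4 above (path assumed from that step):
$ export HADOOP_PROTOC_PATH=/home/work/protobuf/bin/protoc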
----------------------------------------------------------------------------------
Importing projects to eclipse
When you import the project to eclipse, install hadoop-maven-plugins at first.
$ cd hadoop-maven-plugins
$ mvn install
Then, generate eclipse project files.
$ mvn eclipse:eclipse -DskipTests
At last, import to eclipse by specifying the root directory of the project via
[File] > [Import] > [Existing Projects into Workspace].
----------------------------------------------------------------------------------
Building distributions:
Create binary distribution without native code and without documentation:
$ mvn package -Pdist -DskipTests -Dtar
Create binary distribution with native code and with documentation:
$ mvn package -Pdist,native,docs -DskipTests -Dtar
Create source distribution:
$ mvn package -Psrc -DskipTests
Create source and binary distributions with native code and documentation:
$ mvn package -Pdist,native,docs,src -DskipTests -Dtar
Create a local staging version of the website (in /tmp/hadoop-site)
$ mvn clean site; mvn site:stage -DstagingDirectory=/tmp/hadoop-site
----------------------------------------------------------------------------------
Installing Hadoop
Look for these HTML files after you build the documentation with the above commands.
* Single Node Setup:
hadoop-project-dist/hadoop-common/SingleCluster.html
* Cluster Setup:
hadoop-project-dist/hadoop-common/ClusterSetup.html
----------------------------------------------------------------------------------
Handling out of memory errors in builds
----------------------------------------------------------------------------------
If the build process fails with an out of memory error, you should be able to fix
it by increasing the memory used by Maven, which can be done via the environment
variable MAVEN_OPTS.
Here is an example setting to allocate between 256 and 512 MB of heap space to
Maven
export MAVEN_OPTS="-Xms256m -Xmx512m"
----------------------------------------------------------------------------------
Building on Windows
----------------------------------------------------------------------------------
Requirements:
* Windows System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer
* Windows SDK 7.1 or Visual Studio 2010 Professional
* Windows SDK 8.1 (if building CPU rate control for the container executor)
* zlib headers (if building native code bindings for zlib)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip. These
tools must be present on your PATH.
Unix command-line tools are also included with the Windows Git package which
can be downloaded from http://git-scm.com/download/win.
If using Visual Studio, it must be Visual Studio 2010 Professional (not 2012).
Do not use Visual Studio Express. It does not support compiling for 64-bit,
which is problematic if running a 64-bit system. The Windows SDK 7.1 is free to
download here:
http://www.microsoft.com/en-us/download/details.aspx?id=8279
The Windows SDK 8.1 is available to download at:
http://msdn.microsoft.com/en-us/windows/bg162891.aspx
Cygwin is neither required nor supported.
----------------------------------------------------------------------------------
Building:
Keep the source code tree in a short path to avoid running into problems related
to Windows maximum path length limitation. (For example, C:\hdc).
Run builds from a Windows SDK Command Prompt. (Start, All Programs,
Microsoft Windows SDK v7.1, Windows SDK 7.1 Command Prompt.)
JAVA_HOME must be set, and the path must not contain spaces. If the full path
would contain spaces, then use the Windows short path instead.
You must set the Platform environment variable to either x64 or Win32 depending
on whether you're running a 64-bit or 32-bit system. Note that this is
case-sensitive. It must be "Platform", not "PLATFORM" or "platform".
Environment variables on Windows are usually case-insensitive, but Maven treats
them as case-sensitive. Failure to set this environment variable correctly will
cause msbuild to fail while building the native code in hadoop-common.
set Platform=x64 (when building on a 64-bit system)
set Platform=Win32 (when building on a 32-bit system)
Several tests require that the user must have the Create Symbolic Links
privilege.
All Maven goals are the same as described above with the exception that
native code is built by enabling the 'native-win' Maven profile. -Pnative-win
is enabled by default when building on Windows since the native components
are required (not optional) on Windows.
If native code bindings for zlib are required, then the zlib headers must be
deployed on the build machine. Set the ZLIB_HOME environment variable to the
directory containing the headers.
set ZLIB_HOME=C:\zlib-1.2.7
At runtime, zlib1.dll must be accessible on the PATH. Hadoop has been tested
with zlib 1.2.7, built using Visual Studio 2010 out of contrib\vstudio\vc10 in
the zlib 1.2.7 source tree.
http://www.zlib.net/
----------------------------------------------------------------------------------
Building distributions:
* Build distribution with native code : mvn package [-Pdist][-Pdocs][-Psrc][-Dtar]