1. Set up the JDK environment
>>> For JDK/Hadoop version compatibility, see: http://wiki.apache.org/hadoop/HadoopJavaVersions
>>> Use at least Oracle JDK 1.6.0_20. This build uses JDK 1.7.0_21.
[bruce@iRobot hadoop-install]$ cat ~/.bash_profile
# .bash_profile
# Get the aliases and functions
if [ -f ~/.bashrc ]; then
. ~/.bashrc
fi
# User specific environment and startup programs
PATH=$PATH:$HOME/bin
#java settings
export PATH
export JAVA_HOME=/u01/app/software/jdk1.7.0_21
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
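The effect of the PATH lines above can be verified in a fresh shell. A minimal sketch, using the JDK install directory from this article (adjust it to your own install):

```shell
# Prepend the JDK to PATH exactly as .bash_profile above does
export JAVA_HOME=/u01/app/software/jdk1.7.0_21   # article's install dir
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH

# The JDK bin directory should now be the first PATH entry,
# so `java` and `javac` resolve to this JDK
first_entry=$(printf '%s' "$PATH" | cut -d: -f1)
echo "$first_entry"
```

If `java -version` does not report 1.7.0_21 after this, another JDK is shadowing it earlier on the PATH.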
2. Create the Hadoop directory
[bruce@iRobot hadoop-install]$ pwd
/home/bruce/hadoop-install
3. Download Hadoop
Apache Hadoop binary tarball: (on 64-bit systems, the native library bundled in the binary tarball needs to be recompiled; otherwise Hadoop prints a warning at startup.)
wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.5.2/hadoop-2.5.2.tar.gz
Apache Hadoop source tarball:
wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.5.2/hadoop-2.5.2-src.tar.gz
[bruce@iRobot hadoop-install]$ ls
hadoop-2.5.2 hadoop-2.5.2-src hadoop-2.5.2-src.tar.gz hadoop-2.5.2.tar.gz
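For reference, the startup warning mentioned above is Hadoop's stock NativeCodeLoader message; grepping the daemon logs for it tells you whether the bundled native library failed to load. A sketch over a sample log (the log file below is fabricated for illustration, but the warning text is Hadoop's standard message):

```shell
# Simulate a daemon log containing the stock native-library warning
cat > sample.log <<'EOF'
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
EOF

# If this matches, the bundled native library did not load and a rebuild helps
grep -q 'Unable to load native-hadoop library' sample.log && echo "rebuild needed"
```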
4. Build Hadoop's native library:
---------------------------
(1) Set up the build environment
Install the dependency packages on RHEL:
yum install svn autoconf automake libtool cmake ncurses-devel openssl-devel gcc
...(Install Maven, protobuf, and Ant. For anything else, install packages as the build errors prompt you. On RHEL it is best to set up a local yum repository, otherwise chasing dependencies by hand is painful.)
wget http://mirrors.hust.edu.cn/apache/maven/maven-3/3.3.3/binaries/apache-maven-3.3.3-bin.tar.gz
wget https://github.com/google/protobuf/archive/v2.5.0.zip
...
wget http://mirrors.cnnic.cn/apache/ant/source/apache-ant-1.9.6-src.zip
...
(2) Create the binary distribution with native code and without documentation:
$pwd
/home/bruce/hadoop-install/hadoop-2.5.2-src
Build the native library and the dist tarball, skipping javadoc:
$ mvn package -Dmaven.javadoc.skip=true -Pdist,native -DskipTests -Dtar
>>>> The build may fail midway with the error below. It is caused by a protoc version that is too old; protoc 2.5+ is required.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127
>>> You need to ensure you have protobuf installed and protoc is on your PATH.
Download: https://github.com/google/protobuf/tree/v2.5.0
$wget https://github.com/google/protobuf/archive/v2.5.0.zip
To build and install the C++ Protocol Buffer runtime and the Protocol
Buffer compiler (protoc) execute the following:
$ autoreconf -f -i -Wall,no-obsolete <== run this only if the configure script is missing
$ ./configure
$ make
$ make check
$ sudo make install
If "make check" fails, you can still install, but it is likely that
some features of this library will not work correctly on your system.
Proceed at your own risk.
"make install" may require superuser privileges.
For advanced usage information on configure and make, see INSTALL.txt.
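Before re-running the Maven build, it is worth confirming that the protoc now on the PATH is 2.5 or newer. A small sketch of such a check; the `version_ge` helper is my own addition, and in practice you would feed it the real `protoc --version` output:

```shell
# Hadoop 2.5.2 requires protoc >= 2.5.0; compare versions field by field
version_ge() {
  # succeeds (exit 0) when $1 >= $2, using GNU sort's version ordering
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

required=2.5.0
have="2.5.0"   # in practice: have=$(protoc --version | awk '{print $2}')
if version_ge "$have" "$required"; then
  echo "protoc $have is OK"
else
  echo "protoc $have is too old, need >= $required"
fi
```

Also make sure the freshly installed protoc is the one found first on the PATH, or Maven will keep picking up the old binary.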
>>>>>
On a successful build, Maven prints:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Skipping javadoc generation
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 1.712 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 1.096 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 2.250 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.325 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 1.948 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 2.319 s]
[INFO] Apache Hadoop MiniKDC .............................. SUC
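With a build this large, it helps to save the Maven output to a file and scan the reactor summary for any module that did not end in SUCCESS. A sketch over a saved log; the file name and the FAILURE entry are hypothetical, but the line format mimics the reactor summary shown above:

```shell
# Simulate a saved build log with one failed module
cat > build.log <<'EOF'
[INFO] Apache Hadoop Main ................................. SUCCESS [  1.712 s]
[INFO] Apache Hadoop Common ............................... FAILURE [ 12.003 s]
EOF

# Print any failed modules; grep's exit status doubles as a pass/fail signal
grep 'FAILURE' build.log && echo "build failed"
```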
Apache Hadoop Installation
This article records in detail the steps for installing Apache Hadoop 2.5.2 on Linux: environment setup, dependency installation, building from source, resolving build errors, creating symlinks, and editing the configuration files. It also covers starting and running Hadoop in local (standalone) mode, pseudo-distributed mode, and single-node YARN mode.