1. Generate a Hadoop patch
For a single file:
diff -u from-file to-file
For multiple files (a whole directory tree):
diff -uNr from-dir to-dir > to-dir.patch
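As a minimal sketch of the multi-file case, the following creates two small illustrative trees (the `from-dir`/`to-dir` names and file contents are placeholders, not from the Hadoop source) and produces a unified patch from their differences:

```shell
# Build a tiny "before" and "after" tree (placeholder names/content).
mkdir -p from-dir to-dir
printf 'hello\n' > from-dir/a.txt
printf 'hello world\n' > to-dir/a.txt

# -u: unified format; -N: treat absent files as empty; -r: recurse.
# diff exits 1 when the trees differ, so tolerate that exit code.
diff -uNr from-dir to-dir > to-dir.patch || true

cat to-dir.patch
```

The resulting to-dir.patch contains one unified hunk per changed file, with paths relative to the directory where diff was run, which is what makes -p0 work later.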
2. Apply the patch in the Hadoop root directory:
patch -p0 < Hadoop-0.20.2-v1382.patch
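A small self-contained sketch of applying a patch with -p0 (file names here are illustrative, not the Hadoop tree); --dry-run is a convenient way to confirm the patch applies cleanly before touching any files:

```shell
# Create an illustrative file and a modified copy, then diff them.
mkdir -p src
printf 'old line\n' > src/app.txt
printf 'new line\n' > src/app.txt.new
diff -u src/app.txt src/app.txt.new > fix.patch || true   # diff exits 1 on differences
rm src/app.txt.new                                        # keep only the target file

# -p0 uses the paths in the patch as-is, so run from the same root
# the patch was generated from (the Hadoop root, in this article's case).
patch -p0 --dry-run < fix.patch   # check first: no files are modified
patch -p0 < fix.patch             # actually apply
```

After applying, src/app.txt contains the new content.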
3. If you need to roll back the patch and restore the previous version, run:
patch -RE -p0 < Hadoop-0.20.2-v1382.patch
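The rollback can be sketched end to end with a throwaway file (names are illustrative): -R reverses the patch, and -E removes any output file the reversal leaves empty.

```shell
# Illustrative file plus a modified copy, diffed into a patch.
mkdir -p proj
printf 'v1\n' > proj/conf.txt
printf 'v2\n' > proj/conf.txt.v2
diff -u proj/conf.txt proj/conf.txt.v2 > change.patch || true
rm proj/conf.txt.v2

patch -p0 < change.patch       # forward: conf.txt now reads "v2"
patch -RE -p0 < change.patch   # reverse: conf.txt restored to "v1"
```

Because -R just swaps the roles of the hunks, the same patch file drives both the install and the rollback.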
4. Run ant in the Hadoop root directory to start the build:
ant
If this step fails with the following error:
BUILD FAILED /home/hadoop/hadoop-0.20.2/build.xml:1624: Class org.apache.tools.ant.taskdefs.ConditionTask doesn't support the nested "typefound" element.
the ant version is too old. Building with the system-provided ant 1.6.5 produced exactly this error; downloading ant 1.8.0 from the Apache website fixed it.
Installing ant is simple: extract it to /home/hadoop/ant, then add the path in ~/.bashrc:
# .bashrc

# Source global definitions
if [ -f /etc/bashrc ]; then
    . /etc/bashrc
fi

# User specific aliases and functions
export ANT_HOME=/home/hadoop/ant
PATH=$ANT_HOME/bin:$PATH:$HOME/bin:/home/hadoop/hadoop/bin
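To confirm the new ant is the one being picked up after reloading the shell configuration (a usage sketch assuming the /home/hadoop/ant layout above):

```shell
source ~/.bashrc
echo "$ANT_HOME"   # expect /home/hadoop/ant
which ant          # expect /home/hadoop/ant/bin/ant
ant -version       # expect 1.8.0 or newer
```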
After ant finishes successfully, run
ant jar
This generates a hadoop-0.20.3-dev-core.jar file under the build directory, which is the file we need.
Replace the hadoop-*-core.jar file in the Hadoop root directory with it, then distribute it to all nodes. Restart HDFS and you are done.
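The replace-and-distribute step can be sketched as below. The node names (node1, node2) and directory layout are hypothetical; on a real cluster the cp into staging/ would be an scp or rsync to each node's Hadoop installation directory.

```shell
# Simulate the build output and an existing install (hypothetical paths).
mkdir -p build hadoop
printf 'rebuilt jar\n' > build/hadoop-0.20.3-dev-core.jar
printf 'old jar\n'     > hadoop/hadoop-0.20.2-core.jar

# Replace the old core jar in the Hadoop root with the freshly built one.
rm hadoop/hadoop-*-core.jar
cp build/hadoop-0.20.3-dev-core.jar hadoop/

# Stage a copy per node; with real hosts this would be e.g.
#   scp hadoop/hadoop-0.20.3-dev-core.jar $node:/home/hadoop/hadoop/
for node in node1 node2; do
  mkdir -p "staging/$node"
  cp hadoop/hadoop-0.20.3-dev-core.jar "staging/$node/"
done
```

After every node has the new jar, restarting HDFS picks up the patched classes.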