Hadoop ships CMake build scripts for libhdfs, so install CMake before compiling.
Then change into the libhdfs source directory, e.g. /data/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs/src,
and run cmake to generate the Makefile (assuming the JDK is installed at /data/jdk1.7.0_55):
cmake -DGENERATED_JAVAH=/data/jdk1.7.0_55 -DJAVA_HOME=/data/jdk1.7.0_55 .
On success a Makefile is written to the directory, after which running make builds libhdfs.so and libhdfs.a.
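Put together, the whole sequence might look like the sketch below. The paths are the ones used above; adjust them for your environment, and note that the find command at the end is just one way to locate the built libraries, since the exact output directory depends on the CMakeLists of your Hadoop version.

```shell
# Sketch of the full libhdfs build, assuming the source tree and JDK
# paths used in this note.
cd /data/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs/src

# Configure: generates the Makefile in the current directory.
cmake -DGENERATED_JAVAH=/data/jdk1.7.0_55 -DJAVA_HOME=/data/jdk1.7.0_55 .

# Build the shared and static libraries.
make

# Locate the resulting artifacts in the build tree.
find . -name 'libhdfs.so*' -o -name 'libhdfs.a'
```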
If you hit an error like this:
/data/jdk1.7.0_55/jre/lib/amd64/server/libjvm.so: file not recognized: File format not recognized
then consider upgrading the linker ld; see the notes at http://blog.chinaunix.net/uid-20682147-id-4239779.html.
Both ld and nm belong to GNU binutils; newer releases can be downloaded from http://ftp.gnu.org/gnu/binutils/.
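Building a newer binutils from source follows the usual configure/make pattern. A sketch is below; the version number 2.25 and the /usr/local install prefix are example choices, not requirements, so pick the release and prefix that suit your system.

```shell
# Sketch: build and install a newer GNU binutils (ld, nm, ...).
# Version 2.25 and prefix /usr/local are assumptions; adjust as needed.
wget http://ftp.gnu.org/gnu/binutils/binutils-2.25.tar.gz
tar xzf binutils-2.25.tar.gz
cd binutils-2.25

./configure --prefix=/usr/local
make
make install   # may require root privileges for /usr/local
```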
Note that after upgrading gcc and ld you must update the PATH environment variable before re-running cmake; otherwise the old gcc and ld may still be picked up.
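The PATH update and re-run could look like the following sketch. The /usr/local/bin location assumes that is where the upgraded tools were installed; deleting CMakeCache.txt is one way to force CMake to re-detect the toolchain, since it caches the compiler paths from the first run.

```shell
# Assumption: the upgraded gcc/ld were installed under /usr/local.
export PATH=/usr/local/bin:$PATH

# Verify the new tools are the ones found first on PATH.
which ld  && ld  --version | head -n1
which gcc && gcc --version | head -n1

# CMake caches the detected compilers; clear the cache so it re-detects.
cd /data/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs/src
rm -f CMakeCache.txt
cmake -DGENERATED_JAVAH=/data/jdk1.7.0_55 -DJAVA_HOME=/data/jdk1.7.0_55 .
```

A re-run that picks up the new toolchain should report the new compiler version in the "compiler identification" lines of the CMake output, like the log below.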
/data/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs/src # cmake -DGENERATED_JAVAH=/data/java_1_7 -DJAVA_HOME=/data/java_1_7
-- The C compiler identification is GNU 4.1.2
-- The CXX compiler identification is GNU 4.1.2
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
JAVA_HOME=/data/java_1_7, JAVA_JVM_LIBRARY=/data/java_1_7/jre/lib/amd64/server/libjvm.so
JAVA_INCLUDE_PATH=/data/java_1_7/include, JAVA_INCLUDE_PATH2=/data/java_1_7/include/linux
Located all JNI components successfully.
-- Performing Test HAVE_BETTER_TLS
-- Performing Test HAVE_BETTER_TLS - Success
-- Performing Test HAVE_INTEL_SSE_INTRINSICS
-- Performing Test HAVE_INTEL_SSE_INTRINSICS - Success
-- Looking for dlopen in dl
-- Looking for dlopen in dl - found
-- Found JNI: /data/java_1_7/jre/lib/amd64/libjawt.so
-- Found PkgConfig: /usr/bin/pkg-config (found version "0.20")
-- checking for module 'fuse'
-- package 'fuse' not found
-- Failed to find Linux FUSE libraries or include files. Will not build FUSE client.
-- Configuring done
-- Generating done
-- Build files have been written to: /data/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs/src