Compiling the Hadoop 2.6.0 native library on Mac

  1. Adjust all paths in this article to match your own setup.
  2. A pre-built native library is available from the author's download page: http://download.csdn.net/detail/tterminator/9565597

I. Why compile the native library

After installing Hadoop on a Mac in standalone mode and starting it, the following warning appears: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable.

It is worth compiling the library yourself: many of the prebuilt native libraries and recipes circulating online do not work, and building locally is also what the official documentation recommends.

II. Cause of the problem

The official documentation states:
The pre-built 32-bit i386-Linux native hadoop library is available as part of the hadoop distribution and is located in the lib/native directory. You can download the hadoop distribution from Hadoop Common Releases.

So the native library bundled with the official Hadoop download targets 32-bit Linux only.

The native hadoop library is supported on *nix platforms only. The library does not to work with Cygwin or the Mac OS X platform.

The documentation explicitly says the prebuilt native library does not work on Mac OS X, which is exactly why it must be compiled on the Mac itself.

For details see http://hadoop.apache.org/docs/r2.6.0/hadoop-project-dist/hadoop-common/NativeLibraries.html

III. Build environment

  1. Mac OS X: 10.10.4
  2. JDK: java version "1.7.0_80"
  3. Hadoop: 2.6.0-src

IV. Prerequisites

Install the following software on the Mac before starting the build:
1. Install Homebrew

Homebrew is a package manager for macOS, similar to apt on Ubuntu, used to install missing packages; here it is used to install cmake: brew install cmake.

2. Install cmake

No particular version is required.

3. Install protoc

The version must be exactly 2.5.0 or the build fails. Do not install it with brew install protobuf, because the version that installs is not necessarily 2.5.0.
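Since a wrong protoc only breaks the build much later, it can help to fail fast. A minimal sketch (the function name is mine; it takes the output of `protoc --version` as an argument, so the check itself can be exercised without protoc installed):

```shell
#!/bin/sh
# Fail fast unless the reported protoc version is exactly 2.5.0,
# which Hadoop 2.6.0 requires for its generated protobuf sources.
require_protoc_250() {
  case "$1" in
    "libprotoc 2.5.0") return 0 ;;
    *) echo "need protoc 2.5.0, got: ${1:-nothing}" >&2; return 1 ;;
  esac
}
```

In practice, run `require_protoc_250 "$(protoc --version)" || exit 1` before starting the Maven build.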

The protobuf 2.5.0 source is available from the author's download page (http://download.csdn.net/detail/tterminator/9562400) and has to be built by hand. The steps are straightforward:
(1) Configure the install prefix:
./configure --prefix=/Users/King-pan/software/tools/protobuf
where /Users/King-pan/software/tools/protobuf is whatever install directory you choose.
(2) Build and install:
make
make install
(3) Open your shell profile:
vi ~/.bash_profile
(4) Add these lines:
export PROTOBUF=/Users/King-pan/software/tools/protobuf
export PATH=$PROTOBUF/bin:$PATH
(5) Verify:
protoc --version

4. Install maven

The version used for this build: Apache Maven 3.3.3.

V. One last fix before building

Without it, the build fails with the following error:
Exception in thread "main" java.lang.AssertionError: Missing tools.jar at: /Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/Classes/classes.jar. Expression: file.exists()

Fix:
Under JAVA_HOME (i.e. /Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home), create a directory named Classes and create a symbolic link into it (absolute paths recommended):
sudo ln -s $JAVA_HOME/lib/tools.jar $JAVA_HOME/Classes/classes.jar
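The two commands above can be wrapped so that re-running them is harmless; a sketch (the function name is mine; it assumes the given JDK home actually ships lib/tools.jar, as JDK 7 does):

```shell
#!/bin/sh
# Create Classes/classes.jar -> lib/tools.jar under the given JDK home,
# creating the Classes directory and skipping the link if it already exists.
link_tools_jar() {
  jhome="$1"
  mkdir -p "$jhome/Classes"
  [ -e "$jhome/Classes/classes.jar" ] || \
    ln -s "$jhome/lib/tools.jar" "$jhome/Classes/classes.jar"
}
```

Invoke it as `link_tools_jar "$JAVA_HOME"` (with sudo if the JDK home is root-owned).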

VI. Run the build

  1. In the root of the Hadoop 2.6.0 source tree, run:
    mvn package -Pdist,native -DskipTests -Dtar

  2. After a long wait the build finishes with a reactor summary like this:
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Apache Hadoop Main ................................. SUCCESS [ 1.206 s]
    [INFO] Apache Hadoop Project POM .......................... SUCCESS [06:57 min]
    [INFO] Apache Hadoop Annotations .......................... SUCCESS [03:22 min]
    [INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.272 s]
    [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [02:02 min]
    [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [02:46 min]
    [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [19:36 min]
    [INFO] Apache Hadoop Auth ................................. SUCCESS [07:47 min]
    [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [01:21 min]
    [INFO] Apache Hadoop Common ............................... SUCCESS [12:20 min]
    [INFO] Apache Hadoop NFS .................................. SUCCESS [ 4.994 s]
    [INFO] Apache Hadoop KMS .................................. SUCCESS [ 30.076 s]
    [INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.044 s]
    [INFO] Apache Hadoop HDFS ................................. SUCCESS [07:09 min]
    [INFO] Apache Hadoop HttpFS ............................... SUCCESS [01:46 min]
    [INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [03:25 min]
    [INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 3.440 s]
    [INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.027 s]
    [INFO] hadoop-yarn ........................................ SUCCESS [ 0.031 s]
    [INFO] hadoop-yarn-api .................................... SUCCESS [01:08 min]
    [INFO] hadoop-yarn-common ................................. SUCCESS [01:32 min]
    [INFO] hadoop-yarn-server ................................. SUCCESS [ 0.027 s]
    [INFO] hadoop-yarn-server-common .......................... SUCCESS [01:21 min]
    [INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [01:49 min]
    [INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 2.149 s]
    [INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 4.729 s]
    [INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 15.844 s]
    [INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 4.096 s]
    [INFO] hadoop-yarn-client ................................. SUCCESS [ 5.619 s]
    [INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.024 s]
    [INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 1.940 s]
    [INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 1.541 s]
    [INFO] hadoop-yarn-site ................................... SUCCESS [ 0.031 s]
    [INFO] hadoop-yarn-registry ............................... SUCCESS [ 3.909 s]
    [INFO] hadoop-yarn-project ................................ SUCCESS [ 3.328 s]
    [INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.040 s]
    [INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 17.288 s]
    [INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 13.444 s]
    [INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 2.979 s]
    [INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 7.477 s]
    [INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 5.705 s]
    [INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [01:24 min]
    [INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 1.265 s]
    [INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 4.280 s]
    [INFO] hadoop-mapreduce ................................... SUCCESS [ 3.765 s]
    [INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 16.991 s]
    [INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 31.676 s]
    [INFO] Apache Hadoop Archives ............................. SUCCESS [ 1.504 s]
    [INFO] Apache Hadoop Rumen ................................ SUCCESS [ 4.793 s]
    [INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 3.410 s]
    [INFO] Apache Hadoop Data Join ............................ SUCCESS [ 1.948 s]
    [INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 1.599 s]
    [INFO] Apache Hadoop Extras ............................... SUCCESS [ 2.169 s]
    [INFO] Apache Hadoop Pipes ................................ SUCCESS [ 6.456 s]
    [INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 4.146 s]
    [INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [08:32 min]
    [INFO] Apache Hadoop Client ............................... SUCCESS [ 6.769 s]
    [INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.112 s]
    [INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 3.510 s]
    [INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 11.581 s]
    [INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.021 s]
    [INFO] Apache Hadoop Distribution ......................... SUCCESS [ 26.965 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 01:28 h
    [INFO] Finished at: 2016-06-28T01:40:35+08:00
    [INFO] Final Memory: 193M/874M
    [INFO] ------------------------------------------------------------------------

VII. Copy the built native library into the binary Hadoop 2.6.0 distribution

  1. The built native library ends up at
    hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native
  2. Copy it into the binary Hadoop 2.6.0 distribution at
    hadoop-2.6.0/lib/native
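Steps 1-2 above amount to one recursive copy; a sketch (the function name is mine; the two arguments are the source-tree dist output and the binary install directory from the layout above):

```shell
#!/bin/sh
# Copy the freshly built native libraries from the source tree's
# dist output into the binary distribution's lib/native directory.
copy_native_libs() {
  src="$1/lib/native"
  dst="$2/lib/native"
  mkdir -p "$dst"
  cp -R "$src/." "$dst/"
}
```

For the paths in this article: `copy_native_libs hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0 hadoop-2.6.0`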

VIII. Edit etc/hadoop/hadoop-env.sh under the Hadoop install directory

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.library.path=/hadoop-2.6.0/lib/native"
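The library path above is site-specific and should be the absolute path of your own install. A sketch of the same setting, assuming (hypothetically) that Hadoop was unpacked under /usr/local/hadoop-2.6.0:

```shell
# In etc/hadoop/hadoop-env.sh -- /usr/local/hadoop-2.6.0 is an example
# install location; substitute the directory you actually unpacked to.
export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.library.path=/usr/local/hadoop-2.6.0/lib/native"
```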

IX. Restart Hadoop

The warning quoted at the beginning of this article should now be gone.

X. Appendix

  1. ssh localhost fails with: connection closed by ::1
    Fix:
    The Mac OS X system log (/var/log/system.log) shows:
    (1) Jun 28 09:29:29 bj-m-203544a.local sshd[48173]: error: Could not load host key: /etc/ssh_host_rsa_key
    (2) Jun 28 09:29:29 bj-m-203544a.local sshd[48173]: error: Could not load host key: /etc/ssh_host_dsa_key
    As the messages suggest, copy .ssh/id_dsa and .ssh/id_rsa to /etc and rename them ssh_host_dsa_key and ssh_host_rsa_key respectively.

  2. Passwordless ssh localhost
    Change the permissions of .ssh/authorized_keys to 644:
    (1) chmod 644 authorized_keys
    (2) Set the PasswordAuthentication option in /etc/ssh_config to no. [This may affect other ssh logins that do need a password; if so, set it back to yes.]
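sshd is also picky about the permissions of the containing directory, not just authorized_keys. A sketch that tightens both (the 700 on ~/.ssh is standard sshd practice, an addition beyond the steps above; the function name is mine):

```shell
#!/bin/sh
# Set the permissions sshd expects for public-key login:
# ~/.ssh must not be group/world accessible, authorized_keys 644 as above.
fix_ssh_perms() {
  sshdir="$1"
  chmod 700 "$sshdir"
  chmod 644 "$sshdir/authorized_keys"
}
```

Invoke it as `fix_ssh_perms "$HOME/.ssh"`.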

  3. A packaged copy of this standalone-mode Hadoop configuration is available from the author's download page.

  4. Miscellaneous
    (1) Start Hadoop:
    ./start-dfs.sh
    ./start-yarn.sh

    (2) Stop Hadoop:
    ./stop-dfs.sh
    ./stop-yarn.sh

    (3) hadoop fs -mkdir /tmp
    (4) hadoop fs -copyFromLocal ~/word.txt /tmp
    (5) Hadoop web UI:
    http://localhost:50070
    (6) YARN web UI:
    http://localhost:8098/cluster [the port is set in yarn-site.xml]
    (7) hadoop fs -rm /tmp/out/part-r-00000
    (8) hadoop fs -rmdir /tmp/out/
    Note: if a Hadoop daemon does not come up, check the logs under hadoop/logs/, which record the details:
    -rw-r--r-- 1 junwei8 staff 0 Jun 28 10:16 SecurityAuth-junwei8.audit
    -rw-r--r-- 1 junwei8 staff 65411 Jun 28 11:39 hadoop-junwei8-datanode-bj-m-203544a.local.log
    -rw-r--r-- 1 junwei8 staff 511 Jun 28 11:39 hadoop-junwei8-datanode-bj-m-203544a.local.out
    -rw-r--r-- 1 junwei8 staff 511 Jun 28 10:16 hadoop-junwei8-datanode-bj-m-203544a.local.out.1
    -rw-r--r-- 1 junwei8 staff 87710 Jun 28 11:40 hadoop-junwei8-namenode-bj-m-203544a.local.log
    -rw-r--r-- 1 junwei8 staff 511 Jun 28 11:39 hadoop-junwei8-namenode-bj-m-203544a.local.out
    -rw-r--r-- 1 junwei8 staff 511 Jun 28 10:16 hadoop-junwei8-namenode-bj-m-203544a.local.out.1
    -rw-r--r-- 1 junwei8 staff 64683 Jun 28 11:40 hadoop-junwei8-secondarynamenode-bj-m-203544a.local.log
    -rw-r--r-- 1 junwei8 staff 511 Jun 28 11:39 hadoop-junwei8-secondarynamenode-bj-m-203544a.local.out
    -rw-r--r-- 1 junwei8 staff 511 Jun 28 10:16 hadoop-junwei8-secondarynamenode-bj-m-203544a.local.out.1
    drwxr-xr-x 2 junwei8 staff 68 Jun 28 11:40 userlogs
    -rw-r--r-- 1 junwei8 staff 77547 Jun 28 11:40 yarn-junwei8-nodemanager-bj-m-203544a.local.log
    -rw-r--r-- 1 junwei8 staff 494 Jun 28 11:40 yarn-junwei8-nodemanager-bj-m-203544a.local.out
    -rw-r--r-- 1 junwei8 staff 494 Jun 28 10:17 yarn-junwei8-nodemanager-bj-m-203544a.local.out.1
    -rw-r--r-- 1 junwei8 staff 251199 Jun 28 11:40 yarn-junwei8-resourcemanager-bj-m-203544a.local.log
    -rw-r--r-- 1 junwei8 staff 494 Jun 28 11:40 yarn-junwei8-resourcemanager-bj-m-203544a.local.out
    -rw-r--r-- 1 junwei8 staff 494 Jun 28 10:17 yarn-junwei8-resourcemanager-bj-m-203544a.local.out.1
    -rw-r--r-- 1 junwei8 staff 494 Jun 27 21:18 yarn-junwei8-resourcemanager-bj-m-203544a.local.out.2
    -rw-r--r-- 1 junwei8 staff 494 Jun 27 21:17 yarn-junwei8-resourcemanager-bj-m-203544a.local.out.3

XI. References:

  1. http://www.rockyfeng.me/hadoop_native_library_mac.html
  2. http://my.oschina.net/KingPan/blog/283881?p=1
  3. http://hadoop.apache.org/docs/r2.6.0/hadoop-project-dist/hadoop-common/NativeLibraries.html
  4. http://leibnitz.iteye.com/blog/2149745