Compiling Hadoop 2.7.6 on Ubuntu 18.04 with Snappy support

Download from the official site

http://hadoop.apache.org/releases.html

Extract

Extract the Hadoop source archive.

Read BUILDING.txt in the source root; it contains the official build instructions:

Build instructions for Hadoop

----------------------------------------------------------------------------------
Requirements:

* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel ( if compiling native hadoop-pipes and to get the best HDFS encryption performance )
* Linux FUSE (Filesystem in Userspace) version 2.6 or above ( if compiling fuse_dfs )
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

----------------------------------------------------------------------------------
Installing required packages for clean install of Ubuntu 14.04 LTS Desktop:

* Oracle JDK 1.7 (preferred)
  $ sudo apt-get purge openjdk*
  $ sudo apt-get install software-properties-common
  $ sudo add-apt-repository ppa:webupd8team/java
  $ sudo apt-get update
  $ sudo apt-get install oracle-java7-installer
* Maven
  $ sudo apt-get -y install maven
* Native libraries
  $ sudo apt-get -y install build-essential autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev
* ProtocolBuffer 2.5.0 (required)
  $ sudo apt-get -y install libprotobuf-dev protobuf-compiler

Optional packages:

* Snappy compression
  $ sudo apt-get install snappy libsnappy-dev
* Bzip2
  $ sudo apt-get install bzip2 libbz2-dev
* Jansson (C Library for JSON)
  $ sudo apt-get install libjansson-dev
* Linux FUSE
  $ sudo apt-get install fuse libfuse-dev

Installing dependencies

I won't cover installing the JDK and Maven here — if you can't manage those, you're not ready for Hadoop anyway.

Native dependencies

  • sudo apt-get -y install build-essential autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev snappy libsnappy-dev bzip2 libbz2-dev libjansson-dev fuse libfuse-dev

Note: install libssl1.0-dev instead of libssl-dev. I originally installed the default libssl-dev, and the build kept getting stuck and failing while compiling the OpenSSL native code. The error looked like this:
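
The root cause is an OpenSSL API change, not a Hadoop bug: OpensslCipher.c dereferences `EVP_CIPHER_CTX` directly, which only compiles against the OpenSSL 1.0 API, because 1.1.0 made that struct opaque. As a sketch, here is a small helper of my own (the release-to-package mapping is my assumption, not from the original post) for picking the right dev package:

```shell
# Hypothetical helper: map an Ubuntu release to an OpenSSL dev package that
# Hadoop 2.7.x's native code can compile against.
pick_ssl_pkg() {
  case "$1" in
    18.04) echo "libssl1.0-dev" ;;      # 18.04 defaults to 1.1.x, so pin the 1.0 headers
    14.04|16.04) echo "libssl-dev" ;;   # these releases still ship the 1.0 series
    *) echo "libssl1.0-dev" ;;          # assume later releases also need the pin
  esac
}
pick_ssl_pkg 18.04    # prints: libssl1.0-dev
```

On 18.04 this amounts to `sudo apt-get remove libssl-dev && sudo apt-get -y install libssl1.0-dev` before re-running the build.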

project/hadoop-common/src/main/native/src -I/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src -I/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/src -I/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/target/native -I/opt/jdk1.8.0_171/include -I/opt/jdk1.8.0_171/include/linux -I/usr/local/include -I/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util  -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64   -o CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.c.o   -c /home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.c
     [exec] [ 19%] Building C object CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c.o
     [exec] /usr/bin/cc  -I/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/target/native/javah -I/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/main/native/src -I/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src -I/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/src -I/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/target/native -I/opt/jdk1.8.0_171/include -I/opt/jdk1.8.0_171/include/linux -I/usr/local/include -I/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util  -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64   -o CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c.o   -c /home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c
     [exec] CMakeFiles/hadoop_static.dir/build.make:230: recipe for target 'CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c.o' failed
     [exec] make[2]: Leaving directory '/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/target/native'
     [exec] CMakeFiles/Makefile2:104: recipe for target 'CMakeFiles/hadoop_static.dir/all' failed
     [exec] make[1]: Leaving directory '/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/target/native'
     [exec] Makefile:83: recipe for target 'all' failed
     [exec] /home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c: In function ‘check_update_max_output_len’:
     [exec] /home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c:256:14: error: dereferencing pointer to incomplete type ‘EVP_CIPHER_CTX {aka struct evp_cipher_ctx_st}’
     [exec]    if (context->flags & EVP_CIPH_NO_PADDING) {
     [exec]               ^~
     [exec] /home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c:275:1: warning: control reaches end of non-void function [-Wreturn-type]
     [exec]  }
     [exec]  ^
     [exec] /home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c: In function ‘check_doFinal_max_output_len’:
     [exec] /home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c:320:1: warning: control reaches end of non-void function [-Wreturn-type]
     [exec]  }
     [exec]  ^
     [exec] make[2]: *** [CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c.o] Error 1
     [exec] make[1]: *** [CMakeFiles/hadoop_static.dir/all] Error 2
     [exec] make: *** [all] Error 2
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 20.675 s
[INFO] Finished at: 2018-07-19T20:27:17+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: exec returned: 2
[ERROR] around Ant part ...<exec failonerror="true" dir="/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/target/native" executable="make">... @ 7:140 in /home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: exec returned: 2
around Ant part ...<exec failonerror="true" dir="/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/target/native" executable="make">... @ 7:140 in /home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:954)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 2
around Ant part ...<exec failonerror="true" dir="/home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/target/native" executable="make">... @ 7:140 in /home/benny/software/hadoop-2.7.6-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
    at org.apache.maven.plugin.antrun.AntRunMojo.execute (AntRunMojo.java:355)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:954)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.tools.ant.BuildException: exec returned: 2
    at org.apache.tools.ant.taskdefs.ExecTask.runExecute (ExecTask.java:646)
    at org.apache.tools.ant.taskdefs.ExecTask.runExec (ExecTask.java:672)
    at org.apache.tools.ant.taskdefs.ExecTask.execute (ExecTask.java:498)
    at org.apache.tools.ant.UnknownElement.execute (UnknownElement.java:291)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.apache.tools.ant.dispatch.DispatchUtils.execute (DispatchUtils.java:106)
    at org.apache.tools.ant.Task.perform (Task.java:348)
    at org.apache.tools.ant.Target.execute (Target.java:390)
    at org.apache.tools.ant.Target.performTasks (Target.java:411)
    at org.apache.tools.ant.Project.executeSortedTargets (Project.java:1399)
    at org.apache.tools.ant.Project.executeTarget (Project.java:1368)
    at org.apache.maven.plugin.antrun.AntRunMojo.execute (AntRunMojo.java:327)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:954)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR] 
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:

Installing protobuf

Download

  1. Download googletest — the protobuf build depends on it. Download: https://github.com/google/googletest/tree/release-1.7.0 . Stick with version 1.7.0; the directory layout changed in later releases.

  2. Download protobuf. The official docs say it can be installed with the command below, but libprotobuf-dev wasn't available for me, so I had to install it manually:

$ sudo apt-get -y install libprotobuf-dev protobuf-compiler

protobuf download: https://github.com/google/protobuf/tree/v2.5.0 — use version 2.5.0 to match what the official docs require. After extracting, copy the googletest sources downloaded earlier into the protobuf root directory and rename the copy to gtest.
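
The layout step can be sketched as follows. The directory names are the GitHub release defaults and are assumptions — adjust them if your archives extracted differently; `mkdir` stands in for the actual extraction so the sketch runs anywhere:

```shell
# protobuf 2.5.0's build scripts expect its test dependency at ./gtest
# inside the protobuf source root.
mkdir -p protobuf-2.5.0 googletest-release-1.7.0   # stand-ins for the extracted trees
mv googletest-release-1.7.0 protobuf-2.5.0/gtest
ls -d protobuf-2.5.0/gtest    # prints: protobuf-2.5.0/gtest
```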

Build

  1. Run autogen.sh to generate configure
  2. Run ./configure
  3. make
  4. sudo make install

Verify that protobuf installed successfully:
benny@benny-ubuntu:~/software/protobuf-2.5.0$ protoc --version
libprotoc 2.5.0

As you can see, the installation succeeded.
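
Hadoop 2.7.x requires exactly protoc 2.5.0 and aborts the build on any other version, so a fail-fast check in a build script can save a long wait. A minimal sketch (the helper is my own, not part of Hadoop):

```shell
# Guard: accept only the protoc version Hadoop 2.7.x hard-requires.
check_protoc_version() {
  # $1 is the output of `protoc --version`, e.g. "libprotoc 2.5.0"
  [ "$(printf '%s' "$1" | awk '{print $2}')" = "2.5.0" ]
}
check_protoc_version "libprotoc 2.5.0" && echo "protoc OK"   # prints: protoc OK
```

In a real script you would pass `"$(protoc --version)"` as the argument.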

Installing Snappy

  1. Download: https://github.com/google/snappy
  2. Build and install the same way as protobuf: ./configure, make, sudo make install. After installing, verify:
-rw-r--r-- 1 root root 616634 Jul 19 17:06 /usr/local/lib/libsnappy.a
-rwxr-xr-x 1 root root    945 Jul 19 17:06 /usr/local/lib/libsnappy.la*
lrwxrwxrwx 1 root root     18 Jul 19 17:06 /usr/local/lib/libsnappy.so -> libsnappy.so.1.3.0*
lrwxrwxrwx 1 root root     18 Jul 19 17:06 /usr/local/lib/libsnappy.so.1 -> libsnappy.so.1.3.0*
-rwxr-xr-x 1 root root 304936 Jul 19 17:06 /usr/local/lib/libsnappy.so.1.3.0*

Compiling Hadoop

Before building, refresh the dynamic linker cache:

sudo ldconfig

With that, the preparation is done. Now build with Maven.

From the Hadoop source root, run the following Maven command:

mvn clean package -DskipTests -Pdist,native -Dtar -Dsnappy.lib -Drequire.snappy -Dbundle.snappy -Drequire.openssl -Dbundle.openssl -Dopenssl.lib
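
For reference, the same command with the intent of each group of flags spelled out. The grouping is illustrative, and the library paths below are my assumptions for a default `/usr/local` install (the original command passes `-Dsnappy.lib` and `-Dopenssl.lib` without values):

```shell
# -Pdist,native : build the binary distribution plus the native (JNI) libraries
# require.*     : fail the build if the library is missing instead of silently skipping it
# bundle.*      : copy the shared libraries into lib/native of the distribution
SNAPPY_OPTS="-Drequire.snappy -Dbundle.snappy -Dsnappy.lib=/usr/local/lib"
OPENSSL_OPTS="-Drequire.openssl -Dbundle.openssl -Dopenssl.lib=/usr/lib"
CMD="mvn clean package -DskipTests -Pdist,native -Dtar $SNAPPY_OPTS $OPENSSL_OPTS"
echo "$CMD"
```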

When the build finishes, the final artifacts are in ./hadoop-dist/target/:

benny@benny-ubuntu:~/software/hadoop-2.7.6-src$ ll ./hadoop-dist/target/
total 585192
drwxr-xr-x 10 benny benny      4096 Jul 20 14:57 ./
drwx------  4 benny benny      4096 Jul 20 14:57 ../
drwxr-xr-x  2 benny benny      4096 Jul 20 14:57 antrun/
drwxr-xr-x  3 benny benny      4096 Jul 20 14:57 classes/
-rw-r--r--  1 benny benny      1877 Jul 20 14:57 dist-layout-stitching.sh
-rw-r--r--  1 benny benny       650 Jul 20 14:57 dist-tar-stitching.sh
drwxr-xr-x  9 benny benny      4096 Jul 20 17:25 hadoop-2.7.6/
-rw-r--r--  1 benny benny 199256146 Jul 20 14:57 hadoop-2.7.6.tar.gz
-rw-r--r--  1 benny benny     26522 Jul 20 14:57 hadoop-dist-2.7.6.jar
-rw-r--r--  1 benny benny 399837851 Jul 20 14:57 hadoop-dist-2.7.6-javadoc.jar
-rw-r--r--  1 benny benny     24051 Jul 20 14:57 hadoop-dist-2.7.6-sources.jar
-rw-r--r--  1 benny benny     24051 Jul 20 14:57 hadoop-dist-2.7.6-test-sources.jar
drwxr-xr-x  2 benny benny      4096 Jul 20 14:57 javadoc-bundle-options/
drwxr-xr-x  2 benny benny      4096 Jul 20 14:57 maven-archiver/
drwxr-xr-x  3 benny benny      4096 Jul 20 14:57 maven-shared-archive-resources/
-rw-r--r--  1 benny benny        30 Jul 20 14:57 .plxarc
drwxr-xr-x  3 benny benny      4096 Jul 20 14:57 test-classes/
drwxr-xr-x  2 benny benny      4096 Jul 20 14:57 test-dir/
benny@benny-ubuntu:~/software/hadoop-2.7.6-src$ 

The Snappy-enabled native shared libraries are in ./hadoop-dist/target/hadoop-2.7.6/lib/native/:

benny@benny-ubuntu:~/software/hadoop-2.7.6-src$ ll ./hadoop-dist/target/hadoop-2.7.6/lib/native/*
-rw-r--r-- 1 benny benny 1521778 Jul 20 14:57 ./hadoop-dist/target/hadoop-2.7.6/lib/native/libhadoop.a
-rw-r--r-- 1 benny benny 1752088 Jul 20 14:57 ./hadoop-dist/target/hadoop-2.7.6/lib/native/libhadooppipes.a
lrwxrwxrwx 1 benny benny      18 Jul 20 14:57 ./hadoop-dist/target/hadoop-2.7.6/lib/native/libhadoop.so -> libhadoop.so.1.0.0*
-rwxr-xr-x 1 benny benny  866704 Jul 20 14:57 ./hadoop-dist/target/hadoop-2.7.6/lib/native/libhadoop.so.1.0.0*
-rw-r--r-- 1 benny benny  751080 Jul 20 14:57 ./hadoop-dist/target/hadoop-2.7.6/lib/native/libhadooputils.a
-rw-r--r-- 1 benny benny  464116 Jul 20 14:57 ./hadoop-dist/target/hadoop-2.7.6/lib/native/libhdfs.a
lrwxrwxrwx 1 benny benny      16 Jul 20 14:57 ./hadoop-dist/target/hadoop-2.7.6/lib/native/libhdfs.so -> libhdfs.so.0.0.0*
-rwxr-xr-x 1 benny benny  286408 Jul 20 14:57 ./hadoop-dist/target/hadoop-2.7.6/lib/native/libhdfs.so.0.0.0*
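
Once these libraries are deployed on a cluster (for example, copied into $HADOOP_HOME/lib/native), `hadoop checknative -a` reports whether each native codec actually loads. A small parser sketch for its snappy line — the helper name is mine, and the sample string mirrors the format checknative prints:

```shell
# Return success iff a checknative output line reports snappy as loaded.
snappy_ok() {
  echo "$1" | grep -Eq '^snappy:[[:space:]]+true'
}
snappy_ok "snappy:  true /usr/local/lib/libsnappy.so.1" && echo "snappy native OK"   # prints: snappy native OK
```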

Reposted from: https://my.oschina.net/u/1396185/blog/1860336
