I've been learning Kylin recently, but when I ran the first example it failed with an error saying Snappy support was missing. I had never set up Snappy before, and after a long round of Googling and Baidu searches I still couldn't get it configured; in the end I only got Snappy support working by compiling Hadoop from source. Below is the general workflow. I tried many different approaches along the way, so I can only give the broad strokes.
Download the source code
http://archive.cloudera.com/cdh5/cdh/5/hadoop-2.6.0-cdh5.12.1-src.tar.gz
Prepare the environment
jdk1.7
protobuf 2.5.0
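Before kicking off a long build, it helps to confirm the toolchain is actually on PATH (besides JDK 1.7 and protobuf 2.5.0, the native profile also needs cmake and a C compiler). A minimal pre-flight sketch; the exact tool list is my assumption:

```shell
#!/bin/sh
# Pre-flight check: report which build tools are on PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: MISSING"
  fi
}
for t in java mvn protoc cmake gcc; do
  check_tool "$t"
done
```

Note that `protoc --version` must report exactly 2.5.0; Hadoop's build verifies the protoc version and fails on a mismatch.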
Compile the source
mvn clean package -DskipTests -Pdist,native -Dtar -Dsnappy.lib=/usr/local/lib -Dbundle.snappy
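Here `-Dsnappy.lib` points the native build at a locally installed libsnappy (under /usr/local/lib in my case) and `-Dbundle.snappy` copies it into the dist tarball. A small wrapper script for the same invocation; the memory settings are my own assumptions, and the guard simply skips the build if the source tree or mvn is missing:

```shell
#!/bin/sh
# JDK 7 runs Maven with PermGen, hence MaxPermSize.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512m"
SRC=hadoop-2.6.0-cdh5.12.1

if [ -d "$SRC" ] && command -v mvn >/dev/null 2>&1; then
  (cd "$SRC" && mvn clean package -DskipTests -Pdist,native -Dtar \
      -Dsnappy.lib=/usr/local/lib -Dbundle.snappy)
else
  echo "skipped: $SRC or mvn not found"
fi
```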
If you hit problems during the build, most of them can be solved with a quick search; only the following exception took me a long time to track down:
No versions available for com.amazonaws:DynamoDBLocal:jar:[1.11.86,2.0) with
The fixes I found by searching did not work for me.
If you've gotten as far as this error about the missing Amazon package, you're almost done.
PS: here is how I solved it.
Download DynamoDBLocal.jar from:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html
Edit hadoop-2.6.0-cdh5.12.1/hadoop-project/pom.xml and comment out the original dependency:
<!--<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>DynamoDBLocal</artifactId>
  <version>[1.11.86,2.0)</version>
</dependency>-->
Replace it with:
<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>DynamoDBLocal</artifactId>
  <version>1.11.86</version>
  <scope>system</scope>
  <systemPath>/home/levin/download/DynamoDBLocal.jar</systemPath>
</dependency>
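The system scope with systemPath works, but a cleaner alternative (my own suggestion, not something from the CDH build docs) is to install the downloaded jar into the local Maven repository, so a plain `<version>1.11.86</version>` dependency resolves normally. A guarded sketch, using the download path from this post:

```shell
#!/bin/sh
# Install the hand-downloaded jar into ~/.m2 so Maven can resolve it.
JAR=/home/levin/download/DynamoDBLocal.jar
CMD="mvn install:install-file -Dfile=$JAR -DgroupId=com.amazonaws \
-DartifactId=DynamoDBLocal -Dversion=1.11.86 -Dpackaging=jar"
echo "$CMD"
# Run only when mvn and the jar are actually present:
if command -v mvn >/dev/null 2>&1 && [ -f "$JAR" ]; then
  $CMD
else
  echo "skipped: mvn or $JAR not available"
fi
```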
Also edit hadoop-2.6.0-cdh5.12.1/hadoop-tools/hadoop-aws/pom.xml:
<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>DynamoDBLocal</artifactId>
  <!--<scope>test</scope>--> <!-- comment out this scope -->
  <exclusions>
    <exclusion>
      <groupId>org.hamcrest</groupId>
      <artifactId>hamcrest-core</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.eclipse.jetty</groupId>
      <artifactId>jetty-http</artifactId>
    </exclusion>
  </exclusions>
</dependency>
After making these changes, build again:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /home/levin/download/hadoop-2.6.0-cdh5.12.1/hadoop-dist/target/hadoop-dist-2.6.0-cdh5.12.1-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 2.775 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 1.545 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 2.610 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 2.424 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.381 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 1.657 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 2.740 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 3.534 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 3.630 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 2.029 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:05 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 3.236 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 7.973 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.083 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [01:36 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 15.541 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 3.737 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 2.789 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.115 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.119 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [ 55.179 s]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 14.725 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.244 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 6.425 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 16.452 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 1.987 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 3.750 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 10.428 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 0.940 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 3.255 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.067 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 1.773 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 1.549 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.069 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [ 3.271 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 4.148 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.169 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 11.265 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 9.555 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 2.307 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 5.493 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 3.430 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 3.511 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 2.009 s]
[INFO] hadoop-mapreduce-client-nativetask ................. SUCCESS [01:03 min]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 3.900 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 3.799 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 2.431 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 6.144 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 1.320 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [ 1.513 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 3.193 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 2.525 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 1.636 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 1.476 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 1.712 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 6.417 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 3.381 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 8.754 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 3.018 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 3.519 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 1.014 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 3.565 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [ 2.515 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 10.708 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.034 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 29.076 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 09:08 min
[INFO] Finished at: 2018-04-16T14:53:33+08:00
[INFO] Final Memory: 264M/804M
[INFO] ------------------------------------------------------------------------
The build succeeded.
Now run the following commands:
cp hadoop-2.6.0-cdh5.12.1/hadoop-dist/target/hadoop-2.6.0-cdh5.12.1/lib/native/* $HADOOP_HOME/lib/
hadoop checknative -a
Snappy is now supported:
18/04/16 14:56:37 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
18/04/16 14:56:37 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /home/levin/install-soft/bigdata/hadoop-2.6.0-cdh5.12.1/lib/libhadoop.so
zlib: true /lib/x86_64-linux-gnu/libz.so.1
snappy: true /usr/lib/x86_64-linux-gnu/libsnappy.so.1
lz4: true revision:10301
bzip2: false
openssl: true /usr/lib/x86_64-linux-gnu/libcrypto.so
18/04/16 14:56:37 INFO util.ExitUtil: Exiting with status 1
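To check this non-interactively (say, in a setup script), you can grep the checknative report for the snappy line. A sketch using lines from the output above as embedded sample data; in practice you would pipe `hadoop checknative -a 2>&1` instead:

```shell
#!/bin/sh
# Sample report lines copied from the run above.
report='hadoop:  true /home/levin/install-soft/bigdata/hadoop-2.6.0-cdh5.12.1/lib/libhadoop.so
snappy:  true /usr/lib/x86_64-linux-gnu/libsnappy.so.1
bzip2:   false'

snappy_line=$(printf '%s\n' "$report" | grep '^snappy:')
case "$snappy_line" in
  *true*) echo "snappy: OK" ;;
  *)      echo "snappy: NOT available" ;;
esac
```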
Finally, the files under lib/native from my successful build can be downloaded below and copied over directly.