Section 1: Compiling Hudi 0.9 for a CDH 6.3.2 Environment

This article walks through setting up Maven and Git on Linux, downloading and compiling the Hudi source code, switching the Maven configuration to the Aliyun mirror, and finally inspecting the directory layout of the build output. The build emitted some warnings, but the Hudi project was ultimately built successfully.

1. Prepare the build environment

1) Install Maven

Download page: Index of /dist/maven/maven-3/3.6.1/binaries (apache.org)

(1) Upload apache-maven-3.6.1-bin.tar.gz to the /data/software directory on the Linux host
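If the server has internet access, you can also fetch the tarball directly instead of uploading it (a sketch; the archive.apache.org path is the standard Apache archive location):

cd /data/software
wget https://archive.apache.org/dist/maven/maven-3/3.6.1/binaries/apache-maven-3.6.1-bin.tar.gz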

(2) Extract apache-maven-3.6.1-bin.tar.gz into the /data/module/ directory

tar -zxvf apache-maven-3.6.1-bin.tar.gz -C /data/module/

(3) Rename apache-maven-3.6.1 to maven
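For example:

mv /data/module/apache-maven-3.6.1 /data/module/maven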

(4) Add the environment variables to /etc/profile

[stars@stars-bigdata-01 module]# vim /etc/profile
#MAVEN_HOME
export MAVEN_HOME=/data/module/maven
export PATH=$PATH:$MAVEN_HOME/bin

(5) Verify the installation

source /etc/profile  

mvn -v 
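On success, mvn -v reports the Maven home you just configured, along these lines (the JDK details will differ per host):

Apache Maven 3.6.1
Maven home: /data/module/maven
Java version: 1.8.0_xxx, vendor: Oracle Corporation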

(6) Edit settings.xml to use the Aliyun mirror

vim maven/conf/settings.xml
<!-- Add the Aliyun mirror; it belongs inside the <mirrors> element -->
<mirror>
        <id>nexus-aliyun</id>
        <mirrorOf>central</mirrorOf>
        <name>Nexus aliyun</name>
        <url>http://maven.aliyun.com/nexus/content/groups/public</url>
</mirror>
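To confirm the mirror is actually picked up, you can print Maven's effective settings (a quick sanity check; it assumes the maven-help-plugin itself can still be downloaded):

mvn help:effective-settings | grep -A 3 nexus-aliyun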

2) Install Git

yum install git
git --version
git version 1.8.3.1

2. Download the source code

 git clone --branch release-0.9.0 https://gitee.com/apache/Hudi.git
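The Gitee repository is a mirror of Apache Hudi and is usually much faster to clone from within mainland China. If GitHub is reachable, the upstream repository carries the same release-0.9.0 branch:

git clone --branch release-0.9.0 https://github.com/apache/hudi.git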

Modify pom.xml, adding the Aliyun repository to its <repositories> section (the settings.xml mirror only rewrites requests aimed at central, so this covers lookups against the pom's own repository list):

[stars@stars-bigdata-01 module]# cd Hudi/
[stars@stars-bigdata-01 Hudi]# vim pom.xml
 <repository>
        <id>nexus-aliyun</id>
        <name>nexus-aliyun</name>
        <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
        <releases>
            <enabled>true</enabled>
        </releases>
        <snapshots>
            <enabled>false</enabled>
        </snapshots>
    </repository>

3. Compile

For Apache Spark 3.0.0, use:

[xxx@xxx Hudi]# mvn clean package -DskipTests -Dspark3 -Dscala-2.12

Since the Spark that ships with CDH is 2.4.0 on Scala 2.11, we keep the default Spark 2.4.4 / Scala 2.11 versions from the Hudi pom and run:

mvn clean package -DskipTests

[INFO] Reactor Summary for Hudi 0.9.0:
[INFO] 
[INFO] Hudi ............................................... SUCCESS [ 59.061 s]
[INFO] hudi-common ........................................ SUCCESS [01:13 min]
[INFO] hudi-timeline-service .............................. SUCCESS [  6.151 s]
[INFO] hudi-client ........................................ SUCCESS [  0.097 s]
[INFO] hudi-client-common ................................. SUCCESS [ 29.714 s]
[INFO] hudi-hadoop-mr ..................................... SUCCESS [ 38.765 s]
[INFO] hudi-spark-client .................................. SUCCESS [ 44.479 s]
[INFO] hudi-sync-common ................................... SUCCESS [  0.956 s]
[INFO] hudi-hive-sync ..................................... SUCCESS [  7.672 s]
[INFO] hudi-spark-datasource .............................. SUCCESS [  0.046 s]
[INFO] hudi-spark-common_2.11 ............................. SUCCESS [ 12.661 s]
[INFO] hudi-spark2_2.11 ................................... SUCCESS [ 11.888 s]
[INFO] hudi-spark_2.11 .................................... SUCCESS [ 39.437 s]
[INFO] hudi-utilities_2.11 ................................ SUCCESS [ 52.633 s]
[INFO] hudi-utilities-bundle_2.11 ......................... SUCCESS [01:34 min]
[INFO] hudi-cli ........................................... SUCCESS [ 10.846 s]
[INFO] hudi-java-client ................................... SUCCESS [  1.805 s]
[INFO] hudi-flink-client .................................. SUCCESS [ 23.037 s]
[INFO] hudi-spark3_2.12 ................................... SUCCESS [ 22.693 s]
[INFO] hudi-dla-sync ...................................... SUCCESS [  3.368 s]
[INFO] hudi-sync .......................................... SUCCESS [  0.038 s]
[INFO] hudi-hadoop-mr-bundle .............................. SUCCESS [  6.471 s]
[INFO] hudi-hive-sync-bundle .............................. SUCCESS [  1.736 s]
[INFO] hudi-spark-bundle_2.11 ............................. SUCCESS [ 10.716 s]
[INFO] hudi-presto-bundle ................................. SUCCESS [ 22.618 s]
[INFO] hudi-timeline-server-bundle ........................ SUCCESS [  7.461 s]
[INFO] hudi-hadoop-docker ................................. SUCCESS [  1.406 s]
[INFO] hudi-hadoop-base-docker ............................ SUCCESS [  0.863 s]
[INFO] hudi-hadoop-namenode-docker ........................ SUCCESS [  1.103 s]
[INFO] hudi-hadoop-datanode-docker ........................ SUCCESS [  1.231 s]
[INFO] hudi-hadoop-history-docker ......................... SUCCESS [  0.891 s]
[INFO] hudi-hadoop-hive-docker ............................ SUCCESS [  4.294 s]
[INFO] hudi-hadoop-sparkbase-docker ....................... SUCCESS [  0.929 s]
[INFO] hudi-hadoop-sparkmaster-docker ..................... SUCCESS [  0.855 s]
[INFO] hudi-hadoop-sparkworker-docker ..................... SUCCESS [  0.859 s]
[INFO] hudi-hadoop-sparkadhoc-docker ...................... SUCCESS [  0.859 s]
[INFO] hudi-hadoop-presto-docker .......................... SUCCESS [  0.885 s]
[INFO] hudi-integ-test .................................... SUCCESS [ 18.117 s]
[INFO] hudi-integ-test-bundle ............................. SUCCESS [02:43 min]
[INFO] hudi-examples ...................................... SUCCESS [  5.054 s]
[INFO] hudi-flink_2.11 .................................... SUCCESS [ 12.870 s]
[INFO] hudi-flink-bundle_2.11 ............................. SUCCESS [ 31.194 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  13:49 min
[INFO] Finished at: 2021-09-16T13:59:08+08:00
[INFO] ------------------------------------------------------------------------

As the reactor summary shows, the build succeeded.
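If you need to rebuild, Maven's parallel mode can shorten the roughly 14-minute wall-clock time (a sketch: -T 2C runs two threads per CPU core, and memory-hungry modules may call for a smaller value):

mvn clean package -DskipTests -T 2C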

4. After the build finishes, the output corresponds to the packaging directory under Hudi

[xxx@xxx Hudi]# cd packaging/
[xxx@xxx packaging]# ll
total 36
drwxr-xr-x 4 root root 4096 Sep 16 13:58 hudi-flink-bundle
drwxr-xr-x 4 root root 4096 Sep 16 13:54 hudi-hadoop-mr-bundle
drwxr-xr-x 4 root root 4096 Sep 16 13:54 hudi-hive-sync-bundle
drwxr-xr-x 4 root root 4096 Sep 16 13:56 hudi-integ-test-bundle
drwxr-xr-x 4 root root 4096 Sep 16 13:54 hudi-presto-bundle
drwxr-xr-x 4 root root 4096 Sep 16 13:54 hudi-spark-bundle
drwxr-xr-x 4 root root 4096 Sep 16 13:55 hudi-timeline-server-bundle
drwxr-xr-x 4 root root 4096 Sep 16 13:51 hudi-utilities-bundle
-rw-r--r-- 1 root root 2206 Sep 16 11:42 README.md
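The jars themselves end up in each bundle module's target directory; for example, the Spark bundle (the 2.11-0.9.0 suffix assumes this default build):

ls packaging/hudi-spark-bundle/target/hudi-spark-bundle_2.11-0.9.0.jar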

Quite a few warnings show up during compilation, but they do not affect the result.

If your Hadoop environment is the Apache distribution rather than CDH, use the Spark 3 / Scala 2.12 build command shown above instead.
