Compiling Flink 1.9.1 from Source with Support for hadoop 2.6.0-cdh5.15.1

Official build guide: https://ci.apache.org/projects/flink/flink-docs-release-1.9/flinkDev/building.html#pre-bundled-versions

[hadoop@hadoop001 flink]$ mvn --version
# Maven 3.3 or later is required
Apache Maven 3.6.1 (d66c9c0b3152b2e69ee9bac180bb8fcc8e6af555; 2019-04-05T03:00:29+08:00)
Maven home: /root/apps/apache-maven-3.6.1
# JDK 1.8.0_191 or later is required
Java version: 1.8.0_221, vendor: Oracle Corporation, runtime: /root/apps/jdk1.8.0_221/jre


  • Build
mvn clean install -DskipTests
cd flink-dist
mvn clean install
  • Build with the Hadoop CDH dependency
mvn clean install -DskipTests -Pvendor-repos -Dhadoop.version=2.6.0-cdh5.15.1 -Dmaven.javadoc.skip=true -Dcheckstyle.skip=true
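A full Flink build can exhaust Maven's default heap. A common precaution (the 2 GB figure is only an example, not part of the original walkthrough) is to raise it before running the build:

```shell
# Optional: give Maven more heap for the full Flink build (size is an example)
export MAVEN_OPTS="-Xmx2g"
```

With `MAVEN_OPTS` exported, re-run the `mvn clean install ...` command above in the same shell.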

  • Download the flink-shaded source that the build depends on

Different Flink versions require different flink-shaded versions; Flink 1.9.1 uses flink-shaded 7.0.

https://mirrors.tuna.tsinghua.edu.cn/apache/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz
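Fetching and unpacking the tarball from the mirror above can be done with wget and tar; the archive extracts to a `flink-shaded-7.0` directory, which matches the paths used later in this walkthrough:

```shell
# Download the flink-shaded 7.0 source from the mirror and unpack it
wget https://mirrors.tuna.tsinghua.edu.cn/apache/flink/flink-shaded-7.0/flink-shaded-7.0-src.tgz
tar -zxf flink-shaded-7.0-src.tgz
cd flink-shaded-7.0
```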

  • Build flink-shaded
mvn clean install -DskipTests -Dhadoop.version=2.6.0-cdh5.15.1
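After the build succeeds, the shaded Hadoop jar should exist under the module's target directory (the path the next step copies from) and, assuming Maven's default local-repository layout (an assumption here, not stated in the original), under `~/.m2` as well:

```shell
# Sanity-check the build output (run from the flink-shaded-7.0 source root)
ls flink-shaded-hadoop-2/target/flink-shaded-hadoop-2-2.6.0-cdh5.15.1-7.0.jar
# Assumed default ~/.m2 local-repository location
ls ~/.m2/repository/org/apache/flink/flink-shaded-hadoop-2/2.6.0-cdh5.15.1-7.0/
```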

Testing Flink on YARN

  1. Step 1: copy flink-shaded-hadoop-2-2.6.0-cdh5.15.1-7.0.jar from
    /home/hadoop/software/flink-shaded-7.0/flink-shaded-hadoop-2/target/ into the lib directory of the freshly built Flink, /home/hadoop/software/flink-1.9.1/build-target/lib
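The copy in step 1 as a single command, using the paths from this walkthrough:

```shell
# Put the CDH-shaded Hadoop jar on Flink's classpath
cp /home/hadoop/software/flink-shaded-7.0/flink-shaded-hadoop-2/target/flink-shaded-hadoop-2-2.6.0-cdh5.15.1-7.0.jar \
   /home/hadoop/software/flink-1.9.1/build-target/lib/
```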
[hadoop@hadoop001 lib]$ ll
total 172700
-rw-r--r-- 1 hadoop hadoop 105319883 Nov 18 15:24 flink-dist_2.11-1.9.1.jar
-rw-r--r-- 1 hadoop hadoop  30101800 Nov 18 15:58 flink-shaded-hadoop-2-2.6.0-cdh5.15.1-7.0.jar
-rw-r--r-- 1 hadoop hadoop  18738069 Nov 18 15:24 flink-table_2.11-1.9.1.jar
-rw-r--r-- 1 hadoop hadoop  22175659 Nov 18 15:24 flink-table-blink_2.11-1.9.1.jar
-rw-r--r-- 1 hadoop hadoop    489884 Oct  8 20:23 log4j-1.2.17.jar
-rw-r--r-- 1 hadoop hadoop      9931 Nov 18 09:46 slf4j-log4j12-1.7.15.jar
[hadoop@hadoop001 lib]$ pwd
/home/hadoop/software/flink-1.9.1/build-target/lib

  2. Step 2: point Flink at the Hadoop dependencies.
    HADOOP_CLASSPATH can be configured as a system environment variable, or exported in the shell just before submitting to YARN:
[hadoop@hadoop001 lib]$ export HADOOP_CLASSPATH=`hadoop classpath`
[hadoop@hadoop001 flink]$ bin/flink run -m yarn-cluster -yn 3 -s 4 examples/batch/WordCount.jar

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/software/flink-1.9.1/flink-dist/target/flink-1.9.1-bin/flink-1.9.1/lib/slf4j-log4j12-1.7.15.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/root/apps/hadoop-2.6.0-cdh5.15.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2019-11-18 17:13:46,093 INFO  org.apache.hadoop.yarn.client.RMProxy                         - Connecting to ResourceManager at hadoop001/192.168.0.3:8032
2019-11-18 17:13:46,183 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli                 - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
2019-11-18 17:13:46,183 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli                 - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
2019-11-18 17:13:46,186 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli                 - The argument yn is deprecated in will be ignored.
2019-11-18 17:13:46,186 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli                 - The argument yn is deprecated in will be ignored.
2019-11-18 17:13:46,338 INFO  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Cluster specification: ClusterSpecification{masterMemoryMB=1024, taskManagerMemoryMB=1024, numberTaskManagers=3, slotsPerTaskManager=1}
2019-11-18 17:13:46,718 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - The configuration directory ('/home/hadoop/software/flink-1.9.1/flink-dist/target/flink-1.9.1-bin/flink-1.9.1/conf') contains both LOG4J and Logback configuration files. Please delete or rename one of them.
2019-11-18 17:13:49,118 INFO  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Submitting application master application_1567577007137_0069
2019-11-18 17:13:49,133 INFO  org.apache.hadoop.yarn.client.api.impl.YarnClientImpl         - Submitted application application_1567577007137_0069
2019-11-18 17:13:49,133 INFO  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Waiting for the cluster to be allocated
2019-11-18 17:13:49,134 INFO  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Deploying cluster, current state ACCEPTED
2019-11-18 17:13:53,489 INFO  org.apache.flink.yarn.AbstractYarnClusterDescriptor           - YARN application has been deployed successfully.
Starting execution of program
Executing WordCount example with default input data set.
Use --input to specify file input.
Printing result to stdout. Use --output to specify output path.
(end,2)
(enterprises,1)
(whips,1)
(who,2)
(whose,1)
(will,1)
(wish,1)
(with,3)
(would,2)
(wrong,1)
(you,1)
Program execution finished
Job with JobID f23b0cd07e3144c9090590b7b79a2591 has finished.
Job Runtime: 9711 ms
Accumulator Results: 
- 9aae0b5e4aa683332452de77a9400400 (java.util.ArrayList) [170 elements]
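One observation on the submission above: the cluster specification in the log reports slotsPerTaskManager=1, which suggests `-s 4` did not take effect — in `flink run`, `-s` is the savepoint option, while per-TaskManager slots on YARN are set with the `-ys` flag (and `-yn` is deprecated in 1.9, as the log itself notes). An equivalent submission using the YARN-prefixed flag would be:

```shell
# -ys sets slots per TaskManager on YARN; -yn is deprecated in Flink 1.9
bin/flink run -m yarn-cluster -ys 4 examples/batch/WordCount.jar
```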


