Deploying Flink 1.15.0 on a CDH 6.3.2 Cluster

1. Preparation

1.1 Environment

Deployment environment used in this document: CDH 6.3.2 cluster, JDK 1.8.

1.2 Flink Installation Package
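The release archive can be fetched from the Apache archive; a sketch, assuming the /opt/shealphy install path used later in this document and the Scala 2.12 build (matching the scala-2.12.8 used in the environment variables below):

```shell
# Download the Flink 1.15.0 binary release (Scala 2.12 build)
wget https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz
# Unpack it under /opt/shealphy, yielding /opt/shealphy/flink-1.15.0
tar -zxvf flink-1.15.0-bin-scala_2.12.tgz -C /opt/shealphy/
```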

2. Unpacking the Archive and Editing the Configuration

2.1 Environment Variables

[root@cdh02 ~]# vim /etc/profile
#jdk1.8
export JAVA_HOME=/usr/local/java/jdk1.8.0_161
export PATH=$PATH:$JAVA_HOME/bin:$JAVA_HOME/jre/bin
export CLASSPATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
#scala
export SCALA_HOME=/usr/local/scala/scala-2.12.8
#Hadoop
export HADOOP_HOME=/opt/cloudera/parcels/CDH/lib/hadoop
export HADOOP_CONF_DIR=/usr/data/conf/hadoop
export HADOOP_CLASSPATH=$HADOOP_HOME/lib/*.jar
#Flink
export FLINK_HOME=/opt/shealphy/flink-1.15.0
#ES
export ES_HOME=/usr/local/elasticsearch-6.8.8

export PATH=$PATH:$SCALA_HOME/bin:$ES_HOME/bin:$FLINK_HOME/bin:$HADOOP_HOME/bin
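The edits to /etc/profile only take effect in new shells or after reloading; a quick sanity check, assuming the paths above exist on your host:

```shell
# Reload the profile in the current shell
source /etc/profile
# Confirm the variables resolved and the flink binary is on PATH
echo $FLINK_HOME
which flink
```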

2.2 Flink Configuration

  • This deployment uses Flink on YARN mode, so there is no need to configure masters and workers; YARN handles the scheduling itself.

    • Copy flink-shaded-hadoop-3-uber-3.1.1.7.2.9.0-173-9.0.jar into Flink's lib directory (a local copy, so cp suffices):

      [root@cdh02 shealphy]# cp ./flink-shaded-hadoop-3-uber-3.1.1.7.2.9.0-173-9.0.jar ./flink-1.15.0/lib/

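As an alternative to bundling the shaded Hadoop uber jar, the Flink documentation recommends exposing the cluster's Hadoop dependencies via HADOOP_CLASSPATH; a sketch (use one approach or the other, not both, to avoid duplicate classes on the classpath):

```shell
# Let Flink pick up Hadoop from the cluster installation
export HADOOP_CLASSPATH=`hadoop classpath`
```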
3. Starting Flink

3.1 Start a yarn-session

[root@cdh02 ~]# cd /opt/shealphy/flink-1.15.0/bin
[root@cdh02 bin]# ./yarn-session.sh -jm 1024 -tm 1024 -s 2 -d

(-jm: JobManager memory in MB, -tm: TaskManager memory in MB, -s: slots per TaskManager, -d: run detached)

  • Startup error

    1. ERROR org.apache.flink.yarn.cli.FlinkYarnSessionCli

      ...
      2022-05-09 18:03:22,001 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: yarn.application-attempts, 10
      2022-05-09 18:03:22,001 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: fs.overwrite-files, true
      2022-05-09 18:03:22,002 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: fs.output.always-create-directory, true
      2022-05-09 18:03:22,004 ERROR org.apache.flink.yarn.cli.FlinkYarnSessionCli                [] - Error while running the Flink session.
      java.lang.NoSuchMethodError: org.apache.commons.cli.Option.builder(Ljava/lang/String;)Lorg/apache/commons/cli/Option$Builder;
              at org.apache.flink.yarn.cli.FlinkYarnSessionCli.<init>(FlinkYarnSessionCli.java:230) ~[flink-dist-1.15.0.jar:1.15.0]
              at org.apache.flink.yarn.cli.FlinkYarnSessionCli.<init>(FlinkYarnSessionCli.java:156) ~[flink-dist-1.15.0.jar:1.15.0]
              at org.apache.flink.yarn.cli.FlinkYarnSessionCli.main(FlinkYarnSessionCli.java:851) [flink-dist-1.15.0.jar:1.15.0]
      
      ------------------------------------------------------------
       The program finished with the following exception:
      
      java.lang.NoSuchMethodError: org.apache.commons.cli.Option.builder(Ljava/lang/String;)Lorg/apache/commons/cli/Option$Builder;
              at org.apache.flink.yarn.cli.FlinkYarnSessionCli.<init>(FlinkYarnSessionCli.java:230)
              at org.apache.flink.yarn.cli.FlinkYarnSessionCli.<init>(FlinkYarnSessionCli.java:156)
              at org.apache.flink.yarn.cli.FlinkYarnSessionCli.main(FlinkYarnSessionCli.java:851)
      
      • Error analysis: an org.apache.commons.cli jar is missing, or a conflicting older version is on the classpath (Option.builder(String) was only added in commons-cli 1.3)

        • Fix 1: copy the matching jar from the cluster environment

          # Copied commons-cli-1.2.jar from the CDH 6.3.2 Hadoop lib directory into Flink's lib directory
          # Restarting the yarn-session still failed (commons-cli 1.2 predates Option.builder)
          
        • Fix 2: download the newer commons-cli-1.4.jar from the Maven repository

          # Copied commons-cli-1.4.jar into Flink's lib directory
          # Restarted the yarn-session; it started successfully
          
          2022-05-09 18:44:28,961 WARN  org.apache.hadoop.util.NativeCodeLoader                      [] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
          2022-05-09 18:44:31,990 INFO  org.apache.flink.yarn.YarnClusterDescriptor                  [] - Removing 'localhost' Key: 'jobmanager.bind-host' , default: null (fallback keys: []) setting from effective configuration; using '0.0.0.0' instead.
          2022-05-09 18:44:31,991 INFO  org.apache.flink.yarn.YarnClusterDescriptor                  [] - Removing 'localhost' Key: 'taskmanager.bind-host' , default: null (fallback keys: []) setting from effective configuration; using '0.0.0.0' instead.
          2022-05-09 18:44:32,032 INFO  org.apache.flink.runtime.util.config.memory.ProcessMemoryUtils [] - The derived from fraction jvm overhead memory (160.000mb (167772162 bytes)) is less than its min value 192.000mb (201326592 bytes), min value will be used instead
          2022-05-09 18:44:32,042 INFO  org.apache.flink.yarn.YarnClusterDescriptor                  [] - Submitting application master application_1651710719941_0001
          2022-05-09 18:44:32,430 INFO  org.apache.hadoop.yarn.client.api.impl.YarnClientImpl        [] - Submitted application application_1651710719941_0001
          2022-05-09 18:44:32,430 INFO  org.apache.flink.yarn.YarnClusterDescriptor                  [] - Waiting for the cluster to be allocated
          2022-05-09 18:44:32,432 INFO  org.apache.flink.yarn.YarnClusterDescriptor                  [] - Deploying cluster, current state ACCEPTED
          2022-05-09 18:44:38,569 INFO  org.apache.flink.yarn.YarnClusterDescriptor                  [] - YARN application has been deployed successfully.
          2022-05-09 18:44:38,570 INFO  org.apache.flink.yarn.YarnClusterDescriptor                  [] - Found Web Interface cdh02:39198 of application 'application_1651710719941_0001'.
          JobManager Web Interface: http://cdh02:39198
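The session now runs as a long-lived YARN application; a sketch of how to inspect or shut it down later, using the application ID printed in the log above:

```shell
# List running YARN applications to find the session's ID
yarn application -list

# Re-attach to the detached session; typing "stop" at its prompt shuts it down
./bin/yarn-session.sh -id application_1651710719941_0001

# Or kill it directly through YARN
yarn application -kill application_1651710719941_0001
```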
          

3.2 Submitting a Flink Job

  • There are two modes: session and per-job

    Session mode is used here

3.2.1 Running the demo in session mode
  • Must be run on the NameNode that is currently in the active state

    [root@cdh02 flink-1.15.0]# cd /opt/shealphy/flink-1.15.0
    [root@cdh02 flink-1.15.0]# ./bin/flink run examples/batch/WordCount.jar
    Setting HBASE_CONF_DIR=/etc/hbase/conf because no HBASE_CONF_DIR was set.
    2022-05-09 18:59:10,026 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli                [] - Found Yarn properties file under /tmp/.yarn-properties-root.
    2022-05-09 18:59:10,026 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli                [] - Found Yarn properties file under /tmp/.yarn-properties-root.
    Executing WordCount example with default input data set.
    Use --input to specify file input.
    Printing result to stdout. Use --output to specify output path.
    2022-05-09 18:59:10,393 WARN  org.apache.flink.yarn.configuration.YarnLogConfigUtil        [] - The configuration directory ('/opt/shealphy/flink-1.15.0/conf') already contains a LOG4J config file.If you want to use logback, then please delete or rename the log configuration file.
    2022-05-09 18:59:10,436 INFO  org.apache.hadoop.yarn.client.RMProxy                        [] - Connecting to ResourceManager at cdh02/172.16.0.46:8032
    2022-05-09 18:59:10,534 INFO  org.apache.flink.yarn.YarnClusterDescriptor                  [] - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
    2022-05-09 18:59:10,595 INFO  org.apache.flink.yarn.YarnClusterDescriptor                  [] - Found Web Interface cdh02:39198 of application 'application_1651710719941_0001'.
    Job has been submitted with JobID 50113f5b0efa88b65f68ad713a7a84fb
    Program execution finished
    Job with JobID 50113f5b0efa88b65f68ad713a7a84fb has finished.
    Job Runtime: 10265 ms
    Accumulator Results:
    - 0d9904e571e7cf2f06480ef5dd4b3c99 (java.util.ArrayList) [170 elements]
    (a,5)
    (action,1)
    (after,1)
    (against,1)
    (all,2)
    (and,12)
    (arms,1)
    (arrows,1)
    (awry,1)
    ...
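As the demo output notes, the built-in sample text can be swapped for real files via --input and --output; a sketch with hypothetical HDFS paths:

```shell
# Run WordCount against an HDFS input file, writing results back to HDFS
# (both paths below are placeholders; substitute your own)
./bin/flink run examples/batch/WordCount.jar \
    --input hdfs:///tmp/wordcount/input.txt \
    --output hdfs:///tmp/wordcount/result
```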
    
3.2.2 Running the demo in per-job mode
  • Launch command

    [root@cdh02 flink-1.15.0]# ./bin/flink run -m yarn-cluster ./examples/batch/WordCount.jar
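-m yarn-cluster still works in 1.15 but is the legacy syntax; the same deployment can be selected with the newer -t target option, and 1.15 additionally supports application mode, where the job's main() method runs on the cluster rather than on the client:

```shell
# Per-job mode via the -t deployment-target syntax
./bin/flink run -t yarn-per-job ./examples/batch/WordCount.jar

# Application mode: main() executes inside the JobManager on YARN
./bin/flink run-application -t yarn-application ./examples/batch/WordCount.jar
```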
    