Flink Cluster Installation and Setup

Note: prerequisites are a working JDK and Hadoop environment, plus the Flink installation package.
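Before starting, it may help to verify that the prerequisites are actually in place. A minimal sanity check might look like the following (the archive path matches the one used in step 1; adjust to your environment):

[root@master /]# java -version        # JDK should be installed and on the PATH
[root@master /]# hadoop version       # Hadoop should be installed and configured
[root@master /]# ls /h3cu/flink-1.14.0-bin-scala_2.12.tgz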

1. Extract the Flink installation package

[root@master /]# tar -zxvf /h3cu/flink-1.14.0-bin-scala_2.12.tgz -C /usr/local/src/

2. Enter the installation directory and rename it

[root@master /]# cd /usr/local/src/
[root@master src]# ls
flink-1.14.0  hadoop  hbase  hive  jdk  spark  zk
[root@master src]# mv flink-1.14.0/ flink    # renaming to "flink" simplifies the environment variable setup later
[root@master src]# ls
flink  hadoop  hbase  hive  jdk  spark  zk

3. Edit and configure the environment variables

[root@master src]# vi /etc/profile


## Append the following to the end of the file
export FLINK_HOME=/usr/local/src/flink
export PATH=$PATH:$FLINK_HOME/bin
export HADOOP_CLASSPATH=`hadoop classpath`        # skipping this can cause errors later when running Flink on YARN

[root@master src]# source /etc/profile    # make the environment variables take effect
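To confirm the variables took effect, a quick check such as the one below should print the Flink home and show the flink binary on the PATH:

[root@master src]# echo $FLINK_HOME
/usr/local/src/flink
[root@master src]# which flink
/usr/local/src/flink/bin/flink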

4. Edit the configuration files under flink/conf

[root@master src]# cd flink/conf/
[root@master conf]# ls
flink-conf.yaml           log4j.properties          logback-session.xml  workers
log4j-cli.properties      log4j-session.properties  logback.xml          zoo.cfg
log4j-console.properties  logback-console.xml       masters

[root@master conf]# vi flink-conf.yaml 


jobmanager.rpc.address: master        # point the JobManager RPC address at the master node
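A couple of other flink-conf.yaml entries are commonly tuned at the same time. The values below are illustrative, not required by this setup; size them to your machines:

taskmanager.numberOfTaskSlots: 2      # processing slots per TaskManager
parallelism.default: 2                # default parallelism for jobs that do not set one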

[root@master conf]# vi workers
master
slave1
slave2
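The conf directory also contains a masters file (visible in the listing above). It is only used when starting a standalone cluster with start-cluster.sh, but if you intend to do that, it would typically list the JobManager host and web UI port (8081 is the default):

[root@master conf]# vi masters
master:8081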

5. Distribute Flink and the environment variables

[root@master conf]# scp /etc/profile slave1:/etc/
profile                                                              100% 2576     1.5MB/s   00:00    
[root@master conf]# scp /etc/profile slave2:/etc/
profile                                                              100% 2576     1.4MB/s   00:00
[root@master conf]# scp -r /usr/local/src/flink/ slave1:/usr/local/src/
[root@master conf]# scp -r /usr/local/src/flink/ slave2:/usr/local/src/
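After distributing, you can confirm the environment is live on the workers as well. This assumes passwordless SSH between the nodes, which the scp commands above already imply:

[root@master conf]# ssh slave1 'source /etc/profile && echo $FLINK_HOME'
[root@master conf]# ssh slave2 'source /etc/profile && echo $FLINK_HOME'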

6. Flink word-count test

[root@master conf]# flink run -m yarn-cluster -p 2 /usr/local/src/flink/examples/batch/WordCount.jar
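Run without arguments, the example counts words in built-in sample data and prints the result to stdout. The batch WordCount example also accepts --input and --output if you want to read from and write to HDFS; the paths below are placeholders:

[root@master conf]# flink run -m yarn-cluster -p 2 /usr/local/src/flink/examples/batch/WordCount.jar \
    --input hdfs:///tmp/words.txt --output hdfs:///tmp/wordcount-result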

7. The results are visible after running this command, but the following error appears:

Exception in thread "Thread-5" java.lang.IllegalStateException: Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
	at org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$SafetyNetWrapperClassLoader.ensureInner(FlinkUserCodeClassLoaders.java:164)
	at org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$SafetyNetWrapperClassLoader.getResource(FlinkUserCodeClassLoaders.java:183)
	at org.apache.hadoop.conf.Configuration.getResource(Configuration.java:2780)
	at org.apache.hadoop.conf.Configuration.getStreamReader(Configuration.java:3036)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2995)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2968)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2848)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
	at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1812)
	at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1789)
	at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
	at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145)
	at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65)
	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102)

8. Fixing the error

[root@master conf]# vi flink-conf.yaml

# Add the following line to the file (note: the space after the colon is required by YAML)
classloader.check-leaked-classloader: false
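Since flink-conf.yaml was only changed on master, it is worth syncing it to the workers so every node shares the same configuration, then re-running the test job; the exception should no longer appear:

[root@master conf]# scp flink-conf.yaml slave1:/usr/local/src/flink/conf/
[root@master conf]# scp flink-conf.yaml slave2:/usr/local/src/flink/conf/
[root@master conf]# flink run -m yarn-cluster -p 2 /usr/local/src/flink/examples/batch/WordCount.jar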
