Setting up a standalone flink-1.9.1 environment


Flink base environment

The Flink version installed here is flink-1.9.1; the binary release can be downloaded directly from the following link:

https://www.apache.org/dyn/closer.lua/flink/flink-1.9.1/flink-1.9.1-bin-scala_2.11.tgz

Prerequisites

1. Install the JDK
[root@CentOSA ~]# rpm -ivh jdk-8u191-linux-x64.rpm
warning: jdk-8u191-linux-x64.rpm: Header V3 RSA/SHA256 Signature, key ID ec551f03: NOKEY
Preparing...                ########################################### [100%]
   1:jdk1.8                 ########################################### [100%]
Unpacking JAR files...
        tools.jar...
        plugin.jar...
        javaws.jar...
        deploy.jar...
        rt.jar...
        jsse.jar...
        charsets.jar...
        localedata.jar...
[root@CentOSA ~]# java -version
java version "1.8.0_191"
Java(TM) SE Runtime Environment (build 1.8.0_191-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.191-b12, mixed mode)
[root@CentOSA ~]# vi ~/.bashrc
[root@CentOSA ~]# source .bashrc
[root@CentOSA ~]# jps
1449 Jps
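The ~/.bashrc edit above only needs to put the JDK on the PATH at this point. A minimal sketch (the Oracle JDK RPM installs under /usr/java and maintains a /usr/java/latest symlink; the complete file, including the Hadoop entries, is shown in the next step):
JAVA_HOME=/usr/java/latest
PATH=$PATH:$JAVA_HOME/bin
export JAVA_HOME PATH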
2. Install HDFS
[root@CentOSA ~]# tar -zxf hadoop-2.9.2.tar.gz -C /usr/
[root@CentOSA ~]# vi ~/.bashrc
HADOOP_HOME=/usr/hadoop-2.9.2
JAVA_HOME=/usr/java/latest
PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
CLASSPATH=.
export JAVA_HOME
export PATH
export CLASSPATH
export HADOOP_HOME
[root@CentOSA ~]# source .bashrc
[root@CentOSA ~]# vi /usr/hadoop-2.9.2/etc/hadoop/core-site.xml
<!-- NameNode RPC endpoint (HDFS entry point) -->
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://CentOSA:9000</value>
</property>
<!-- base directory for HDFS working data -->
<property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/hadoop-2.9.2/hadoop-${user.name}</value>
</property>
[root@CentOSA ~]# vi /usr/hadoop-2.9.2/etc/hadoop/slaves
CentOSA
[root@CentOSA ~]# vi /usr/hadoop-2.9.2/etc/hadoop/hdfs-site.xml
<!-- block replication factor -->
<property>
    <name>dfs.replication</name>
    <value>1</value>
</property>
<!-- host that runs the Secondary NameNode -->
<property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>CentOSA:50090</value>
</property>
<!-- maximum number of concurrent data transfer threads per DataNode -->
<property>
    <name>dfs.datanode.max.xcievers</name>
    <value>4096</value>
</property>
<!-- number of DataNode server (handler) threads -->
<property>
    <name>dfs.datanode.handler.count</name>
    <value>6</value>
</property>
[root@CentOSA ~]# hdfs namenode -format # format the NameNode (first start only)
[root@CentOSA ~]# start-dfs.sh 
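As a quick sanity check, jps should now list the three HDFS daemons; the same processes appear again in the jps output after Flink is started below.
[root@CentOSA ~]# jps   # expect NameNode, DataNode and SecondaryNameNode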
3. Install Flink
  • Upload and extract Flink
[root@CentOSA ~]# tar -zxf flink-1.9.1-bin-scala_2.11.tgz -C /usr/
  • Configure flink-conf.yaml
[root@CentOSA ~]# vi /usr/flink-1.9.1/conf/flink-conf.yaml
jobmanager.rpc.address: CentOSA     # host that runs the JobManager
taskmanager.numberOfTaskSlots: 4    # processing slots per TaskManager
parallelism.default: 3              # default parallelism for submitted jobs
  • Configure slaves
[root@CentOSA ~]# vi /usr/flink-1.9.1/conf/slaves
CentOSA
  • Start Flink
[root@CentOSA flink-1.9.1]# ./bin/start-cluster.sh
Starting cluster.
Starting standalonesession daemon on host CentOSA.
Starting taskexecutor daemon on host CentOSA.
[root@CentOSA flink-1.9.1]# jps
1713 DataNode
3237 TaskManagerRunner
1894 SecondaryNameNode
1622 NameNode
2794 StandaloneSessionClusterEntrypoint
3277 Jps

Open the Flink dashboard at http://CentOSA:8081

Testing the environment

1. Add the Maven dependencies
<properties>
    <flink.version>1.9.1</flink.version>
    <scala.version>2.11</scala.version>
</properties>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_${scala.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-scala_${scala.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
2. Client program
import org.apache.flink.streaming.api.scala._

object Test1 {
  def main(args: Array[String]): Unit = {
    // 1. Create the stream execution environment
    val fsEnv = StreamExecutionEnvironment.getExecutionEnvironment
    // 2. Read lines from the socket and count words
    val lines: DataStream[String] = fsEnv.socketTextStream("CentOSA", 5555)
    lines.flatMap(_.split("\\s+"))
      .map((_, 1))
      .keyBy(t => t._1)
      .sum(1)
      .print()
    // 3. Launch the streaming job
    fsEnv.execute("test1")
  }
}
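The job reads from a raw TCP socket, so something must be listening on CentOSA:5555 before and while the job runs, otherwise it fails to connect. A simple way to feed it input is netcat (assuming the nc package is installed); whitespace-separated words typed into this session become the job's input:
[root@CentOSA ~]# nc -lk 5555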
  • Package the program (e.g., with mvn clean package) and copy the resulting jar to the server
[root@CentOSA ~]# cd /usr/flink-1.9.1/
[root@CentOSA flink-1.9.1]# ./bin/flink run --class Test1 --detached --parallelism 3 /root/original-FlinkStream-1.0-SNAPSHOT.jar
Starting execution of program
Job has been submitted with JobID 241250bf67445ed08d1b914e7159ced4
[root@CentOSA flink-1.9.1]# ./bin/flink list -m CentOSA:8081
Waiting for response...
------------------ Running/Restarting Jobs -------------------
28.01.2020 03:15:31 : 241250bf67445ed08d1b914e7159ced4 : test1 (RUNNING)
--------------------------------------------------------------
No scheduled jobs.
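Because the pipeline ends in print(), the word counts go to the TaskManager's standard output rather than to the client console. On this standalone setup they end up in the taskexecutor .out file under Flink's log directory (the exact file name contains the user and host, hence the glob below), and they can also be viewed in the web UI under the TaskManager's Stdout tab:
[root@CentOSA flink-1.9.1]# tail -f log/*taskexecutor*.out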
  • Cancel the job
[root@centos flink-1.9.1]# ./bin/flink cancel -m CentOSA:8081 241250bf67445ed08d1b914e7159ced4
Cancelling job 241250bf67445ed08d1b914e7159ced4.
Cancelled job 241250bf67445ed08d1b914e7159ced4.
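When finished, the standalone Flink cluster and HDFS can be shut down with their stop scripts:
[root@CentOSA flink-1.9.1]# ./bin/stop-cluster.sh
[root@CentOSA flink-1.9.1]# stop-dfs.sh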