Flink Standalone HA: Standalone cluster setup. Base environment (physical resources): CentOSA/B/C, CentOS 6.10 64-bit, 2 GB RAM each. Hostname/IP: CentOSA 192.168.221.136, CentOSB 192.168.221.137, CentOSC 192.168.221.138...
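The HA part of a standalone Flink cluster is normally driven by ZooKeeper plus a shared HDFS directory for JobManager state. A sketch of the relevant config entries, assuming the three hosts above run a ZooKeeper quorum on the default port; the HDFS path and web ports are illustrative:

```yaml
# conf/flink-conf.yaml (HA entries; values below are assumptions)
high-availability: zookeeper
high-availability.zookeeper.quorum: CentOSA:2181,CentOSB:2181,CentOSC:2181
high-availability.storageDir: hdfs:///flink/ha/

# conf/masters would then list the standby-capable JobManagers, e.g.:
#   CentOSA:8081
#   CentOSB:8081
```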
Recovering Flink Operator State from HDFS checkpoints with CheckpointedFunction. Dependencies: <dependencies> <dependency> <groupId>org.apache.flink</groupId> <artifactId>flink-core</artifactId> <version>...
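The excerpt pairs operator state with `CheckpointedFunction`. Below is a minimal sketch of that snapshot/restore pattern, assuming Flink 1.8.x; the class name, the buffer threshold, and `println` as the downstream write are all illustrative, not the post's actual code:

```scala
import org.apache.flink.api.common.state.{ListState, ListStateDescriptor}
import org.apache.flink.runtime.state.{FunctionInitializationContext, FunctionSnapshotContext}
import org.apache.flink.streaming.api.checkpoint.CheckpointedFunction
import org.apache.flink.streaming.api.functions.sink.SinkFunction

import scala.collection.JavaConverters._
import scala.collection.mutable.ListBuffer

// Buffers records in memory and flushes every `threshold` elements;
// the buffer survives failures because it is copied into operator state.
class BufferingSink(threshold: Int) extends SinkFunction[String] with CheckpointedFunction {

  @transient private var checkpointedState: ListState[String] = _
  private val buffer = ListBuffer[String]()

  override def invoke(value: String): Unit = {
    buffer += value
    if (buffer.size >= threshold) {
      buffer.foreach(println)   // stand-in for the real downstream write
      buffer.clear()
    }
  }

  // Called on every checkpoint: copy the in-memory buffer into managed operator state.
  override def snapshotState(ctx: FunctionSnapshotContext): Unit = {
    checkpointedState.clear()
    buffer.foreach(checkpointedState.add)
  }

  // Called on start/restore: re-read anything checkpointed before the failure.
  override def initializeState(ctx: FunctionInitializationContext): Unit = {
    val descriptor = new ListStateDescriptor[String]("buffered-elements", classOf[String])
    checkpointedState = ctx.getOperatorStateStore.getListState(descriptor)
    if (ctx.isRestored) {
      checkpointedState.get().asScala.foreach(buffer += _)
    }
  }
}
```

With an HDFS state backend configured, the restored list is whatever `snapshotState` wrote at the last completed checkpoint.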
Setting a TTL (expiry time) on Flink keyed state, restoring from HDFS checkpoints, and configuring flink-conf.yaml. Dependency: <dependency> <groupId>org.apache.flink</groupId> <artifactId>flink-core</artifactId> <version>1.8.1</version>...
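A minimal sketch of enabling a TTL on keyed state, assuming Flink 1.8.x (`StateTtlConfig` exists from 1.6 on); the descriptor name and the 10-second TTL are illustrative:

```scala
import org.apache.flink.api.common.state.{StateTtlConfig, ValueStateDescriptor}
import org.apache.flink.api.common.time.Time

val ttlConfig = StateTtlConfig
  .newBuilder(Time.seconds(10))                               // expire 10s after last update
  .setUpdateType(StateTtlConfig.UpdateType.OnCreateAndWrite)  // writes refresh the timer
  .setStateVisibility(StateTtlConfig.StateVisibility.NeverReturnExpired)
  .build()

// Attach the TTL to the state descriptor before obtaining the state handle.
val descriptor = new ValueStateDescriptor[Long]("wordCount", classOf[Long])
descriptor.enableTimeToLive(ttlConfig)
```

The TTL is part of the state's metadata, so it applies equally to state restored from an HDFS checkpoint.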
Writing Kafka data into Redis with Flink. Dependency: <dependency> <groupId>org.apache.bahir</groupId> <artifactId>flink-connector-redis_2.11</artifactId> <version>1.0</version></dependency> Code/...
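The Bahir connector above centers on a `RedisMapper` that tells the sink which Redis command to issue. A sketch, assuming a `(word, count)` stream stored as a Redis HSET; the hash name, host, and port are illustrative:

```scala
import org.apache.flink.streaming.connectors.redis.RedisSink
import org.apache.flink.streaming.connectors.redis.common.config.FlinkJedisPoolConfig
import org.apache.flink.streaming.connectors.redis.common.mapper.{RedisCommand, RedisCommandDescription, RedisMapper}

// Maps each (word, count) pair to: HSET wordcount <word> <count>
class WordCountRedisMapper extends RedisMapper[(String, Int)] {
  override def getCommandDescription: RedisCommandDescription =
    new RedisCommandDescription(RedisCommand.HSET, "wordcount") // second arg = hash name
  override def getKeyFromData(data: (String, Int)): String = data._1
  override def getValueFromData(data: (String, Int)): String = data._2.toString
}

// Wiring it into a stream (host/port are assumptions):
//   val conf = new FlinkJedisPoolConfig.Builder().setHost("CentOS").setPort(6379).build()
//   counts.addSink(new RedisSink[(String, Int)](conf, new WordCountRedisMapper))
```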
Writing Kafka data back into Kafka with Flink. Dependency: <dependency> <groupId>org.apache.flink</groupId> <artifactId>flink-connector-kafka_2.11</artifactId> <version>1.8.1</version></dependency>...
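With the universal Kafka connector named above, the pipeline is a consumer source and a producer sink. A minimal sketch, assuming Flink 1.8.x; broker address, topics, group id, and the uppercase transform are illustrative:

```scala
import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer, FlinkKafkaProducer}

val env = StreamExecutionEnvironment.getExecutionEnvironment

val consumerProps = new Properties()
consumerProps.setProperty("bootstrap.servers", "CentOS:9092")
consumerProps.setProperty("group.id", "g1")

val producerProps = new Properties()
producerProps.setProperty("bootstrap.servers", "CentOS:9092")

env
  .addSource(new FlinkKafkaConsumer[String]("topic-in", new SimpleStringSchema(), consumerProps))
  .map(_.toUpperCase)   // stand-in for the real transformation
  .addSink(new FlinkKafkaProducer[String]("topic-out", new SimpleStringSchema(), producerProps))

env.execute("kafka-to-kafka")
```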
Flink setup and testing. Environment setup. Prerequisites for installing Flink: HDFS up and running (passwordless SSH), JDK 1.8+. Upload and extract Flink: [root@CentOS ~]# tar -zxf flink-1.8.1-bin-scala_2.11.tgz -C /usr/ Configure the flink-conf.yaml file: [root@CentOS ~]# cd /usr/flink-1.8.1/ [root@...
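Past the point where the excerpt cuts off, a single-node standalone setup usually needs only a couple more steps. A sketch using the paths from the excerpt; the hostname and slot count are assumptions:

```shell
# Typical flink-conf.yaml entries (values are assumptions):
#   jobmanager.rpc.address: CentOS
#   taskmanager.numberOfTaskSlots: 4
[root@CentOS ~]# /usr/flink-1.8.1/bin/start-cluster.sh
[root@CentOS ~]# /usr/flink-1.8.1/bin/flink list   # sanity check: can we reach the JobManager?
```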
Spark Standalone HA cluster setup. Standalone cluster build. Base environment (physical resources): CentOSA/B/C, CentOS 6.10 64-bit, 2 GB RAM each. Hostname/IP: CentOSA 192.168.221.136, CentOSB 192.168.221.137, CentOSC 192.168.221.138...
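For the HA part, Standalone master failover typically goes through ZooKeeper. A sketch of the `conf/spark-env.sh` entries, assuming the three hosts above run a ZooKeeper quorum on the default port; the znode path is illustrative:

```shell
# conf/spark-env.sh: let standby masters recover state from ZooKeeper
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=CentOSA:2181,CentOSB:2181,CentOSC:2181 \
  -Dspark.deploy.zookeeper.dir=/spark"
```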
Writing Kafka data into Kafka with Structured Streaming (run as a jar). Dependencies: <dependencies> <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-core_2.11</artifactId> <versio...
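A minimal sketch of the Structured Streaming variant, assuming Spark 2.4.x with the `spark-sql-kafka-0-10` source/sink on the classpath; brokers, topics, checkpoint path, and the uppercase transform are illustrative:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("ss-kafka-to-kafka").getOrCreate()
import spark.implicits._

spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "CentOS:9092")
  .option("subscribe", "topic-in")
  .load()
  .selectExpr("CAST(value AS STRING)")  // Kafka values arrive as bytes
  .as[String]
  .map(_.toUpperCase)                   // stand-in for the real transformation
  .writeStream
  .format("kafka")                      // the Kafka sink reads the "value" column
  .option("kafka.bootstrap.servers", "CentOS:9092")
  .option("topic", "topic-out")
  .option("checkpointLocation", "hdfs:///checkpoints/ss")  // required by the Kafka sink
  .start()
  .awaitTermination()
```

Packaged as a jar, this would be submitted with `spark-submit` as the excerpt's title suggests.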
Writing Kafka data into Kafka with Spark (plan 2). Dependency: <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-core_2.11</artifactId> <version>2.4.3</version> ...
Writing Kafka data into Kafka with Spark (plan 1). Dependency: <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-core_2.11</artifactId> <version>2.4.3</version> ...
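A common shape for both of these DStream-based plans: read with the 0-10 direct stream and write with a `KafkaProducer` created once per partition. A sketch, assuming Spark 2.4.x plus `spark-streaming-kafka-0-10`; brokers, topics, group id, and the uppercase transform are illustrative:

```scala
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.{Seconds, StreamingContext}

object KafkaToKafka {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(new SparkConf().setAppName("kafka-to-kafka"), Seconds(5))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "CentOS:9092",
      "key.deserializer"  -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "g1")

    KafkaUtils.createDirectStream[String, String](
        ssc, PreferConsistent, Subscribe[String, String](Array("topic-in"), kafkaParams))
      .map(_.value().toUpperCase)               // stand-in for the real transformation
      .foreachRDD(_.foreachPartition { iter =>
        // One producer per partition: avoids serializing the producer to executors.
        val props = new java.util.Properties()
        props.put("bootstrap.servers", "CentOS:9092")
        props.put("key.serializer", classOf[StringSerializer].getName)
        props.put("value.serializer", classOf[StringSerializer].getName)
        val producer = new KafkaProducer[String, String](props)
        iter.foreach(v => producer.send(new ProducerRecord("topic-out", v)))
        producer.close()
      })

    ssc.start()
    ssc.awaitTermination()
  }
}
```

The two plans likely differ in producer lifecycle (per-partition vs. a broadcast/pooled producer); the per-partition form above is the simpler baseline.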
Writing HDFS data into MySQL (MySQL on the VM) with MapReduce. Dependency: <dependency> <groupId>org.apache.hadoop</groupId> <artifactId>hadoop-client</artifactId> <version>2.9.2</versio...
Writing HDFS data into HBase with MapReduce. Dependency: <dependency> <groupId>org.apache.hadoop</groupId> <artifactId>hadoop-hdfs</artifactId> <version>2.6.0</version> ...
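For the HBase case, the reduce side usually extends `TableReducer` and emits `Put` mutations. A sketch of a word-count reducer, assuming HBase 1.x+ client APIs; the column family and qualifier are illustrative:

```scala
import org.apache.hadoop.hbase.client.{Mutation, Put}
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableReducer
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.io.{IntWritable, Text}
import org.apache.hadoop.mapreduce.Reducer

// Sums counts per word and writes each total as one HBase row: rowkey = word.
class WordCountTableReducer extends TableReducer[Text, IntWritable, ImmutableBytesWritable] {
  override def reduce(key: Text, values: java.lang.Iterable[IntWritable],
      context: Reducer[Text, IntWritable, ImmutableBytesWritable, Mutation]#Context): Unit = {
    var sum = 0
    val it = values.iterator()
    while (it.hasNext) sum += it.next().get()

    val put = new Put(Bytes.toBytes(key.toString))
    put.addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("count"), Bytes.toBytes(sum))
    context.write(new ImmutableBytesWritable(put.getRow), put)
  }
}
```

The job would then be wired up with `TableMapReduceUtil.initTableReducerJob` against the target table.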
Writing HDFS data into HBase with Spark. Step 1, dependency: <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-core_2.11</artifactId> <version>2.4.3</version>...
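The usual Spark-to-HBase write path goes through `TableOutputFormat` and `saveAsNewAPIHadoopDataset`. A sketch, assuming Spark 2.4.x with the HBase client/mapreduce jars on the classpath; the input path, table name, ZooKeeper quorum, and column names are illustrative:

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableOutputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("hdfs-to-hbase"))

// Tell TableOutputFormat where the cluster is and which table to write.
val conf = HBaseConfiguration.create()
conf.set("hbase.zookeeper.quorum", "CentOS")
conf.set(TableOutputFormat.OUTPUT_TABLE, "t_word")
val job = Job.getInstance(conf)
job.setOutputFormatClass(classOf[TableOutputFormat[ImmutableBytesWritable]])

sc.textFile("hdfs:///demo/words")
  .flatMap(_.split("\\s+"))
  .map((_, 1))
  .reduceByKey(_ + _)
  .map { case (word, count) =>
    val put = new Put(Bytes.toBytes(word))   // rowkey = the word
    put.addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("count"), Bytes.toBytes(count))
    (new ImmutableBytesWritable(Bytes.toBytes(word)), put)
  }
  .saveAsNewAPIHadoopDataset(job.getConfiguration)
```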
Displaying Kafka data on the console with Spark Streaming. Step 1, dependency: <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-streaming-kafka-0-10_2.11</artifactId> <version>2.4.3</version></dep...
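A minimal sketch of the console case with the 0-10 connector named above, assuming Spark 2.4.x; brokers, topic, and group id are illustrative:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.{Seconds, StreamingContext}

val ssc = new StreamingContext(
  new SparkConf().setMaster("local[2]").setAppName("kafka-to-console"), Seconds(1))

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "CentOS:9092",
  "key.deserializer"  -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> "g1")

KafkaUtils.createDirectStream[String, String](
    ssc, PreferConsistent, Subscribe[String, String](Array("topic01"), kafkaParams))
  .map(_.value())
  .print()   // dump each micro-batch to stdout

ssc.start()
ssc.awaitTermination()
```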
Displaying MySQL data on the console with Spark Streaming. Step 1, dependency: <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-streaming-kafka-0-10_2.11</artifactId> <version>2.4.3</version></dep...