CC00014.bdpositions|Hadoop & Real-Time Data Warehouse.V14|Project.v14|ODS Layer Processing|Writing Dimension Tables from Kafka into the DIM Layer.V2|

1. Implementing the case classes: writing dimension tables from Kafka into the DIM layer
### --- Case class 1: TableObject

package ods

/**
 * Case class holding one MySQL binlog (log_bin) event.
 * Canal converts each binlog event to JSON and publishes it to Kafka;
 * the Flink job reads the JSON from Kafka and maps it into a TableObject.
 */
case class TableObject(database: String, tableName: String, typeInfo: String, dataInfo: String) extends Serializable
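To make the mapping concrete, here is a minimal sketch of turning one Canal JSON message into a `TableObject`. It assumes fastjson is on the classpath and that the message uses Canal's usual top-level fields (`database`, `table`, `type`, `data`); the helper name is hypothetical:

```scala
import com.alibaba.fastjson.JSON

object CanalParser {
  // Hypothetical helper: parse one Canal JSON message into a TableObject.
  def parseCanalJson(msg: String): TableObject = {
    val obj = JSON.parseObject(msg)
    TableObject(
      obj.getString("database"),             // source database name
      obj.getString("table"),                // source table name
      obj.getString("type"),                 // operation type: INSERT / UPDATE / DELETE
      obj.getJSONArray("data").toJSONString  // changed rows, kept as a JSON string
    )
  }
}
```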
### --- Case class 2: AreaInfo

package ods

case class AreaInfo(
                     id: String,
                     name: String,
                     pid: String,
                     sname: String,
                     level: String,
                     citycode: String,
                     yzcode: String,
                     mername: String,
                     Lng: String,
                     Lat: String,
                     pinyin: String
                   )
### --- Case class 3: DataInfo

package ods

case class DataInfo(
                     modifiedTime: String,
                     orderNo: String,
                     isPay: String,
                     orderId: String,
                     tradeSrc: String,
                     payTime: String,
                     productMoney: String,
                     totalMoney: String,
                     dataFlag: String,
                     userId: String,
                     areaId: String,
                     createTime: String,
                     payMethod: String,
                     isRefund: String,
                     tradeType: String,
                     status: String
                   )
2. Implementing the utility classes
### --- Utility class 1: SourceKafka — use Kafka as the source, with Flink as the consumer reading data from Kafka.

package ods

import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer

class SourceKafka {
  def getKafkaSource(topicName: String): FlinkKafkaConsumer[String] = {
    val props = new Properties()
    props.setProperty("bootstrap.servers", "hadoop01:9092,hadoop02:9092,hadoop03:9092")
    props.setProperty("group.id", "consumer-group")
    props.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.setProperty("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.setProperty("auto.offset.reset", "latest")

    new FlinkKafkaConsumer[String](topicName, new SimpleStringSchema(), props)
  }
}
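A minimal sketch of wiring this source into a Flink job (the topic name `dim_area_topic` is a placeholder for illustration; it assumes the Flink Scala streaming dependencies are available):

```scala
package ods

import org.apache.flink.streaming.api.scala._

object KafkaSourceDemo {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // "dim_area_topic" is a placeholder topic name, not fixed by the project
    val kafkaSource = new SourceKafka().getKafkaSource("dim_area_topic")
    val stream: DataStream[String] = env.addSource(kafkaSource)

    stream.print() // inspect the raw Canal JSON before mapping it to TableObject
    env.execute("kafka-source-demo")
  }
}
```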
### --- Utility class 2: ConnHBase

package ods

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.client.{Connection, ConnectionFactory}
import org.apache.hadoop.hbase.{HBaseConfiguration, HConstants}

class ConnHBase {
  def connToHbase: Connection = {
    val conf: Configuration = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.quorum", "hadoop01,hadoop02,hadoop03")
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    conf.setInt(HConstants.HBASE_CLIENT_OPERATION_TIMEOUT, 30000)
    conf.setInt(HConstants.HBASE_CLIENT_SCANNER_TIMEOUT_PERIOD, 30000)
    ConnectionFactory.createConnection(conf)
  }
}
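With these two utilities in place, writing a dimension record into the DIM layer can be sketched as a Flink `RichSinkFunction` that opens one HBase connection per subtask. The HBase table name `dim_area` and column family `f1` below are illustrative assumptions, not values fixed by the project:

```scala
package ods

import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction
import org.apache.hadoop.hbase.TableName
import org.apache.hadoop.hbase.client.{Connection, Put, Table}
import org.apache.hadoop.hbase.util.Bytes

class AreaDimSink extends RichSinkFunction[AreaInfo] {
  private var conn: Connection = _
  private var table: Table = _

  override def open(parameters: Configuration): Unit = {
    conn = new ConnHBase().connToHbase
    table = conn.getTable(TableName.valueOf("dim_area")) // table name is an assumption
  }

  override def invoke(value: AreaInfo): Unit = {
    val put = new Put(Bytes.toBytes(value.id)) // row key: area id
    val cf = Bytes.toBytes("f1")               // column family "f1" is an assumption
    put.addColumn(cf, Bytes.toBytes("name"), Bytes.toBytes(value.name))
    put.addColumn(cf, Bytes.toBytes("pid"), Bytes.toBytes(value.pid))
    table.put(put)
  }

  override def close(): Unit = {
    if (table != null) table.close()
    if (conn != null) conn.close()
  }
}
```

Opening the connection in `open` rather than `invoke` matters: `invoke` runs once per record, and an HBase `Connection` is heavyweight and meant to be long-lived and shared.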