【streamx】Reading Kafka data and writing it to MySQL with StreamX
1. Versions
streamx: 1.2.3
scala: 2.11.8
mysql: 5.7.33
kafka: 0.10.1
2. Code
1. Source table
CREATE TABLE user_log (
  user_id VARCHAR,
  item_id BIGINT,
  category_id BIGINT,
  behavior VARCHAR,     -- referenced by the sample data below
  ts TIMESTAMP(3)       -- referenced by the INSERT query below
) WITH (
  'connector' = 'kafka',
  'topic' = 'testly',
  'properties.bootstrap.servers' = 'localhost:8090',
  'properties.group.id' = 'testGroup',
  'scan.startup.mode' = 'latest-offset',
  'format' = 'json'
);
2. Sink table
CREATE TABLE pvuv_sink (
  dt VARCHAR,
  pv BIGINT,
  uv BIGINT,
  PRIMARY KEY (dt) NOT ENFORCED
) WITH (
  'connector' = 'jdbc',                            -- use the jdbc connector
  'url' = 'jdbc:mysql://localhost:3306/streamx',   -- jdbc url
  'table-name' = 'pvuv_sink',                      -- table name
  'username' = 'root',                             -- username
  'password' = 'Qzcdh2021@'                        -- password
);
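The Flink DDL above only registers the connector; the physical table must already exist on the MySQL side before the job starts. A minimal sketch of the MySQL DDL, mirroring the Flink schema (the VARCHAR length is an assumption):

```sql
-- Run in the `streamx` database before starting the Flink job.
-- dt is the primary key so upsert writes update the same hour bucket.
CREATE TABLE pvuv_sink (
  dt VARCHAR(32) NOT NULL,
  pv BIGINT,
  uv BIGINT,
  PRIMARY KEY (dt)
);
```

With a primary key declared, the JDBC connector writes in upsert mode, so the continuously updated pv/uv values for an hour overwrite the previous row instead of appending duplicates.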
3. Insert logic
INSERT INTO pvuv_sink
SELECT
  DATE_FORMAT(ts, 'yyyy-MM-dd HH:00') AS dt,
  COUNT(*) AS pv,
  COUNT(DISTINCT user_id) AS uv
FROM user_log
GROUP BY DATE_FORMAT(ts, 'yyyy-MM-dd HH:00');
4. Test data
{"user_id": "锤锤1", "item_id": "1715", "category_id": "1464116", "behavior": "pv", "ts": "2021-02-01 01:00:00"}
{"user_id": "强强2", "item_id": "2244074", "category_id": "1575622", "behavior": "pv", "ts": "2021-02-01 01:00:00"}
{"user_id": "豆豆3", "item_id": "2244074", "category_id": "1575622", "behavior": "pv", "ts": "2021-02-01 01:00:00"}
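The sample records above can be pushed into the `testly` topic with Kafka's console producer. A sketch assuming a local Kafka install (the `KAFKA_HOME` path and the `/tmp` file name are placeholders; the broker address matches the WITH clause above):

```shell
# Write the three sample events, one JSON object per line, to a temp file.
cat > /tmp/user_log_sample.json <<'EOF'
{"user_id": "锤锤1", "item_id": "1715", "category_id": "1464116", "behavior": "pv", "ts": "2021-02-01 01:00:00"}
{"user_id": "强强2", "item_id": "2244074", "category_id": "1575622", "behavior": "pv", "ts": "2021-02-01 01:00:00"}
{"user_id": "豆豆3", "item_id": "2244074", "category_id": "1575622", "behavior": "pv", "ts": "2021-02-01 01:00:00"}
EOF

# Pipe the file into the topic (uncomment once Kafka is running;
# --broker-list matches 'properties.bootstrap.servers' in the source DDL):
# "$KAFKA_HOME"/bin/kafka-console-producer.sh --broker-list localhost:8090 --topic testly < /tmp/user_log_sample.json
```

Because `scan.startup.mode` is `latest-offset`, the Flink job must already be running before these records are produced, otherwise they are skipped.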
3. StreamX FlinkSQL configuration
Flink Version: 1.13.0
Upload Jars: flink-connector-jdbc_2.11-1.13.0.jar, mysql-connector-java-5.1.47.jar
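Once the job is submitted and running from the StreamX console, the result can be checked on the MySQL side. With the three sample records above (all in the same hour, three distinct users), one row is expected with pv = 3 and uv = 3:

```sql
-- Run against the `streamx` database.
SELECT dt, pv, uv FROM pvuv_sink;
```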