Flink Iceberg Connector


1. Creating a Table

In Flink's current catalog (by default default_catalog, of type GenericInMemoryCatalog), create a table that maps to the corresponding Iceberg table:

Flink SQL> create temporary table default_catalog.default_database.my_user(
> user_id bigint,
> user_name string,
> birthday date,
> country string
> ) with (
> 'connector'='iceberg',
> 'catalog-type'='hadoop',
> 'catalog-name'='hadoop_catalog',
> 'catalog-database'='iceberg_db',
> 'catalog-table'='my_user',
> 'warehouse'='hdfs://nnha/user/iceberg/warehouse'
> );
[INFO] Execute statement succeed.

Flink SQL>
  • catalog-type: defaults to hive. If the type is hive, you must also add the table property 'uri'='thrift://hive1:9083'
  • catalog-database: defaults to the Flink database the session is currently in, e.g. default_database inside default_catalog
  • catalog-table: defaults to the table name in the Flink create table statement, here my_user
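Putting the first bullet into practice, a Hive-catalog variant of the mapping might look like the sketch below. This is an illustration only: the thrift URI, catalog name, and database are placeholders you would replace with your own Hive Metastore settings.

```sql
-- Hypothetical Hive-catalog variant of the table mapping above;
-- 'uri' points at the Hive Metastore thrift endpoint (placeholder host/port)
CREATE TEMPORARY TABLE default_catalog.default_database.my_user (
  user_id BIGINT,
  user_name STRING,
  birthday DATE,
  country STRING
) WITH (
  'connector' = 'iceberg',
  'catalog-type' = 'hive',
  'catalog-name' = 'hive_catalog',
  'uri' = 'thrift://hive1:9083',
  'catalog-database' = 'iceberg_db',
  'catalog-table' = 'my_user'
);
```

Note that with catalog-type hive the warehouse location is normally resolved from the Metastore, so the 'warehouse' property used in the hadoop-catalog example is not required here.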

2. Inserting and Querying Data

Flink SQL> set 'sql-client.execution.result-mode' = 'tableau';
[INFO] Session property has been set.

Flink SQL>
Flink SQL> select * from default_catalog.default_database.my_user;
+----+----------------------+--------------------------------+------------+--------------------------------+
| op |              user_id |                      user_name |   birthday |                        country |
+----+----------------------+--------------------------------+------------+--------------------------------+
| +I |                    5 |                       zhao_liu | 2022-02-02 |                          japan |
| +I |                    2 |                      zhang_san | 2022-02-01 |                          china |
| +I |                    1 |                      zhang_san | 2022-02-01 |                          china |
+----+----------------------+--------------------------------+------------+--------------------------------+
Received a total of 3 rows

Flink SQL>
Flink SQL> insert into default_catalog.default_database.my_user(user_id, user_name, birthday, country) values(6, 'zhang_san', date '2022-02-01', 'china');
[INFO] Submitting SQL update statement to the cluster...
[INFO] SQL update statement has been successfully submitted to the cluster:
Job ID: 0b3e692d5100fb075668fc7a32d7f3e4


Flink SQL> select * from my_user;
+----+----------------------+--------------------------------+------------+--------------------------------+
| op |              user_id |                      user_name |   birthday |                        country |
+----+----------------------+--------------------------------+------------+--------------------------------+
| +I |                    6 |                      zhang_san | 2022-02-01 |                          china |
| +I |                    5 |                       zhao_liu | 2022-02-02 |                          japan |
| +I |                    2 |                      zhang_san | 2022-02-01 |                          china |
| +I |                    1 |                      zhang_san | 2022-02-01 |                          china |
+----+----------------------+--------------------------------+------------+--------------------------------+
Received a total of 4 rows

Flink SQL>
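Beyond the batch-style reads shown above, the Iceberg Flink source can also monitor the table for new snapshots and emit them continuously. A sketch using SQL hints, assuming the connector version in use supports the streaming read options:

```sql
-- Hypothetical streaming read: keep polling the table for new snapshots;
-- 'monitor-interval' controls how often new data files are discovered
SELECT * FROM default_catalog.default_database.my_user
/*+ OPTIONS('streaming'='true', 'monitor-interval'='10s') */;
```

Run this way, the query stays open and prints newly committed rows as later INSERT jobs (like the one above) commit snapshots to the Iceberg table.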