Flink SQL errors

Setup: a Kafka sink table plus an aggregation INSERT, run from the Flink SQL client.

create table kafka_sink_upsert(
    `id` string,
    `ts` bigint,
    `vc` double,
    PRIMARY KEY (`id`) NOT ENFORCED
) with (
    'connector' = 'kafka', 
    'topic' = 'test03',
    'properties.bootstrap.servers' = 'hadoop102:9092',
    'key.format' = 'json',
    'value.format' = 'json'
);

insert into kafka_sink_upsert select id,max(ts) ts,sum(vc) vc from kafka_source group by id;
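For reference, the aggregation in this INSERT keeps, per id, the maximum ts and the running sum of vc. A minimal Python sketch of that semantics (the sample rows are made up for illustration):

```python
# Sketch of the per-key aggregation performed by the INSERT above:
# for each id, keep max(ts) and a running sum(vc).
def aggregate(rows):
    state = {}  # id -> (max_ts, sum_vc)
    for rid, ts, vc in rows:
        max_ts, sum_vc = state.get(rid, (ts, 0.0))
        state[rid] = (max(max_ts, ts), sum_vc + vc)
    return state

rows = [("s1", 1, 1.0), ("s1", 5, 2.0), ("s2", 3, 9.0)]
print(aggregate(rows))  # {'s1': (5, 3.0), 's2': (3, 9.0)}
```

Because the query is a continuous aggregation, every incoming row can update the result for its id, which is exactly why the sink needs upsert semantics (see error 2 below).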

Error 1:

Flink SQL> insert into kafka_sink_upsert select id,max(ts) ts,sum(vc) vc from kafka_source group by id;
[ERROR] Could not execute SQL statement. Reason:
org.apache.calcite.sql.validate.SqlValidatorException: Object 'kafka_source' not found

Cause: the kafka_source table was never created.

Fix: create the kafka_source table.

create table kafka_source(
    `id` string,
    `ts` bigint,
    `vc` double
) with (
    'connector' = 'kafka',
    'topic' = 'test',
    'properties.bootstrap.servers' = 'hadoop102:9092',
    'properties.group.id' = 'sql_test_0524',
    'scan.startup.mode' = 'latest-offset',
    'format' = 'csv'
);
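With 'format' = 'csv', each Kafka record is a comma-separated line whose fields follow the column order of the table. A hedged Python sketch of how one such record maps onto the (id, ts, vc) schema of the DDL above (the sample value is made up):

```python
import csv
import io

def parse_record(line):
    """Parse one CSV-formatted Kafka record into the (id, ts, vc) schema."""
    rid, ts, vc = next(csv.reader(io.StringIO(line)))
    return {"id": rid, "ts": int(ts), "vc": float(vc)}

print(parse_record("sensor_1,1000,9.9"))  # {'id': 'sensor_1', 'ts': 1000, 'vc': 9.9}
```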

Error 2:

Flink SQL> insert into kafka_sink_upsert select id,max(ts) ts,sum(vc) vc from kafka_source group by id;
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.table.api.ValidationException: The Kafka table 'default_catalog.default_database.kafka_sink_upsert' with 'json' format doesn't support defining PRIMARY KEY constraint on the table, because it can't guarantee the semantic of primary key.

Cause: the sink table was created with the plain 'kafka' connector, which does not support a PRIMARY KEY constraint with the 'json' format; upsert writes require the 'upsert-kafka' connector.

Solution: drop the table and recreate it with the right connector.

Flink SQL> drop table kafka_sink_upsert;

Then recreate the table:

create table kafka_sink_upsert(
    `id` string,
    `ts` bigint,
    `vc` double,
    PRIMARY KEY (`id`) NOT ENFORCED
) with (
    'connector' = 'upsert-kafka',  -- note the connector type
    'topic' = 'test03',
    'properties.bootstrap.servers' = 'hadoop102:9092',
    'key.format' = 'json',
    'value.format' = 'json'
);
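With upsert-kafka, the PRIMARY KEY columns are serialized into the record key and the row into the record value; a retraction is written as a record with a null value (a tombstone). A Python sketch of how a reader materializes such a changelog (the record shapes here are illustrative, not the connector's wire format):

```python
def materialize(changelog):
    """Replay an upsert stream: the last value per key wins; None deletes (tombstone)."""
    table = {}
    for key, value in changelog:
        if value is None:
            table.pop(key, None)  # tombstone: delete the row for this key
        else:
            table[key] = value    # upsert: overwrite any previous row for this key
    return table

changelog = [
    ("s1", {"ts": 1, "vc": 1.0}),
    ("s1", {"ts": 5, "vc": 3.0}),  # later update for the same key
    ("s2", {"ts": 3, "vc": 9.0}),
    ("s2", None),                  # retraction of s2
]
print(materialize(changelog))  # {'s1': {'ts': 5, 'vc': 3.0}}
```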

Error 3:

Flink SQL> select * from kafka_sink_upsert;

[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParseException: Unexpected character
 at [Source: UNKNOWN; line: 1, column: 6]

Cause: dirty (non-JSON) records in the Kafka topic being consumed.

Solution: switch to a fresh topic and consume from there.
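The failure is easy to reproduce outside Flink: the 'json' value format tries to parse every record as JSON, so a leftover CSV-style row in the topic breaks the deserializer. An illustrative Python analogue (the sample record is made up):

```python
import json

record = "s1,1000,9.9"  # a CSV-style record sitting in the topic; not valid JSON
try:
    json.loads(record)
except json.JSONDecodeError as e:
    # Mirrors the JsonParseException Flink reports: parsing stops at the first bad character.
    print(f"parse failed at line {e.lineno}, column {e.colno}: {e.msg}")
```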

Error 4:

Flink SQL> CREATE CATALOG myhive WITH (
>     'type' = 'hive',
>     'default-database' = 'default',
>     'hive-conf-dir' = '/opt/module/hive/conf'
> );
[ERROR] Could not execute SQL statement. Reason:
java.lang.IllegalArgumentException: Embedded metastore is not allowed. Make sure you have set a valid value for hive.metastore.uris

Cause: hive.metastore.uris is not set in the Hive configuration, so Flink cannot reach a standalone metastore (an embedded metastore is not allowed).

Solution: add the following to hive/conf/hive-site.xml:

    <!-- hive metastore service address -->
    <property>
        <name>hive.metastore.uris</name>
        <value>thrift://hadoop102:9083</value>
    </property>

Then start the Hive metastore service, and restart the Flink cluster and SQL client:

hive --service metastore

bin/yarn-session.sh

bin/sql-client.sh embedded -s yarn-session
