1. Using the @DalTransactional annotation in a Spring Boot project
Symptom: the annotation was added to a business method so that its modifications would run in a transaction, but the method also sent a Kafka message. By the time the consumer received the message and queried MySQL, the transaction had not yet committed, so the consumer could not see the latest data. This created the illusion that Kafka was losing messages.
Solution: keep all the code that modifies business data inside the transactional method, excluding the MQ send; once the transaction has committed, send the Kafka message from the outer, non-transactional layer.
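A minimal sketch of that ordering in plain Java (the names updateBusinessData/updateAndNotify are hypothetical, and the transaction commit and Kafka send are simulated with log entries so the sequence is visible):

```java
import java.util.ArrayList;
import java.util.List;

public class OutboxAfterCommit {
    static List<String> events = new ArrayList<>();

    // Stands in for the @DalTransactional inner method: only DB writes here.
    // The transaction commits when the annotated method returns.
    static void updateBusinessData() {
        events.add("db-write");
        events.add("commit");
    }

    // Outer, non-transactional method: publish to Kafka only after commit,
    // so a consumer that immediately queries MySQL sees the committed rows.
    static void updateAndNotify() {
        updateBusinessData();
        events.add("kafka-send");
    }

    public static void main(String[] args) {
        updateAndNotify();
        System.out.println(events); // prints [db-write, commit, kafka-send]
    }
}
```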
2. ClickHouse inserts: fields fed to input() in the SQL must not be null, otherwise the write fails
Symptom: while Flink was consuming UBT data, step-by-step logging showed nothing wrong at any stage; the root cause turned out to be a null field in the insert SQL, which is why the rows never appeared when querying in DBeaver.
Solution:
None of the fields selected from input() may be null; fields that might be null must be coalesced to the empty string '' before binding. The original snippet also assigned the SQL string directly to a PreparedStatement, which does not compile; the statement has to be prepared on a Connection. Cleaned up:

    String sql = "insert into ab_app_division_log select division_id, division_type, platform_code, exp_code, ab_version, layer, domain, mod, version, effective_time, currentTime, d from input('division_id String, division_type String, platform_code String, exp_code String, ab_version String, layer String, domain String, mod String, version String, effective_time String, currentTime DateTime, d Date')";
    PreparedStatement preparedStatement = connection.prepareStatement(sql);
    // coalesce null to "" so input() never receives a null String
    java.util.function.UnaryOperator<String> nvl = s -> s == null ? "" : s;
    try {
        int count = 0;
        for (ABModel abModel : elements) {
            preparedStatement.setString(1, nvl.apply(abModel.getDivisionId()));
            preparedStatement.setString(2, nvl.apply(abModel.getDivisionType()));
            preparedStatement.setString(3, nvl.apply(abModel.getPlatformCode()));
            preparedStatement.setString(4, nvl.apply(abModel.getExpCode()));
            preparedStatement.setString(5, nvl.apply(abModel.getAbVersion()));
            preparedStatement.setString(6, nvl.apply(abModel.getLayer()));
            preparedStatement.setString(7, nvl.apply(abModel.getDomain()));
            preparedStatement.setString(8, nvl.apply(abModel.getMod()));
            preparedStatement.setString(9, nvl.apply(abModel.getSendFrom()));
            preparedStatement.setString(10, nvl.apply(abModel.getStartTime()));
            preparedStatement.setObject(11, abModel.getTs());
            preparedStatement.setObject(12, new java.sql.Date(abModel.getTs().getTime()));
            preparedStatement.addBatch();
            count++;
        }
        preparedStatement.executeBatch();
    } finally {
        preparedStatement.close();
    }
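The null-coalescing rule can be isolated as a tiny helper (the name nvl is my own, not part of the original snippet): every possibly-null String is mapped to "" before being bound to the statement, so input() never sees a null.

```java
public class Nvl {
    // input() rejects null values, so map a possibly-null String to ""
    static String nvl(String s) {
        return s == null ? "" : s;
    }

    public static void main(String[] args) {
        // bind nvl(abModel.getDivisionId()) instead of the raw getter
        System.out.println(nvl(null));   // prints an empty line
        System.out.println(nvl("exp1")); // prints exp1
    }
}
```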