Hive "insert into" fails with org.codehaus.jackson.JsonParseException: Unexpected end-of-input: was expecting closing '"' for name

The problem is as described in the title. The full error is as follows:

Caused by: java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException Encountered exception determining schema. Returning signal schema to indicate problem: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: was expecting closing '"' for name
2020/11/11 14:00:19 -   at [Source: java.io.StringReader@50bf3cc6; line: 1, column: 6001])
2020/11/11 14:00:19 -  
2020/11/11 14:00:19 -  	at org.pentaho.di.core.database.Database.execStatement(Database.java:1570)
2020/11/11 14:00:19 -  	at org.pentaho.di.core.database.Database.execStatement(Database.java:1518)
2020/11/11 14:00:19 -  	at org.pentaho.big.data.kettle.plugins.hive.trans.HiveOutput.loadTempToTable(HiveOutput.java:404)
2020/11/11 14:00:19 -  	at org.pentaho.big.data.kettle.plugins.hive.trans.HiveOutput.processRow(HiveOutput.java:73)
2020/11/11 14:00:19 -  	at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2020/11/11 14:00:19 -  	at java.lang.Thread.run(Thread.java:745)
2020/11/11 14:00:19 -  Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException Encountered exception determining schema. Returning signal schema to indicate problem: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: was expecting closing '"' for name
2020/11/11 14:00:19 -   at [Source: java.io.StringReader@50bf3cc6; line: 1, column: 6001])
2020/11/11 14:00:19 -  	at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:262)
2020/11/11 14:00:19 -  	at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:248)
2020/11/11 14:00:19 -  	at org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:297)
2020/11/11 14:00:19 -  	at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:238)
2020/11/11 14:00:19 -  	at sun.reflect.GeneratedMethodAccessor74.invoke(Unknown Source)
2020/11/11 14:00:19 -  	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2020/11/11 14:00:19 -  	at java.lang.reflect.Method.invoke(Method.java:498)
2020/11/11 14:00:19 -  	at org.pentaho.hadoop.shim.common.DriverProxyInvocationChain$CaptureResultSetInvocationHandler.invoke(DriverProxyInvocationChain.java:596)
2020/11/11 14:00:19 -  	at com.sun.proxy.$Proxy58.execute(Unknown Source)
2020/11/11 14:00:19 -  	at org.pentaho.di.core.database.Database.execStatement(Database.java:1544)
2020/11/11 14:00:19 -  	... 5 more

I have run into this problem several times. The cause is that the param_value column of the table_params table in the Hive metastore database is too short (varchar(4000)), so the stored schema of a table with many columns gets truncated. When the insert into statement runs, Hive deserializes that truncated schema, which produces the JSON parse error above.
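The truncation can be confirmed directly in the metastore database. The sketch below assumes a MySQL-backed metastore with the standard Hive metastore schema (tables TABLE_PARAMS and TBLS); the PARAM_KEY that holds the JSON schema depends on the SerDe in use — 'avro.schema.literal' is used here as an example for Avro-backed tables, adjust it for your setup.

```sql
-- Find tables whose stored schema has hit the varchar(4000) limit.
-- Run against the metastore database, not against Hive itself.
SELECT t.TBL_NAME,
       LENGTH(p.PARAM_VALUE) AS schema_len   -- values pinned at 4000 indicate truncation
FROM TABLE_PARAMS p
JOIN TBLS t ON t.TBL_ID = p.TBL_ID
WHERE p.PARAM_KEY = 'avro.schema.literal'    -- example key; depends on your SerDe
  AND LENGTH(p.PARAM_VALUE) >= 4000;
```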

Solution:
In the Hive configuration on your big-data platform, find the metastore database connection settings, connect to the metastore database, and change the type of the param_value column in the table_params table from varchar(4000) to longtext. Then recreate the Hive table; its schema will no longer be truncated.
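The column change above can be sketched as a single DDL statement. This assumes a MySQL-backed metastore (the connection URL is in hive-site.xml under javax.jdo.option.ConnectionURL); the metastore database name "hive" is a placeholder — use whatever your deployment configured.

```sql
-- Widen PARAM_VALUE so long table schemas are no longer truncated.
USE hive;  -- placeholder: your metastore database name may differ
ALTER TABLE TABLE_PARAMS MODIFY PARAM_VALUE LONGTEXT;
```

Note that this only fixes schemas written after the change; tables whose schema was already truncated must be dropped and recreated, as the solution above says.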

Recording this here for anyone who hits it later.
