Hive data load failure: Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable (null)

While loading data in Hive (extracting rows from a set of source tables into a single target table), the INSERT failed with the following error message:

2025-09-28 16:31:48,477 WARN  [Thread-1064] shims.HadoopShimsSecure (Hadoop23Shims.java:getTaskAttemptLogUrl(154)) - Can't fetch tasklog: TaskLogServlet is not supported in MR2 mode.

Task with the most failures(4):
-----
Task ID:
  task_1758702773770_1129_m_000000

URL:
  http://repo.spring.io:8088/taskdetails.jsp?jobid=job_1758702773770_1129&tipid=task_1758702773770_1129_m_000000
-----
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable (null)
        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
        at org.apache.hadoop.hive.ql.exec.mr.ExecMapRunner.run(ExecMapRunner.java:37)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:466)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:350)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:178)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:172)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable (null)
        at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:568)
        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:154)
        ... 9 more
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.io.ParquetHiveRecord cannot be cast to org.apache.hadoop.io.BytesWritable
        at org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat$1.write(HiveIgnoreKeyTextOutputFormat.java:91)
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:987)
        at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:995)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:941)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:928)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
        at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:995)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:941)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:928)
        at org.apache.hadoop.hive.ql.exec.UDTFOperator.forwardUDTFOutput(UDTFOperator.java:133)
        at org.apache.hadoop.hive.ql.udf.generic.UDTFCollector.collect(UDTFCollector.java:45)
        at org.apache.hadoop.hive.ql.udf.generic.GenericUDTF.forward(GenericUDTF.java:110)
        at org.apache.hadoop.hive.ql.udf.generic.GenericUDTFInline.process(GenericUDTFInline.java:64)
        at org.apache.hadoop.hive.ql.exec.UDTFOperator.process(UDTFOperator.java:116)
        at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:995)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:941)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:928)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
        at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:995)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:941)
        at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:125)
        at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:153)
        at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:555)
        ... 10 more



I searched around online without finding a fix, and then it occurred to me that the CREATE TABLE statement itself might be the problem. I recreated the table and the error went away.

The root cause: the original DDL only declared the Parquet SerDe via ROW FORMAT SERDE and never specified STORED AS PARQUET, so the table kept Hive's default TEXTFILE input/output formats. On insert, the Parquet SerDe produced ParquetHiveRecord objects, which the text output format (HiveIgnoreKeyTextOutputFormat) tried to cast to BytesWritable, hence the ClassCastException in the stack trace above.

The original CREATE TABLE statement:

CREATE TABLE cust.dws_tr_contract_customer_info(
  id string COMMENT 'primary key',
  pact_code string COMMENT 'contract number',
...
  representative_level string COMMENT 'legal representative credit rating')
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
LOCATION
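
You can see the mismatch directly in the table metadata. A minimal check, assuming the same cust database is accessible (the output shown in comments is an abbreviated sketch of what Hive typically records when only ROW FORMAT SERDE is specified):

-- Inspect the SerDe and file formats the table actually ended up with
DESCRIBE FORMATTED cust.dws_tr_contract_customer_info;

-- Abbreviated expected output for the original table: a Parquet SerDe
-- paired with Hive's default text file formats
--   SerDe Library: org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe
--   InputFormat:   org.apache.hadoop.mapred.TextInputFormat
--   OutputFormat:  org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat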

The new CREATE TABLE statement:

CREATE TABLE cust.dws_tr_contract_customer_info(
  id string COMMENT 'primary key',
  pact_code string COMMENT 'contract number',
...
  representative_level string COMMENT 'legal representative credit rating')
STORED AS PARQUET
LOCATION
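
STORED AS PARQUET is shorthand that sets the SerDe and both file formats together, so they can no longer disagree. For reference, a sketch of what it expands to (columns elided as in the DDL above), plus an in-place alternative using ALTER TABLE ... SET FILEFORMAT, which on recent Hive versions should fix the metadata without dropping and recreating the table:

-- Equivalent expanded form of STORED AS PARQUET: SerDe plus matching formats
CREATE TABLE cust.dws_tr_contract_customer_info(
  id string COMMENT 'primary key',
  pact_code string COMMENT 'contract number',
  -- ... remaining columns as in the original DDL ...
  representative_level string COMMENT 'legal representative credit rating')
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
STORED AS INPUTFORMAT
  'org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat';

-- In-place alternative: switch the existing table to Parquet file formats
ALTER TABLE cust.dws_tr_contract_customer_info SET FILEFORMAT PARQUET;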