Spark SQL write to MySQL fails with: Caused by: java.lang.IllegalArgumentException: Can't get JDBC type for void

This article describes a type error encountered when writing data to MySQL with Spark SQL. The cause is that some columns in the source data have type void, which cannot be matched to the MySQL table schema. The fix is to CAST those columns to the type MySQL expects, e.g. VARCHAR.

1. Problem description

A Spark SQL job writing to MySQL fails with: Caused by: java.lang.IllegalArgumentException: Can't get JDBC type for void

Caused by: java.lang.IllegalArgumentException: Can't get JDBC type for void
at org.apache.spark.sql.errors.QueryExecutionErrors$.cannotGetJdbcTypeError(QueryExecutionErrors.scala:674)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$getJdbcType$2(JdbcUtils.scala:181)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getJdbcType(JdbcUtils.scala:181)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$savePartition$5(JdbcUtils.scala:705)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$savePartition$5$adapted(JdbcUtils.scala:705)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
at scala.collection.TraversableLike.map(TraversableLike.scala:286)
at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.savePartition(JdbcUtils.scala:705)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$saveTable$1(JdbcUtils.scala:895)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$saveTable$1$adapted(JdbcUtils.scala:893)
at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2(RDD.scala:1020)
at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2$adapted(RDD.scala:1020)
at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2254)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1462)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)


2. Cause of the problem

The source SQL used for the write was:

select cat1,cat2 from (
select cat1 as cat1,null as cat2 from test
union 
select null as cat1,cat2 as cat2 from test
)ttt

With this query, Spark cannot determine a concrete type for some of the columns (they end up as void, i.e. NullType). The DataFrame was then written to MySQL with:

    // SaveMode comes from the Spark SQL API
    import org.apache.spark.sql.SaveMode

    resultDF.write
      .format("jdbc")
      .option("dbtable", sinkTable)
      .option("url", mysqlUrl)
      .option("user", mysqlUser)
      .option("password", mysqlPwd)
      .option("truncate", "true")
      .mode(SaveMode.Overwrite)
      .save()

The corresponding MySQL columns have concrete types such as VARCHAR, while the source column type is void, so Spark cannot map it to a JDBC type and the write fails.
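
To see where the void type comes from, here is a minimal sketch (local SparkSession, hypothetical "test" table and data, not from the original article) that projects a bare NULL literal and inspects the inferred schema:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.types.NullType

    // Minimal sketch, assuming a local SparkSession and a hypothetical "test" table.
    val spark = SparkSession.builder().master("local[*]").appName("void-type-demo").getOrCreate()
    import spark.implicits._

    Seq("a", "b").toDF("cat1").createOrReplaceTempView("test")

    // A column produced only by a bare NULL literal is inferred as NullType
    // (reported as "void" in the error message); JdbcUtils has no JDBC mapping for it.
    val df = spark.sql("select cat1, null as cat2 from test")
    println(df.schema("cat2").dataType == NullType)  // true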

3. Solution

Cast the result columns explicitly to the types the MySQL table expects:

select cast(cat1 as string) as cat1,cast(cat2 as string) as cat2 from (
select cat1 as cat1,null as cat2 from test
union 
select null as cat1,cat2 as cat2 from test
)ttt
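
Equivalently, the cast can be applied on the DataFrame side just before the JDBC write. A sketch assuming the same resultDF, sinkTable, mysqlUrl, mysqlUser and mysqlPwd as in the write snippet above:

    import org.apache.spark.sql.SaveMode
    import org.apache.spark.sql.functions.col
    import org.apache.spark.sql.types.StringType

    // Cast the void columns to the type expected by the MySQL table
    // (VARCHAR corresponds to Spark's StringType) before writing.
    val fixedDF = resultDF
      .withColumn("cat1", col("cat1").cast(StringType))
      .withColumn("cat2", col("cat2").cast(StringType))

    fixedDF.write
      .format("jdbc")
      .option("dbtable", sinkTable)
      .option("url", mysqlUrl)
      .option("user", mysqlUser)
      .option("password", mysqlPwd)
      .option("truncate", "true")
      .mode(SaveMode.Overwrite)
      .save()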