Writing Avro files with SparkSession fails with java.lang.NoSuchMethodError: org.apache.spark.sql.package$.SPARK_VERSION_METADATA_KEY

Using SparkSession with the spark-avro integration to write an Avro file fails. The code and exception are as follows:

sparkSession.sql(docSql).coalesce(1).write().format("avro").mode("overwrite").save(outputPath+"/3");
23/05/12 06:28:28 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 8.0 (TID 656, ip-192-168-80-172.cn-northwest-1.compute.internal, executor 3): java.lang.NoSuchMethodError: org.apache.spark.sql.package$.SPARK_VERSION_METADATA_KEY()Ljava/lang/String;
	at org.apache.spark.sql.avro.AvroOutputWriter.<init>(AvroOutputWriter.scala:52)
	at org.apache.spark.sql.avro.AvroOutputWriterFactory.newInstance(AvroOutputWriterFactory.scala:43)
	at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.newOutputWriter(FileFormatDataWriter.scala:120)
	at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.<init>(FileFormatDataWriter.scala:108)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:236)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:170)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:169)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

23/05/12 06:28:28 INFO scheduler.TaskSetManager: Starting task 0.1 in stage 8.0 (TID 657, ip-192-168-80-172.cn-northwest-1.compute.internal, executor 3, partition 0, NODE_LOCAL, 12686 bytes)
23/05/12 06:28:28 INFO scheduler.TaskSetManager: Lost task 0.1 in stage 8.0 (TID 657) on ip-192-168-80-172.cn-northwest-1.compute.internal, executor 3: java.lang.NoSuchMethodError (org.apache.spark.sql.package$.SPARK_VERSION_METADATA_KEY()Ljava/lang/String;) [duplicate 1]
23/05/12 06:28:28 INFO scheduler.TaskSetManager: Starting task 0.2 in stage 8.0 (TID 658, ip-192-168-80-172.cn-northwest-1.compute.internal, executor 2, partition 0, NODE_LOCAL, 12686 bytes)
23/05/12 06:28:28 INFO storage.BlockManagerInfo: Added broadcast_10_piece0 in memory on ip-192-168-80-172.cn-northwest-1.compute.internal:36977 (size: 121.6 KB, free: 3.7 GB)
23/05/12 06:28:28 INFO scheduler.TaskSetManager: Lost task 0.2 in stage 8.0 (TID 658) on ip-192-168-80-172.cn-northwest-1.compute.internal, executor 2: java.lang.NoSuchMethodError (org.apache.spark.sql.package$.SPARK_VERSION_METADATA_KEY()Ljava/lang/String;) [duplicate 2]
23/05/12 06:28:28 INFO scheduler.TaskSetManager: Starting task 0.3 in stage 8.0 (TID 659, ip-192-168-80-172.cn-northwest-1.compute.internal, executor 2, partition 0, NODE_LOCAL, 12686 bytes)
23/05/12 06:28:28 INFO scheduler.TaskSetManager: Lost task 0.3 in stage 8.0 (TID 659) on ip-192-168-80-172.cn-northwest-1.compute.internal, executor 2: java.lang.NoSuchMethodError (org.apache.spark.sql.package$.SPARK_VERSION_METADATA_KEY()Ljava/lang/String;) [duplicate 3]
23/05/12 06:28:28 ERROR scheduler.TaskSetManager: Task 0 in stage 8.0 failed 4 times; aborting job
23/05/12 06:28:28 INFO cluster.YarnClusterScheduler: Removed TaskSet 8.0, whose tasks have all completed, from pool 
23/05/12 06:28:28 INFO cluster.YarnClusterScheduler: Cancelling stage 8
23/05/12 06:28:28 INFO cluster.YarnClusterScheduler: Killing all running tasks in stage 8: Stage cancelled
23/05/12 06:28:28 INFO scheduler.DAGScheduler: ResultStage 8 (save at JingXKScopeScore.java:101) failed in 0.329 s due to Job aborted due to stage failure: Task 0 in stage 8.0 failed 4 times, most recent failure: Lost task 0.3 in stage 8.0 (TID 659, ip-192-168-80-172.cn-northwest-1.compute.internal, executor 2): java.lang.NoSuchMethodError: org.apache.spark.sql.package$.SPARK_VERSION_METADATA_KEY()Ljava/lang/String;
	at org.apache.spark.sql.avro.AvroOutputWriter.<init>(AvroOutputWriter.scala:52)
	at org.apache.spark.sql.avro.AvroOutputWriterFactory.newInstance(AvroOutputWriterFactory.scala:43)
	at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.newOutputWriter(FileFormatDataWriter.scala:120)
	at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.<init>(FileFormatDataWriter.scala:108)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:236)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:170)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:169)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

The exception indicates that while the Spark job was running, it called a method that does not exist on the cluster, org.apache.spark.sql.package$.SPARK_VERSION_METADATA_KEY(), which caused the job to fail.
The stack trace shows the error originates in the constructor of AvroOutputWriter, the class responsible for writing Spark DataFrame data in Avro format. That constructor reads SPARK_VERSION_METADATA_KEY during initialization, but the method does not exist in the Spark version running on the cluster, hence the NoSuchMethodError.
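The root cause is a classpath version mismatch: the job was compiled against one Spark patch release but runs on another. A minimal sketch of the comparison that matters here (this helper is illustrative, not from the original post, and the version strings are hard-coded to mirror this incident):

```java
public class SparkVersionCheck {

    /** Returns true if two "major.minor.patch" version strings share major.minor. */
    public static boolean sameMajorMinor(String a, String b) {
        String[] pa = a.split("\\.");
        String[] pb = b.split("\\.");
        return pa[0].equals(pb[0]) && pa[1].equals(pb[1]);
    }

    public static void main(String[] args) {
        String clusterVersion  = "2.4.5"; // e.g. reported by spark-submit --version
        String compiledVersion = "2.4.6"; // from <spark.version> in pom.xml

        // Same major.minor line, so the jars load without complaint...
        System.out.println(sameMajorMinor(clusterVersion, compiledVersion)); // true

        // ...but the NoSuchMethodError shows that even patch releases can add
        // methods, so spark-avro must match the cluster's exact patch version.
        System.out.println(clusterVersion.equals(compiledVersion)); // false
    }
}
```

This is why the error only surfaces at runtime on the executors: linking is deferred until AvroOutputWriter is first instantiated inside a task.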

Checking the dependency versions revealed that the project was built against Spark 2.4.6, while the cluster runs 2.4.5:

<spark.version>2.4.6</spark.version>

Since other jobs still depend on this legacy code, the project's Spark version could not be changed safely. Instead, the spark-avro dependency was downgraded to match the cluster:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-avro_2.11</artifactId>
    <version>2.4.5</version>
</dependency>
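To keep spark-avro from silently drifting out of sync again, the cluster's runtime version could be recorded as its own Maven property alongside the untouchable `<spark.version>`. A sketch (the `spark.cluster.version` property name is illustrative, not part of the original project):

```xml
<properties>
    <!-- Version the legacy code compiles against; other jobs rely on it -->
    <spark.version>2.4.6</spark.version>
    <!-- Version actually deployed on the cluster; spark-avro must match it -->
    <spark.cluster.version>2.4.5</spark.cluster.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-avro_2.11</artifactId>
        <version>${spark.cluster.version}</version>
    </dependency>
</dependencies>
```

This makes the constraint explicit in the POM instead of leaving it as a magic number in one dependency.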

After repackaging, the job ran successfully and the problem was resolved.
