Error caused by a netty jar mismatch between Spark and Hadoop

With Spark on YARN set up on Spark 2.3.2 and Hadoop 2.7.3, running the following demo fails:

spark-submit --deploy-mode client \
               --class org.apache.spark.examples.SparkPi \
               $SPARK_HOME/examples/jars/spark-examples_2.11-2.3.2.jar  10
            

The job aborts with the following error:

2021-03-31 21:57:02 INFO  DAGScheduler:54 - ResultStage 0 (reduce at SparkPi.scala:38) failed in 0.673 s due to Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 7, slave018, executor 1): java.lang.NoSuchMethodError: org.apache.spark.network.shuffle.protocol.BlockTransferMessage.encode(Lorg/spark_project/io/netty/buffer/ByteBuf;)V
        at org.apache.spark.network.shuffle.protocol.BlockTransferMessage.toByteBuffer(BlockTransferMessage.java:80)
        at org.apache.spark.network.shuffle.OneForOneBlockFetcher.start(OneForOneBlockFetcher.java:113)
        at org.apache.spark.network.netty.NettyBlockTransferService$$anon$2.createAndStart(NettyBlockTransferService.scala:115)
        at org.apache.spark.network.shuffle.RetryingBlockFetcher.fetchAllOutstanding(RetryingBlockFetcher.java:141)
        at org.apache.spark.network.shuffle.RetryingBlockFetcher.start(RetryingBlockFetcher.java:121)
        at org.apache.spark.network.netty.NettyBlockTransferService.fetchBlocks(NettyBlockTransferService.scala:123)
        at org.apache.spark.network.BlockTransferService.fetchBlockSync(BlockTransferService.scala:98)
        at org.apache.spark.storage.BlockManager.getRemoteBytes(BlockManager.scala:693)
        at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply$mcVI$sp(TorrentBroadcast.scala:162)
        at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply(TorrentBroadcast.scala:151)
        at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply(TorrentBroadcast.scala:151)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$readBlocks(TorrentBroadcast.scala:151)
        at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1$$anonfun$apply$2.apply(TorrentBroadcast.scala:231)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:211)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1347)
        at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:207)
        at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:66)
        at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:66)
        at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:96)
        at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:81)
        at org.apache.spark.scheduler.Task.run(Task.scala:109)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

Root cause: the netty jars bundled with Hadoop 2.7.3 are too old; they need to be replaced with the jars shipped with Spark 2.3.2.
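To confirm the mismatch, you can list the netty jars each distribution bundles and compare the versions (a quick check that assumes standard installation layouts; exact file names will vary with your builds):

# Netty jars that Hadoop puts on its classpath
find $HADOOP_HOME -name 'netty*.jar'
# Netty jars that Spark ships
ls $SPARK_HOME/jars/netty*.jar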

The jars shipped with Spark 2.3.2 are under $SPARK_HOME/jars, namely netty-3.9.9.Final.jar and netty-all-4.1.17.Final.jar. Use these two to replace the corresponding Hadoop jars. Note that the jars have to be replaced on every Hadoop node (see the sketch after the commands below).

# Use find to locate the relevant jars under the Hadoop installation
find $HADOOP_HOME -name 'netty*.jar' >> path.txt
# path.txt now holds the paths of all of Hadoop's netty jars

# Delete the netty jars that ship with Hadoop
find $HADOOP_HOME -name 'netty*.jar' | xargs rm

# Replace them with the corresponding Spark netty jars

# The destination of each cp is one of the paths recorded in path.txt,
# with the jar file name changed accordingly
cp netty-3.9.9.Final.jar /path/to/hadoop/netty-3.9.9.Final.jar
cp netty-all-4.1.17.Final.jar /path/to/hadoop/netty-all-4.1.17.Final.jar
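The commands above only fix the local node. Below is a rough sketch for pushing the replacement jars to the remaining Hadoop nodes; the hostnames in HADOOP_NODES are placeholders, and it assumes passwordless ssh plus the same Hadoop directory layout on every node:

# Placeholder hostnames; replace with your own worker nodes
HADOOP_NODES="slave018 slave019"
# For every directory that used to hold a Hadoop netty jar (recorded in path.txt),
# remove the old jars on each remote node and copy Spark's jars in their place
for dir in $(xargs -n1 dirname < path.txt | sort -u); do
    for node in $HADOOP_NODES; do
        ssh "$node" "rm -f $dir/netty*.jar"
        scp "$SPARK_HOME/jars/netty-3.9.9.Final.jar" \
            "$SPARK_HOME/jars/netty-all-4.1.17.Final.jar" "$node:$dir/"
    done
done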