【spark】Cluster deploy mode is not applicable to Spark shells

Preface

Running the spark-shell command on CDH 5.13 with Apache Spark 2.4.2 integrated fails with the error in the title.

Spark configuration

The spark-defaults.conf file is as follows:

spark.master                     yarn
spark.deploy.mode                cluster
spark.submit.deployMode          cluster
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://nameservice1/tmp/spark/log/
spark.serializer                 org.apache.spark.serializer.KryoSerializer
spark.driver.memory              1g
spark.driver.maxResultSize       2g
spark.executor.memory            2g
yarn.scheduler.maximum-allocation-mb  2048m
spark.executor.instances         600
spark.executor.extraJavaOptions  -XX:+PrintGCDetails
spark.shuffle.service.enabled    false
spark.history.fs.logDirectory    hdfs://nameservice1/tmp/spark/log/
spark.yarn.historyServer.address master:18080
#spark.executor.memoryOverhead    2900
#spark.driver.memoryOverhead      2900
#spark.yarn.executor.memoryOverhead 2900
#spark.yarn.driver.memoryOverhead 2900
spark.network.timeout 2000s
spark.executor.heartbeatInterval 800s
spark.files.fetchTimeout 1000s
spark.port.maxRetries 100
spark.sql.autoBroadcastJoinThreshold 536870912
spark.sql.shuffle.partitions 1500
spark.sql.broadcastTimeout      800000ms
spark.default.parallelism 1500
spark.executor.cores 1
spark.kryoserializer.buffer.max   256m
#spark.yarn.queue                  bi
spark.executor.extraJavaOptions  -Dfile.encoding=UTF-8
spark.driver.extraJavaOptions    -Dfile.encoding=UTF-8
spark.port.maxRetries           100
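The two deploy-mode entries above are what trigger the error. Note also that spark.executor.extraJavaOptions and spark.port.maxRetries each appear twice in this file; Spark loads the file as a properties list, so the later occurrence silently wins. A quick sanity check (a sketch, using a minimal copy of the file content above; the /tmp path is only for illustration) is to grep for deploy-mode keys and list any duplicated keys:

```shell
# Recreate a minimal excerpt of the config above to scan (illustrative path).
cat > /tmp/spark-defaults.conf <<'EOF'
spark.master                     yarn
spark.deploy.mode                cluster
spark.submit.deployMode          cluster
spark.executor.extraJavaOptions  -XX:+PrintGCDetails
spark.executor.extraJavaOptions  -Dfile.encoding=UTF-8
EOF

# Any 'cluster' hit here will break spark-shell (and pyspark).
grep -E '^spark\.(submit\.deployMode|deploy\.mode)' /tmp/spark-defaults.conf

# List keys defined more than once (the last occurrence wins).
awk '{print $1}' /tmp/spark-defaults.conf | sort | uniq -d
```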

Error message

The following exception is thrown:

Exception in thread "main" org.apache.spark.SparkException: Cluster deploy mode is not applicable to Spark shells.
	at org.apache.spark.deploy.SparkSubmit.error(SparkSubmit.scala:857)
	at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:292)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:143)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

The cause: spark-shell is an interactive REPL whose driver must run in the local JVM on the submitting machine, so SparkSubmit rejects cluster deploy mode for shells at submit time; the spark.submit.deployMode cluster entry in spark-defaults.conf is what triggers this check.

Solution

Method 1: launch spark-shell with client deploy mode explicitly:

./bin/spark-shell --master yarn --deploy-mode client --num-executors 3 --executor-memory 2G --executor-cores 2

Problem solved. Command-line options such as --deploy-mode take precedence over spark-defaults.conf, so this works without editing the file.

Method 2:

In spark-defaults.conf, change:

spark.deploy.mode                cluster
spark.submit.deployMode          cluster

to client mode:

spark.deploy.mode                client
spark.submit.deployMode          client
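If you prefer to script the change, a minimal sketch with sed (the /tmp path and in-place edit are assumptions for illustration; keep a backup when editing the real file on a cluster):

```shell
# Work on a throwaway copy for illustration.
cat > /tmp/spark-defaults.conf <<'EOF'
spark.deploy.mode                cluster
spark.submit.deployMode          cluster
EOF

cp /tmp/spark-defaults.conf /tmp/spark-defaults.conf.bak   # back up first
# Flip both deploy-mode keys from cluster to client, in place.
sed -i 's/^\(spark\.\(submit\.deployMode\|deploy\.mode\)[[:space:]]*\)cluster/\1client/' /tmp/spark-defaults.conf
cat /tmp/spark-defaults.conf
```

The change takes effect the next time spark-shell is launched; no service restart is needed for spark-defaults.conf.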