Environment: Ubuntu 16, Spark 2.3.3, JDK 1.8, Hadoop 2.7
Spark is deployed in standalone mode, following the official guide: http://spark.apache.org/docs/latest/spark-standalone.html
Configuration: create a spark-env.sh file under ${SPARK_HOME}/conf.
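A minimal spark-env.sh might look like the sketch below; the JDK path and the master hostname are placeholder assumptions for this setup, not values from the original post:

    # ${SPARK_HOME}/conf/spark-env.sh -- minimal sketch; adjust paths and hostname to your machine
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64   # JDK 1.8 install path (assumed location)
    export SPARK_MASTER_HOST=host                        # hostname or IP the master binds to
    export SPARK_MASTER_PORT=7077                        # default standalone master port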
From ${SPARK_HOME}/sbin, start the master and a worker: ./start-master.sh, then ./start-slave.sh spark://host:7077.
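Put together, and with a quick sanity check against the master's web UI (port 8080 is the standalone default; the hostname is again a placeholder):

    # run from ${SPARK_HOME}: start the master, then attach a worker to it
    ./sbin/start-master.sh
    ./sbin/start-slave.sh spark://host:7077
    # the master web UI at http://host:8080 should now list the worker as ALIVE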
Submitting a job with ${SPARK_HOME}/bin/spark-submit --class xxx --master xxx /path/to/jar runs fine.
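For example, a smoke test with the SparkPi example that ships with the prebuilt 2.3.3 distribution (the master hostname here is an assumption):

    ${SPARK_HOME}/bin/spark-submit \
      --class org.apache.spark.examples.SparkPi \
      --master spark://host:7077 \
      ${SPARK_HOME}/examples/jars/spark-examples_2.11-2.3.3.jar 100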
Next, create a Maven project:
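The application code from the original screenshot is not reproduced here; a minimal sketch of the kind of driver program involved might look like the following (the class name, app name, and master URL are assumptions, not the original code). Note that the spark-core version in the pom and the setMaster URL must both match the running cluster (2.3.3 and spark://host:7077 here), since a mismatch on either is a common cause of the error below.

    // Minimal sketch of a standalone-mode driver (hypothetical class name MainApp);
    // the master URL and app name are assumptions, not the original code.
    import org.apache.spark.{SparkConf, SparkContext}

    object MainApp {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("standalone-test")
          .setMaster("spark://host:7077") // must match the master started above
        val sc = new SparkContext(conf)
        // trivial job to verify the connection to the standalone master
        println(sc.parallelize(1 to 100).sum())
        sc.stop()
      }
    }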
Running the application code (originally shown in a screenshot, not reproduced here) fails with the following error:
ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
The errors in the log are as follows: