Hive on Spark: integrating Hive 3.1.2 with Spark 3.0.0
You need to edit spark-env.sh and add
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
Otherwise the YARN application fails with an error like:
2 14:51:56,117 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: User class threw exception: java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/JobCo
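The fix above can be sketched as a small shell snippet. The SPARK_HOME location is an assumption; adjust it to match your installation:

```shell
# Append the Hadoop classpath to Spark's distribution classpath so that
# Spark jobs running under YARN can locate Hadoop's classes.
# /opt/spark is an assumed install path, not from the original post.
SPARK_HOME=${SPARK_HOME:-/opt/spark}

# Add the export to spark-env.sh only if it is not already there
grep -q 'SPARK_DIST_CLASSPATH' "$SPARK_HOME/conf/spark-env.sh" 2>/dev/null || \
  echo 'export SPARK_DIST_CLASSPATH=$(hadoop classpath)' >> "$SPARK_HOME/conf/spark-env.sh"
```

With this set, Spark prepends the output of `hadoop classpath` to its own classpath at launch, so a "Hadoop-free" Spark build can run against the cluster's Hadoop jars.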
Published 2021-07-22 15:22:29