When starting Spark, the following error occurs:

To fix it, go into the spark/conf/ directory and add this line to spark-env.sh:

export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)

Note that the path must go all the way down to the bin/hadoop executable itself, not just the Hadoop installation directory.
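A slightly more defensive version of the same spark-env.sh addition is sketched below. It assumes Hadoop is installed at /usr/local/hadoop (adjust the path to your installation); the guard simply avoids exporting an empty classpath if the hadoop binary is missing or not executable.

```shell
# spark-env.sh — sketch, assuming Hadoop lives at /usr/local/hadoop
HADOOP_BIN=/usr/local/hadoop/bin/hadoop

if [ -x "$HADOOP_BIN" ]; then
  # `hadoop classpath` prints the full Hadoop classpath; Spark picks it up
  # via SPARK_DIST_CLASSPATH so it can find the Hadoop jars at runtime.
  export SPARK_DIST_CLASSPATH=$("$HADOOP_BIN" classpath)
else
  echo "WARNING: $HADOOP_BIN not found; SPARK_DIST_CLASSPATH not set" >&2
fi
```

After editing the file, restart Spark so the new environment variable takes effect.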