I have been following this tutorial to install Spark for Scala:
https://www.tutorialspoint.com/apache_spark/apache_spark_installation.htm
However, when I try to run spark-shell, I get this error in the console:
/usr/local/spark/bin/spark-shell: line 57: /usr/local/spark/bin/bin/spark-submit: No such file or directory
My bashrc looks like this:
export PATH = $PATH:/usr/local/spark/bin
export SCALA_HOME=/usr/local/scala/bin
export PYTHONPATH=$SPARK_HOME/python
What did I do wrong? I previously installed Spark for Python, but now I'm trying to use Scala. Could Spark be mixing up the variables? Thanks.
You have one bin too many in the path being searched:
/usr/local/spark/bin/bin/spark-submit
should be
/usr/local/spark/bin/spark-submit
In your case, SPARK_HOME should be /usr/local/spark/, not /usr/local/spark/bin/, which is what it appears to be set to now.
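A minimal sketch of corrected ~/.bashrc entries, assuming Spark lives at /usr/local/spark and Scala at /usr/local/scala (adjust the paths if your install locations differ). Note that your original export PATH line also has spaces around =, which is a syntax error in bash, and SCALA_HOME conventionally points at the install root rather than its bin directory:

```shell
# Corrected ~/.bashrc entries (paths are assumptions; adjust to your install)
export SPARK_HOME=/usr/local/spark          # install root, NOT .../spark/bin
export SCALA_HOME=/usr/local/scala          # install root, NOT .../scala/bin
export PATH=$PATH:$SPARK_HOME/bin:$SCALA_HOME/bin   # no spaces around =
export PYTHONPATH=$SPARK_HOME/python
```

After editing, run `source ~/.bashrc` (or open a new terminal) so the changes take effect, then try spark-shell again.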