While setting up Spark recently, I edited .bashrc, and afterwards every command I ran failed with a "bash: …: is a directory" error. The contents of the .bashrc file were as follows:
# .bashrc
# Source global definitions
if [ -f /etc/bashrc ]; then
. /etc/bashrc
fi
# User specific aliases and functions
JAVA_HOME= /usr/lib/jvm/jre-1.7.0-openjdk.x86_64/
export JAVA_HOME
HADOOP_HOME=/opt/moudles/hadoop-2.7.1
export HADOOP_HOME
PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$JAVA_HOME/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hadoop/bin
MAVEN_HOME=/opt/moudles/apache-maven-3.3.9
export MAVEN_HOME
PATH=$MAVEN_HOME/bin:$PATH
export SPARK_HOME= /opt/moudles/spark-1.6.1
PATH=$PATH:$SPARK_HOME/bin:$PARK_HOME/sbin
Notice that in the assignments to JAVA_HOME and SPARK_HOME there is a space after the equals sign. Shell variable assignments must not have spaces around the "=": with the space, bash no longer treats the line as an assignment, but instead parses the path as a command to execute (with the variable set to the empty string just for that command), which is exactly what produces the "bash: …: is a directory" error. The fix is to delete the space after the "=". Note also that the last line references $PARK_HOME instead of $SPARK_HOME; that typo does not cause this error (an unset variable simply expands to nothing), but it means the Spark sbin directory never actually gets added to PATH.
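The behavior can be reproduced in any bash shell. The sketch below uses /tmp as a stand-in for the Spark directory so it is self-contained; the mechanism is the same as with /opt/moudles/spark-1.6.1:

```shell
# A space after '=' splits the line into two words: bash sets SPARK_HOME
# to the empty string only for the duration of the "command" that follows,
# and then tries to execute that path as a program.
SPARK_HOME= /tmp              # -> bash: /tmp: Is a directory
echo "status: $?"             # executing a directory fails with status 126

# Correct form: no spaces around '=', then export the variable.
SPARK_HOME=/tmp
export SPARK_HOME
echo "SPARK_HOME=$SPARK_HOME"   # prints "SPARK_HOME=/tmp"
```

Because the broken line runs before the rest of .bashrc finishes, every new shell prints the error, which is why it appeared on every command after sourcing the file.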