一、spark-shell.cmd fails to start with NoClassDefFoundError
Add the following environment-variable settings to load-spark-env.cmd. Set HADOOP_HOME in the system environment variables beforehand, or hard-code the path directly in the script.
@echo off
rem ################### SET ENV ##################
rem set SPARK_MASTER_IP=localhost
rem set SPARK_WORKER_CORES=1
set SPARK_WORKER_MEMORY=1g
echo HADOOP_HOME: %HADOOP_HOME%
set "HADOOP_CONF_DIR=%HADOOP_HOME%\hadoop\etc\hadoop"
echo HADOOP_CONF_DIR: %HADOOP_CONF_DIR%
for /f "delims=" %%i in ('hadoop classpath') do set SPARK_DIST_CLASSPATH=%%i
echo SPARK_DIST_CLASSPATH: %SPARK_DIST_CLASSPATH%
if not exist "%SPARK_HOME%\temp" mkdir "%SPARK_HOME%\temp"
set "TEMP=%SPARK_HOME%\temp"
echo %temp%
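Assuming HADOOP_HOME and SPARK_HOME are already set correctly, a quick sanity check from a cmd prompt before launching the shell might look like the sketch below (the commands are illustrative, not part of load-spark-env.cmd):

```bat
rem Sanity check (assumed: HADOOP_HOME and SPARK_HOME are set)
echo %HADOOP_HOME%
echo %HADOOP_CONF_DIR%
rem 'hadoop classpath' should print a long semicolon-separated jar list;
rem if this command fails, fix HADOOP_HOME before touching Spark
hadoop classpath
rem With SPARK_DIST_CLASSPATH populated, the NoClassDefFoundError should be gone
%SPARK_HOME%\bin\spark-shell.cmd
```

If `hadoop classpath` prints nothing or errors out, the `for /f` line in load-spark-env.cmd will leave SPARK_DIST_CLASSPATH empty and the original error will persist.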
二、 Could not initialize class scala.tools.fusesource_embedded.jansi.internal.Kernel32
Copy Scala's jline-2.11.jar into Spark's jars directory.
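Assuming SCALA_HOME points at the local Scala installation and the jar lives under its lib folder (a common layout, but verify on your machine), the copy can be done from a cmd prompt:

```bat
rem Copy Scala's bundled jline jar into Spark's jars directory
rem (assumed location %SCALA_HOME%\lib; adjust to your install)
copy "%SCALA_HOME%\lib\jline-2.11.jar" "%SPARK_HOME%\jars\"
```

Restart spark-shell.cmd after copying so the new jar is picked up on the classpath.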