I had been using Spark 2.0.1, but I tried to upgrade to a newer version, i.e. 2.1.1, by downloading the tar file locally and changing the paths.
However, now when I try to run any program, it fails during SparkContext initialization, i.e. at sc = SparkContext().
The sample code I am trying to run fails immediately, on the sc = SparkContext() line.
The exception I get occurs right at the start:

Traceback (most recent call last):
File "/home/vna/scripts/global_score_pipeline/test_code_here.py", line 47, in
sc = SparkContext()
File "/opt/apps/spark-2.1.1-bin-hadoop2.7/python/pyspark/context.py", line 118, in __init__
conf, jsc, profiler_cls)
File "/opt/apps/spark-2.1.1-bin-hadoop2.7/python/pyspark/context.py", line 182, in _do_init
self._jsc = jsc or self._initialize_context(self._conf._jconf)
File "/opt/apps/spark-2.1.1-bin-hadoop2.7/python/pyspark/context.py", line 249, in _initialize_context
return self._jvm.JavaSparkContext(jconf)
File "/opt/apps/spark-2.1.1-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1401, in __call__
File "/opt/apps/spark-2.1.1-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NumberFormatException: For input string: "Ubuntu"
at java.base/java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
I am not passing "Ubuntu" anywhere in my variables or in my environment variables.
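To double-check that, here is a quick stdlib-only sketch (the helper name find_string_in_env is my own, not part of Spark) that scans the driver's environment for any variable whose value contains a given string. Run it in the same shell that launches the Spark job:

```python
import os

def find_string_in_env(needle, env=None):
    """Return the names of environment variables whose value contains `needle`."""
    env = os.environ if env is None else env
    return sorted(name for name, value in env.items() if needle in value)

# Check whether any environment variable could be feeding "Ubuntu" to Spark.
hits = find_string_in_env("Ubuntu")
print(hits if hits else "no environment variable contains 'Ubuntu'")
```

If this prints any variable names, inspect those first; Spark and py4j read several of them (e.g. SPARK_HOME, PYSPARK_PYTHON) at context startup.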
I have also tried changing this to sc = SparkContext(master='local'), but the problem is the same.
Please help me identify this issue.
EDIT: The contents of spark-defaults.conf are:

spark.master spark://master:7077
# spark.eventLog.enabled true
# spark.eventLog.dir hdfs://namenode:8021/directory
spark.serializer org.apache.spark.serializer.KryoSerializer
spark.driver.memory 8g
spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
spark.driver.extraClassPath /opt/apps/spark-2.1.1-bin-hadoop2.7/jars/mysql-connector-java-5.1.35-bin.jar
spark.executor.extraClassPath /opt/apps/spark-2.1.1-bin-hadoop2.7/jars/mysql-connector-java-5.1.35-bin.jar
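For reference, spark-defaults.conf is parsed as one property per non-comment line: the first whitespace-delimited token is the property name and the rest of the line is its value. A small sketch (parse_spark_defaults is a hypothetical helper of mine, not a Spark API) to dump the effective key/value pairs, which makes it easy to spot a property whose value is not what you expect:

```python
def parse_spark_defaults(text):
    """Parse spark-defaults.conf-style text into a {property: value} dict.

    Blank lines and lines starting with '#' are skipped; the property name
    is the first whitespace-delimited token, the value is the rest.
    """
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        parts = line.split(None, 1)
        props[parts[0]] = parts[1].strip() if len(parts) > 1 else ""
    return props

# Demo on a fragment of the conf shown above.
sample = """\
spark.master spark://master:7077
# spark.eventLog.enabled true
spark.driver.memory 8g
"""
for key, value in sorted(parse_spark_defaults(sample).items()):
    print(key, "=", value)
```

To check the real file, replace the sample string with the contents of /opt/apps/spark-2.1.1-bin-hadoop2.7/conf/spark-defaults.conf and verify each value, since a stray token (like a hostname) ending up in a numeric setting would produce exactly a NumberFormatException at context startup.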