Running ./run-example SparkPi 10 failed with the following error:
16/01/11 19:19:53 ERROR SparkContext: Error initializing SparkContext.
java.net.ConnectException: Call From sparkmaster/192.168.10.80 to sparkmaster:8021 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)
at org.apache.hadoop.ipc.Client.call(Client.java:1472)
at org.apache.hadoop.ipc.Client.call(Client.java:1399)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
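Before touching any configuration, it can help to confirm that nothing is actually listening on the port the call failed against. A minimal check, assuming a Linux box with ss available (the port number comes from the error above):

```shell
# "Connection refused" means no process accepted the connection on that port.
# List listening TCP sockets and look for 8021; if absent, nothing is bound there.
ss -ltn | grep ':8021 ' || echo "nothing listening on port 8021"
```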
It turned out I had previously modified conf/spark-defaults.conf. After opening spark-defaults.conf in vim and commenting out the two eventLog lines, ./run-example SparkPi 10 ran without problems:
# Example:
spark.master spark://sparkmaster:7077
# spark.eventLog.enabled true
# spark.eventLog.dir hdfs://sparkmaster:8021/sparkevent/
spark.serializer org.apache.spark.serializer.KryoSerializer
spark.driver.memory 1g
spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
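The two eventLog lines were the culprit, since they pointed the event log at hdfs://sparkmaster:8021. If you prefer not to edit the file by hand, a sed one-liner can comment them out (the conf path below matches this setup; adjust it for yours):

```shell
# Prefix every spark.eventLog.* line with "# " to disable event logging;
# sed keeps a .bak copy of the original file alongside.
CONF=conf/spark-defaults.conf
sed -i.bak 's/^spark\.eventLog\./# &/' "$CONF"
```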
Addendum
If you do want to enable the event log, set spark.eventLog.dir to a local path in the following format:
spark.eventLog.dir file:///opt/spark-1.6.0-bin-hadoop2.6/sparkevent (make sure this directory already exists)
The above stores the log locally. For HDFS, use the format from the default settings, but make sure the Hadoop process is actually listening on port 8021.
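To illustrate the local-path variant, a short sketch that prepares the directory before enabling the event log (using /tmp here purely for illustration; the post itself uses /opt/spark-1.6.0-bin-hadoop2.6/sparkevent):

```shell
# Spark refuses to start if spark.eventLog.dir points at a missing
# directory, so create it up front before enabling event logging.
EVENT_DIR=/tmp/sparkevent
mkdir -p "$EVENT_DIR"
test -d "$EVENT_DIR" && echo "event log dir ready: file://$EVENT_DIR"
```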