py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils... does not exist in the JVM
Environment:
Win7 + Anaconda 4.3.21 (Python 3.6.1) + Spark 2.3.2 + Java 1.8
Code being run:
from pyspark import SparkContext
from pyspark import SparkConf
conf = SparkConf().setAppName("miniProject").setMaster("local[*]")
s...
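This Py4JError ("... does not exist in the JVM") most commonly means the pyspark Python package does not match the Spark installation that SPARK_HOME points to (for example, a newer pip-installed pyspark running against Spark 2.3.2). A minimal diagnostic sketch, assuming the standard environment variables are the ones in play; the helper name `diagnose_spark_env` is illustrative, not part of the original post:

```python
import os

def diagnose_spark_env():
    """Collect the environment facts relevant to this Py4JError.

    A mismatch between the pip-installed pyspark package and the
    Spark installation under SPARK_HOME is the usual cause, so the
    first step is to see what the interpreter actually resolves.
    """
    info = {
        "SPARK_HOME": os.environ.get("SPARK_HOME"),
        "PYSPARK_PYTHON": os.environ.get("PYSPARK_PYTHON"),
        "JAVA_HOME": os.environ.get("JAVA_HOME"),
    }
    return info

print(diagnose_spark_env())
```

If the pyspark package version differs from the Spark installation, installing the matching release (for this setup, `pip install pyspark==2.3.2`) or calling `findspark.init()` before importing pyspark typically resolves the mismatch.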
2019-01-04 12:52:02