py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils... does not exist in the JVM
Environment: Windows 7 + Anaconda 4.3.21 (Python 3.6.1) + Spark 2.3.2 + Java 1.8.

Code that triggers the error:

```python
from pyspark import SparkContext
from pyspark import SparkConf

conf = SparkConf().setAppName("miniProject").setMaster("local[*]")
s...
```
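This Py4JError usually means the pip-installed `pyspark` package does not match the version of the local Spark installation that the JVM side is running, so a Python-side call refers to a JVM method that does not exist in that Spark build. A minimal sketch of a pre-flight check is below; the helper name and the assumption that `SPARK_HOME` points at a directory named like `spark-2.3.2-bin-hadoop2.7` are illustrative, not from the original post:

```python
import os

def spark_pyspark_mismatch(spark_home: str, pyspark_version: str) -> bool:
    # Hypothetical helper: infer the Spark version from a conventional
    # SPARK_HOME directory name such as "spark-2.3.2-bin-hadoop2.7" and
    # compare it against the pip-installed pyspark version string.
    name = os.path.basename(spark_home.rstrip("/\\"))
    parts = name.split("-")
    spark_version = parts[1] if len(parts) > 1 else ""
    return spark_version != pyspark_version

# Versions differ -> likely source of the Py4JError.
print(spark_pyspark_mismatch(r"C:\spark-2.3.2-bin-hadoop2.7", "2.4.0"))  # True
# Versions match -> this particular cause is ruled out.
print(spark_pyspark_mismatch(r"C:\spark-2.3.2-bin-hadoop2.7", "2.3.2"))  # False
```

With the environment above, the usual remedies are to install the matching package (`pip install pyspark==2.3.2`) or to let the `findspark` package initialize `pyspark` from `SPARK_HOME` before importing it; which one applies depends on how Spark was installed.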
Published 2019-01-04 12:52:02