When running a program containing the following line:

conf = SparkConf().setAppName("miniProject").setMaster("local[1]")
the following error was raised:
Traceback (most recent call last):
  File "D:/py_data/py_spark/demo_pyspark.py", line 9, in <module>
    sc = SparkContext.getOrCreate(conf)
  File "D:\py_data\py_spark\venv\lib\site-packages\pyspark\context.py", line 384, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "D:\py_data\py_spark\venv\lib\site-packages\pyspark\context.py", line 147, in __init__
    conf, jsc, profiler_cls)
  File "D:\py_data\py_spark\venv\lib\site-packages\pyspark\context.py", line 224, in _do_init
    self._encryption_enabled = self._jvm.PythonUtils.isEncryptionEnabled(self._jsc)
  File "D:\py_data\py_spark\venv\lib\site-packages\py4j\java_gateway.py", line 1531, in __getattr__
    "{0}.{1} does not exist in the JVM".format(self._fqn, name))
py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.isEncryptionEnabled does not exist in the JVM
This error usually indicates a mismatch between the pyspark package Python is importing and the Spark/py4j version on the JVM side (for example, a pip-installed pyspark that differs from a locally installed Spark, or SPARK_HOME/PYTHONPATH not set). Solution: add the following at the very top of the script, before any pyspark import:
import findspark
findspark.init()
Solved!
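To see why those two lines help, here is a simplified, stdlib-only sketch of roughly what findspark.init() does: locate a Spark installation and prepend its Python bindings (and the bundled py4j zip) to sys.path, so that a later `import pyspark` picks up the version that matches the JVM. The function name `init_spark_paths` is hypothetical, for illustration only; the real findspark does more (e.g. probing common install locations and editing PYTHONPATH).

```python
import os
import sys
from glob import glob

def init_spark_paths(spark_home=None):
    """Simplified sketch of findspark.init() (not the real implementation):
    locate a Spark installation and put its Python bindings on sys.path,
    so `import pyspark` loads the version matching SPARK_HOME."""
    spark_home = spark_home or os.environ.get("SPARK_HOME")
    if not spark_home:
        raise ValueError("SPARK_HOME is not set and no path was given")
    os.environ["SPARK_HOME"] = spark_home
    # Spark ships its Python API under $SPARK_HOME/python,
    # with py4j bundled as a zip under python/lib.
    paths = [os.path.join(spark_home, "python")]
    paths += glob(os.path.join(spark_home, "python", "lib", "py4j-*.zip"))
    sys.path[:0] = paths  # prepend so these shadow any pip-installed copy
    return paths
```

After this runs, importing pyspark resolves to the code shipped with the Spark installation itself, which is why the PythonUtils.isEncryptionEnabled mismatch disappears.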