Running PySpark, you may hit this error:

ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[])
It is easy to fix. Change the original

    from pyspark import SparkContext
    from pyspark import SparkConf

to

    from pyspark import SparkContext
    from pyspark import SparkConf

    # Stop the SparkContext the shell already created (bound to `sc`)
    # so that a new one can be constructed.
    try:
        sc.stop()
    except NameError:
        pass  # no existing context, nothing to stop

and the error goes away.