Environment:
Win7 + Anaconda 4.3.21 (Python 3.6.1) + Spark 2.3.2 + Java 1.8
Program executed:

from pyspark import SparkContext
from pyspark import SparkConf

conf = SparkConf().setAppName("miniProject").setMaster("local[*]")
sc = SparkContext.getOrCreate(conf)
rdd = sc.parallelize([1, 2, 3, 4, 5])
rdd1 = rdd.map(lambda r: r + 10)
print(rdd1.collect())
The following error message appears:
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
File "C:/Users/Administrator/PycharmProjects/untitled/Learn/fibonaqie.py", line 6, in <module>
sc=SparkContext.getOrCreate(conf)
File "C:\ProgramData\Anaconda3\lib\site-packages\pyspark\context.py", line 349, in getOrCreate
SparkContext(conf=conf or SparkConf())
File "C:\ProgramData\Anaconda3\lib\site-packages\pyspark\context.py", line 118, in _