I. Environment
1. JDK 7 or later
2. Python 2.7.11
3. IDE: PyCharm
4. Package: spark-1.6.0-bin-hadoop2.6.tar.gz
II. Setup
1. Extract spark-1.6.0-bin-hadoop2.6.tar.gz to D:\spark-1.6.0-bin-hadoop2.6.
2. Add D:\spark-1.6.0-bin-hadoop2.6\bin to the Path environment variable. After that, running pyspark from a cmd prompt should start the PySpark shell; if its banner appears, the installation succeeded.
3. Copy the pyspark folder from D:\spark-1.6.0-bin-hadoop2.6\python into C:\Python27\Lib\site-packages.
4. Install py4j: pip install py4j -i https://pypi.douban.com/simple
5. Configure the PyCharm environment variables:
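Instead of copying files in step 3, the same effect can be achieved in code by pointing Python at the Spark distribution before importing pyspark. A minimal sketch, assuming Spark is installed at D:\spark-1.6.0-bin-hadoop2.6 and that the bundled py4j zip is named py4j-0.9-src.zip (the name shipped with Spark 1.6; check your python\lib folder):

```python
import os
import sys

# Spark install location from step II.1; adjust to your own path.
SPARK_HOME = r"D:\spark-1.6.0-bin-hadoop2.6"

# Tell Spark where it lives, as PyCharm's run configuration would.
os.environ["SPARK_HOME"] = SPARK_HOME

# Make the bundled pyspark and py4j importable without copying them
# into site-packages.
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib", "py4j-0.9-src.zip"))
```

The same two values (SPARK_HOME and the PYTHONPATH entries) can equally be set in PyCharm under Run > Edit Configurations > Environment variables.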
III. Example
1. Create a new Python file: wordCount.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import sys
from pyspark import SparkContext
from operator import add
import re

def main():
    sc = SparkContext(appName="wordsCount")
    l
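The listing above is cut off, but a classic word count splits the text into words with re and sums the occurrences of each word. As a local sketch of that same logic without a Spark cluster (plain Python standing in for flatMap/reduceByKey; the text and function name here are illustrative, not from the original file):

```python
import re
from collections import Counter

def word_count(text):
    # Split on runs of non-word characters (what re would be used for in
    # wordCount.py), drop empty tokens, and normalize case.
    words = [w.lower() for w in re.split(r"\W+", text) if w]
    # Counter sums occurrences per word, playing the role of reduceByKey(add).
    return Counter(words)

counts = word_count("to be or not to be")
```

In the Spark version the same steps become sc.textFile(...).flatMap(split).map(lambda w: (w, 1)).reduceByKey(add).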