Windows 10 + PySpark: configuring a conda virtual environment (under envs)
1. Java setup
Version: jdk-8u291-windows-x64.exe
Install: double-click the .exe to install Java.
Add the following system environment variables:
CLASS_PATH: C:\Program Files\Java\jdk1.8.0_291\lib
JAVA_HOME: C:\Program Files\Java\jdk1.8.0_291
# The intermediate variables below exist because PATH was too long to append the full paths directly.
JAVA_PATH: C:\Program Files\Java
mypath1: %JAVA_PATH%\jdk1.8.0_291\bin;%SPARK_HOME%\bin
Then append mypath1 to the PATH environment variable as:
%mypath1%
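After setting the Java variables, a quick way to confirm they are actually visible from Python (a minimal sketch; the `check_env` helper is my own, and a new terminal may be needed before fresh variables show up):

```python
import os

def check_env(name):
    """Report whether an environment variable is set and points at an existing directory."""
    path = os.environ.get(name)
    if path is None:
        return f"{name} is not set"
    if not os.path.isdir(path):
        return f"{name} is set but the directory does not exist: {path}"
    return f"{name} OK: {path}"

print(check_env("JAVA_HOME"))
```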
2. Spark setup
Version: spark-3.1.1-bin-hadoop3.2.tgz
Install: just unpack the archive.
Path configuration:
Set the variable
SPARK_HOME: E:\chl_ubuntu_works\works\spark-3.1.1-bin-hadoop3.2
Append Spark's bin directory to mypath1, making sure entries are separated by an ASCII ";":
mypath1: %JAVA_PATH%\jdk1.8.0_291\bin;%SPARK_HOME%\bin
If PATH becomes too long, see:
https://blog.csdn.net/github_34777264/article/details/85342877
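Since "install" here is just unpacking an archive, it is easy to point SPARK_HOME at the wrong level of the extracted folder. A small sanity check (my own sketch; `spark_layout_ok` is a hypothetical helper, and the path is the one used in this guide):

```python
import os

# Path from this guide; adjust to wherever you unpacked the archive.
spark_home = r"E:\chl_ubuntu_works\works\spark-3.1.1-bin-hadoop3.2"

def spark_layout_ok(home):
    """Check that a Spark home directory contains the expected bin and python subfolders."""
    return all(os.path.isdir(os.path.join(home, sub)) for sub in ("bin", "python"))

print(spark_layout_ok(spark_home))
```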
3. Using Spark from PyCharm
Install pyspark from the Aliyun mirror (the Tsinghua mirror is too slow):
pip install pyspark -i https://mirrors.aliyun.com/pypi/simple
Setup references:
https://www.wangtongzhe.cn/post/173.html
https://www.pianshen.com/article/34351670234/
"JAVA_HOME not found" error:

Java not found and JAVA_HOME environment variable is not set.
Install Java and set JAVA_HOME to point to the Java installation directory.
Traceback (most recent call last):
  File "H:/chl_workfiles/works/tobacco_project/spark_learn/test2.py", line 3, in <module>
    sc = SparkContext('local')
  File "E:\chl_ubuntu_works\works\spark-3.1.1-bin-hadoop3.2\python\pyspark\context.py", line 144, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "E:\chl_ubuntu_works\works\spark-3.1.1-bin-hadoop3.2\python\pyspark\context.py", line 331, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "E:\chl_ubuntu_works\works\spark-3.1.1-bin-hadoop3.2\python\pyspark\java_gateway.py", line 108, in launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number

Process finished with exit code 1

The error is raised because JAVA_HOME was not found among the system environment variables.
Fix: manually add the following two environment variables:
JAVA_HOME=C:\Program Files\Java\jdk1.8.0_291
PYSPARK_PYTHON=H:\chl_workfiles\anzhuang\anconda3\envs\pytorch_1.4\python.exe
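If editing the system variables is inconvenient, the same two variables can also be set from inside the script itself, before pyspark launches the Java gateway (a sketch using the paths from this guide; adjust them to your machine):

```python
import os

# Must run before SparkContext is created, because launching the
# Java gateway reads these environment variables.
os.environ["JAVA_HOME"] = r"C:\Program Files\Java\jdk1.8.0_291"
os.environ["PYSPARK_PYTHON"] = r"H:\chl_workfiles\anzhuang\anconda3\envs\pytorch_1.4\python.exe"
```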
4. Download winutils_hadoop-3.2.1 and copy it into the bin folder of the Hadoop directory.
Download link:
https://github.com/cdarlint/winutils
As administrator, open cmd, change into the hadoop-3.2.2\bin folder, and run:
winutils.exe chmod 777 c:\tmp\Hive
(The c:\tmp\Hive folder did not exist beforehand; I created it myself.)
If no winutils.exe error appears after rebooting, the following step is unnecessary:
also copy hadoop.dll into the C:\Windows\System32 folder.
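To confirm the file ended up in the right place, a small check can be run (my own sketch; `winutils_in_place` is a hypothetical helper, and the Hadoop path below is an assumption based on the folder named in step 4):

```python
import os

def winutils_in_place(hadoop_bin):
    """Check that winutils.exe exists in the given Hadoop bin folder."""
    return os.path.isfile(os.path.join(hadoop_bin, "winutils.exe"))

# Assumed location; replace with your actual hadoop-3.2.2\bin path.
print(winutils_in_place(r"E:\hadoop-3.2.2\bin"))
```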
General installation references:
https://blog.csdn.net/weixin_38556445/article/details/78182264
https://www.cnblogs.com/yfb918/p/10978856.html