PySpark on YARN: solving Python dependency problems
Prerequisites
Install and compile Python 3.6.8 [Python 3 must be compiled and installed on every YARN NodeManager node]
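The article does not show the build itself; a minimal sketch of compiling CPython 3.6.8 from source on one node (the download URL and `/usr/local` prefix are assumptions — repeat on every NodeManager so the interpreter lands at the same path everywhere):

```shell
# Build and install CPython 3.6.8 under /usr/local (illustrative).
wget https://www.python.org/ftp/python/3.6.8/Python-3.6.8.tgz
tar -xzf Python-3.6.8.tgz
cd Python-3.6.8
./configure --prefix=/usr/local
make -j"$(nproc)"
# "make install" creates /usr/local/bin/python3, matching the
# PYSPARK_PYTHON path configured later in spark-env.sh.
sudo make install
```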
Download Spark 2.4.5
Create and package a Python virtual environment
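The steps below assume the virtual environment already exists under /usr/local/thirdparty/ai; a minimal sketch of creating one first (the path here is an unprivileged stand-in for the article's /usr/local/thirdparty/ai, and the dependency list is illustrative). Note that a venv references the node-local interpreter path, which is why Python 3 must exist at the same path on every NodeManager:

```shell
# Create the virtual environment that will later be zipped and shipped.
# The article uses /usr/local/thirdparty/ai; /tmp/ai is just a writable
# stand-in for illustration.
VENV_DIR="${VENV_DIR:-/tmp/ai}"
python3 -m venv "$VENV_DIR"
# Install the job's dependencies into the environment, for example:
# "$VENV_DIR/bin/pip" install numpy pandas
test -x "$VENV_DIR/bin/python"   # sanity check: interpreter was created
```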
-
Compress the virtual environment
cd /usr/local/thirdparty/
zip -q -r ai.zip ai/
-
Put the virtual environment onto HDFS
hdfs dfs -put ai.zip /ai
Configure PySpark to use Python 3
-
Edit the spark-env.sh file and add the following setting
vim spark-env.sh
export PYSPARK_PYTHON=/usr/local/bin/python3
-
Edit the pyspark script in the bin directory
vim pyspark
if [[ -z "$PYSPARK_PYTHON" ]]; then
  if [[ $PYSPARK_DRIVER_PYTHON == *ipython* && ! $WORKS_WITH_IPYTHON ]]; then
    echo "IPython requires Python 2.7+; please install python2.7 or set PYSPARK_PYTHON" 1>&2
    exit 1
  else
    PYSPARK_PYTHON=python3  # change this line
  fi
fi
export PYSPARK_PYTHON
Submit the Spark job
-
Submit the job with spark-submit
spark-submit --master yarn \
  --deploy-mode cluster \
  --archives hdfs:///ai/ai_test.zip#py3 \
  --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON='py3/ai_test/bin/python' \
  --conf spark.yarn.appMasterEnv.PYSPARK_DRIVER_PYTHON='py3/ai/bin/python' \
  xxx.py
# In --archives hdfs:///ai/ai_test.zip#py3, "py3" is an alias; any name works.
# It lets spark.yarn.appMasterEnv.PYSPARK_PYTHON locate the Python environment.
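The spark.yarn.appMasterEnv settings above configure the driver side (the YARN Application Master in cluster mode). If executors still fall back to the node's system interpreter, a commonly used variant also sets the executor environment explicitly via spark.executorEnv, which is standard Spark configuration; a sketch reusing the same archive alias:

```shell
# Variant: pin PYSPARK_PYTHON for executors as well (paths illustrative,
# reusing the "py3" alias from the --archives option).
spark-submit --master yarn \
  --deploy-mode cluster \
  --archives hdfs:///ai/ai_test.zip#py3 \
  --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON='py3/ai_test/bin/python' \
  --conf spark.executorEnv.PYSPARK_PYTHON='py3/ai_test/bin/python' \
  xxx.py
```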