How do I specify which Python version spark-submit uses?


I have two versions of Python installed. When I launch a Spark application with spark-submit, it uses the default Python version, but I want it to use the other one.

How can I specify which version of Python spark-submit should use?

Solution

You can set the PYSPARK_PYTHON variable in conf/spark-env.sh (in Spark's installation directory) to the absolute path of the desired Python executable.

The Spark distribution ships with spark-env.sh.template (spark-env.cmd.template on Windows) by default. It must first be renamed (or copied) to spark-env.sh (spark-env.cmd).

For example, if the desired Python executable is installed at /opt/anaconda3/bin/python3:

PYSPARK_PYTHON='/opt/anaconda3/bin/python3'

Check out the configuration documentation for more information.
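The steps above can be sketched as a short shell session. The Spark install location and the interpreter path are placeholders (the `mktemp` fallback only exists so the sketch runs anywhere); substitute your actual `SPARK_HOME` and Python binary:

```shell
# Placeholder SPARK_HOME with a temp-dir fallback so this sketch is runnable;
# point it at your real Spark installation directory instead.
SPARK_HOME="${SPARK_HOME:-$(mktemp -d)/spark}"
mkdir -p "$SPARK_HOME/conf"
touch "$SPARK_HOME/conf/spark-env.sh.template"   # ships with the Spark distribution

# 1. Copy the template so Spark will source it on startup.
cp "$SPARK_HOME/conf/spark-env.sh.template" "$SPARK_HOME/conf/spark-env.sh"

# 2. Point PYSPARK_PYTHON at the absolute path of the desired interpreter.
echo "export PYSPARK_PYTHON='/opt/anaconda3/bin/python3'" >> "$SPARK_HOME/conf/spark-env.sh"

# Confirm the setting landed in the config file.
grep PYSPARK_PYTHON "$SPARK_HOME/conf/spark-env.sh"
```

Spark's launch scripts source conf/spark-env.sh each time they start, so the setting takes effect on the next spark-submit invocation with no further steps.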
