Changing the Python version used by PySpark

Ubuntu ships with Python 2.7 by default; here we change PySpark's default interpreter to Anaconda Python 3.6.


You can specify the version of Python for the driver by setting the appropriate environment variables in the ./conf/spark-env.sh file. If it doesn't already exist, you can start from the provided spark-env.sh.template file, which also documents lots of other variables.

Here is a simple example of a spark-env.sh file to set the relevant Python environment variables:

#!/usr/bin/env bash

# This file is sourced when running various Spark programs.
export PYSPARK_PYTHON=/usr/bin/python3
export PYSPARK_DRIVER_PYTHON=/usr/bin/ipython

In this case it sets the version of Python used by the workers/executors to Python 3 and the driver's Python to IPython for a nicer shell to work in.

 

In other words: rename ./conf/spark-env.sh.template under the Spark folder to spark-env.sh, then add the following lines:

# This file is sourced when running various Spark programs.
export PYSPARK_PYTHON=/usr/bin/python3
export PYSPARK_DRIVER_PYTHON=/usr/bin/ipython
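Since the goal here is Anaconda Python 3.6 rather than the system Python 3, point both variables at the Anaconda interpreter instead. A minimal sketch, assuming Anaconda is installed under ~/anaconda3 (an assumed path; adjust it to your installation, and only set the driver to ipython if it is installed in that environment):

# Point PySpark at the Anaconda interpreter (install path is an assumption)
export PYSPARK_PYTHON=$HOME/anaconda3/bin/python
export PYSPARK_DRIVER_PYTHON=$HOME/anaconda3/bin/ipython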

 

Finally, restart Spark and the change takes effect.
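To verify the switch, launch the pyspark shell and check which interpreter is actually running the driver (the startup banner prints the Python version as well):

$ ./bin/pyspark
>>> import sys
>>> print(sys.executable)   # should point into the Anaconda install
>>> print(sys.version)      # should now report 3.6.x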

 
