Spark Python: Can you pass arguments to your Python code when submitting a Spark job?

I'm trying to use spark-submit to execute my Python code on a Spark cluster.

Generally, we run spark-submit with Python code like below:

# Run a Python application on a cluster
./bin/spark-submit \
  --master spark://207.184.161.138:7077 \
  my_python_code.py \
  1000

But I want to run my_python_code.py while passing it several arguments. Is there a smart way to pass arguments?

Solution

Yes: put this in a file called args.py

import sys

# Print the full argument list; sys.argv[0] is the script path itself.
print(sys.argv)

If you run:

spark-submit args.py a b c d e

You will see:

['/spark/args.py', 'a', 'b', 'c', 'd', 'e']
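For more structured argument handling, the standard-library argparse module works the same way under spark-submit, because everything after the script path is forwarded to the script unchanged. Below is a minimal sketch; the flag names (--input, --iterations) and the application name are illustrative assumptions, not anything Spark requires:

import argparse
from pyspark.sql import SparkSession

# Flag names here are hypothetical examples; spark-submit forwards all
# arguments after the script path, so argparse sees them as usual.
parser = argparse.ArgumentParser()
parser.add_argument("--input", required=True, help="path to input data")
parser.add_argument("--iterations", type=int, default=10)
args = parser.parse_args()

spark = SparkSession.builder.appName("args-demo").getOrCreate()
lines = spark.read.text(args.input)
print(lines.count(), args.iterations)
spark.stop()

You would then invoke it as, for example:

spark-submit args_demo.py --input /path/to/data.txt --iterations 100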
