How to download all files from an S3 bucket to a local Linux server, passing the bucket and local folder values at runtime, using Python

I am making a script to download files from an S3 bucket to a local Linux folder. To achieve that I have to use dynamic values for the bucket and the folder where we want to download the files.

I know how to do it with:

aws s3 cp s3://bucket /linux/local/folder --recursive --p alusta

But how do I accept the bucket value at runtime?

dwn_cmd = "aws s3 cp s3://bucket/name/" + str(year_name) + '/' + str(month_name)

folder_path = "/local/linux/folder/" + folder_name

#subprocess.call(['aws','s3','cp',dwn_cmd,folder_path,'--recursive','--p', 'alusta'])

This shows an error saying that subprocess needs the S3 bucket path and the local folder path, so I think it is not picking up the paths. If I hard-code the paths it works, but not like this. How can I achieve my result?

Solution

With

dwn_cmd = "aws s3 cp s3://bucket/name/" + "2019" + '/' + "June"

folder_path = "/local/linux/folder/" + "test"

you will effectively be calling:

subprocess.call(['aws', 's3', 'cp',
                 "aws s3 cp s3://bucket/name/2019/June",
                 "/local/linux/folder/test",
                 '--recursive', '--p', 'alusta'])

Delete the aws s3 cp prefix from dwn_cmd:

dwn_cmd = "s3://bucket/name/" + "2019" + '/' + "June"
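Putting it together, the working call builds the S3 path and the destination separately and passes each CLI token as its own list element. A minimal sketch, using the placeholder bucket, folder, and profile names from the question, and assuming the question's --p stands for the CLI's documented --profile flag; the actual call is commented out since it needs the AWS CLI and credentials:

```python
import subprocess

# Runtime values (placeholders from the question; in practice these
# would come from user input or script arguments)
year_name, month_name = "2019", "June"
folder_name = "test"

# The S3 path is an argument, not part of the command name
dwn_cmd = "s3://bucket/name/" + str(year_name) + "/" + str(month_name)
folder_path = "/local/linux/folder/" + folder_name

# One list element per CLI token
cmd = ["aws", "s3", "cp", dwn_cmd, folder_path,
       "--recursive", "--profile", "alusta"]
# subprocess.call(cmd)  # uncomment to run against a real bucket
```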

Note: Do not use

subprocess.call([dwn_cmd, folder_path,'--recursive','--p', 'alusta']) # wrong

The spaces between aws, s3, and cp would be treated as part of the command name, so subprocess would look for a single executable literally named aws s3 cp s3://... (spaces included), which does not exist.
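This failure can be reproduced without invoking the AWS CLI at all: the whole spaced string is looked up as one executable name, while shlex.split shows the separate tokens subprocess actually needs (a sketch using the question's placeholder paths):

```python
import shlex
import shutil

bad = "aws s3 cp s3://bucket/name/2019/June"

# As argv[0], the entire string is one command name; no such program exists.
print(shutil.which(bad))   # None

# Splitting on whitespace yields the list subprocess.call expects.
print(shlex.split(bad))    # ['aws', 's3', 'cp', 's3://bucket/name/2019/June']
```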
