Task: upload a specified file, or all files under a specified folder, to S3. (There are three possible layouts: mirror the local directory structure inside the bucket, dump all files flat into the bucket, or place files into folders following a specified format; only the first two are covered here.)
Note that since we are uploading to AWS S3, the AWS access key and secret key (AK/SK) must be configured first. In this example that is done with the aws-cli on an EC2 instance; if you prefer not to use the aws-cli, specify the access key, secret key, and region name directly in the demo when obtaining the S3 resource object (a client object can be obtained the same way).
Also note that the script must be run with the complete path of the target file or folder as a command-line argument.
Code example
import logging
import os
import sys

import boto3

# Set up logging
logger = logging.getLogger()
logger.setLevel(logging.INFO)

# Specify the S3 bucket name
AWS_BUCKET_NAME = 'your_bucket_name'

# Get an S3 resource object
s3 = boto3.resource('s3')
def upload_iterate_dir_files_to_s3(folder_dir):
    # os.walk already descends into every subdirectory, so no extra
    # recursion over dirs is needed
    for root, dirs, files in os.walk(folder_dir):
        for each_file in files:
            each_file_path = os.path.join(root, each_file)
            print(each_file_path)
            file_basename = os.path.basename(each_file_path)
            # If the Key is file_basename, all files are dumped flat into the
            # bucket; if the Key is each_file_path, the folder structure is
            # mirrored in the bucket
            with open(os.path.normpath(each_file_path), 'rb') as data:
                s3.Bucket(AWS_BUCKET_NAME).put_object(Key=each_file_path, Body=data)
def upload_file_to_s3(file_path):
    file_basename = os.path.basename(file_path)
    with open(os.path.normpath(file_path), 'rb') as data:
        s3.Bucket(AWS_BUCKET_NAME).put_object(Key=file_basename, Body=data)
    print("The file was successfully uploaded to S3")
if len(sys.argv) < 2:
    logger.error("Please enter a valid and complete file path")
    print("Please enter a valid and complete file path")
elif os.path.isdir(sys.argv[1]):
    upload_iterate_dir_files_to_s3(sys.argv[1])
    print("All files in the folder were uploaded to S3 successfully")
elif os.path.isfile(sys.argv[1]):
    upload_file_to_s3(sys.argv[1])
else:
    print("It's a special file (socket, FIFO, or device file)")
    logger.error("It's a special file (socket, FIFO, or device file)")
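The Key-naming choice mentioned in the comment inside upload_iterate_dir_files_to_s3 can be illustrated without touching S3. The sketch below shows the two layouts; the paths are hypothetical, and the mirrored variant uses os.path.relpath so the key does not carry the parent path prefix (the demo above simply reuses each_file_path as the key):

```python
import os

def flat_key(file_path):
    # Flat layout: only the base name is kept, so every file lands
    # at the bucket root
    return os.path.basename(file_path)

def mirrored_key(file_path, folder_dir):
    # Mirrored layout: the path relative to the uploaded folder becomes
    # the key; S3 keys use forward slashes, so normalize any OS separators
    return os.path.relpath(file_path, folder_dir).replace(os.sep, '/')

# Hypothetical paths for illustration
print(flat_key('data/2023/report.csv'))              # report.csv
print(mirrored_key('data/2023/report.csv', 'data'))  # 2023/report.csv
```

With the flat layout, two files with the same name in different subfolders would overwrite each other in the bucket; the mirrored layout avoids that collision.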