Backing Up Important EC2 Files and a MySQL Database to S3 with Python

Today I tried using the boto library to back up files to S3 from Python. Without further ado, here is the code:

1. Backing up important files to S3:

[python]

import os

from boto.s3.connection import S3Connection
from boto.s3.key import Key

connected = 0

def connect():
    """Open a connection to S3 using the account credentials."""
    access_key = 'yourkey'
    secret_key = 'yourkey'
    global conn
    conn = S3Connection(access_key, secret_key)
    global connected
    connected = 1

def put(filename, bucketname):
    """Upload a local file to the given S3 bucket, keyed by its path."""
    if connected == 0:
        print 'not connected!'
    elif connected == 1:
        local_file = filename.strip()
        bucket = bucketname.strip()
        b = conn.get_bucket(bucket)
        k = Key(b)
        k.key = local_file
        k.set_contents_from_filename(local_file)

if __name__ == '__main__':
    connect()

    # Upload every story photo under the source folder.
    sourcefolder = '/var/www/www.ttgrow.com/ttgrow/photos/storyphotos'
    print 'story photo sync in progress'
    for root, dirs, files in os.walk(sourcefolder):
        for file in files:
            print '  ' + str(os.path.join(root, file))
            put(os.path.join(root, file), 'ttgrow-photo')

    # Upload every thumbnail under the source folder.
    sourcefolder = '/var/www/www.ttgrow.com/ttgrow/photos/thumbnails'
    print 'thumbnail sync in progress'
    for root, dirs, files in os.walk(sourcefolder):
        for file in files:
            print '  ' + str(os.path.join(root, file))
            put(os.path.join(root, file), 'ttgrow-photo')

    print 'finished'
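As a quick sanity check after the sync, you can list what is actually stored in the bucket and compare it against the local files. The snippet below is a minimal sketch using the same boto 2 API and the 'ttgrow-photo' bucket from the script above; substitute your own credentials.

[python]

from boto.s3.connection import S3Connection

conn = S3Connection('yourkey', 'yourkey')
bucket = conn.get_bucket('ttgrow-photo')

# Print every key currently stored in the bucket, with its size in bytes.
for key in bucket.list():
    print key.name, key.size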

2. Backing up MySQL to S3:

[python]

import os

from datetime import datetime
from boto.s3.connection import S3Connection
from boto.s3.key import Key

connected = 0

def connect():
    """Open a connection to S3 using the account credentials."""
    access_key = 'yourkey'
    secret_key = 'yourkey'
    global conn
    conn = S3Connection(access_key, secret_key)
    global connected
    connected = 1

def put(filename, bucketname):
    """Upload a local file to the given S3 bucket, keyed by its path."""
    if connected == 0:
        print 'not connected!'
    elif connected == 1:
        local_file = filename.strip()
        bucket = bucketname.strip()
        b = conn.get_bucket(bucket)
        k = Key(b)
        k.key = local_file
        k.set_contents_from_filename(local_file)

if __name__ == '__main__':
    # Dump the database to a timestamped file under /tmp.
    temp = datetime.today()
    filename = '/tmp/dbbak-' + str(temp.year) + '-' + str(temp.month) + '-' + str(temp.day) + '-' + str(temp.hour) + '-' + str(temp.minute) + '.sql'
    os.system("mysqldump -h your_rds_location -u usrname -ppassword dbname > " + filename)
    print 'backup db finished'

    # Upload the dump to S3.
    connect()
    put(filename, 'ttgrow-db')
    print 'upload to s3 finished'
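To restore, the same Key API can pull a dump back down before it is loaded into MySQL. A minimal sketch, assuming the key name matches the file path used at upload time (the date in the example key is hypothetical):

[python]

from boto.s3.connection import S3Connection
from boto.s3.key import Key

conn = S3Connection('yourkey', 'yourkey')
bucket = conn.get_bucket('ttgrow-db')

# The key name matches the path used at upload time (hypothetical date).
k = Key(bucket)
k.key = '/tmp/dbbak-2013-1-1-0-0.sql'
k.get_contents_to_filename('/tmp/restore.sql')

# The dump can then be loaded back with:
#   mysql -h your_rds_location -u usrname -ppassword dbname < /tmp/restore.sql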

Finally, add the script to a scheduler (e.g. cron) so the backup runs automatically every day :)
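For example, a crontab entry like the following (the run time and script path are placeholders) would run the backup every day at 3 a.m.:

0 3 * * * /usr/bin/python /path/to/backup_to_s3.py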
