This simple backup script backs up all of the databases and websites on a server to S3 storage, but it can easily be adapted to most backup needs.
This guide assumes you already have your destination bucket set up and configured. When you set up the bucket, you can set a default expiry on its files, which means the backup script does not need to worry about cleaning up old backups.
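If your bucket does not have a lifecycle expiry configured, old backups could instead be pruned by the script itself, since it embeds a `DD-MM-YYYY` stamp in every key name. A minimal sketch of that idea (the `is_expired` helper, the 30-day cutoff, and the example dates are illustrative, not part of the script):

```python
from datetime import datetime
import re

# Matches the DD-MM-YYYY stamp that the backup script embeds in key names
STAMP_RE = re.compile(r'(\d{2}-\d{2}-\d{4})')

def is_expired(key_name, today, max_age_days=30):
    """Return True if the stamp in key_name is more than max_age_days old."""
    match = STAMP_RE.search(key_name)
    if match is None:
        return False  # no recognisable stamp: leave the object alone
    stamp = datetime.strptime(match.group(1), '%d-%m-%Y').date()
    return (today - stamp).days > max_age_days

today = datetime(2015, 3, 1).date()
print(is_expired('mydb-25-01-2015.sql.gz', today))   # True  (35 days old)
print(is_expired('mydb-15-02-2015.sql.gz', today))   # False (14 days old)
```

A loop over `bucket.list()` could then delete any key for which `is_expired` returns True, but the bucket-side expiry is simpler if it is available to you.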
The script requires a small configuration file. I suggest /root/.backuprc, readable only by root. If yours differs, you will need to update the script. The contents should be:
[Config]
aws_access_key_id = <Your AWS Access Key ID>
aws_secret_access_key = <Your AWS Secret Key>
aws_bucket = <Your AWS Bucket ID>
webroot = <The root directory containing your websites, e.g. /var/www>
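Because this file holds your AWS credentials, it is worth checking that it really is readable only by root before using it. A small hypothetical guard (not part of the script below) could look like:

```python
import os
import stat

def is_private(path):
    """Return True if path is not accessible by group or others."""
    mode = os.stat(path).st_mode
    return not (mode & (stat.S_IRWXG | stat.S_IRWXO))

# e.g. at the top of the backup script:
# if not is_private('/root/.backuprc'):
#     raise SystemExit('/root/.backuprc must only be accessible by root')
```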
If you are considering this backup approach, change the relevant parts of the script to suit your needs. Save it as an executable file, for example /usr/local/bin/backup:
#!/usr/bin/python
# WhatTheHost Backup Script
# Requires boto
# Requires /root/.backuprc config file

# Import the modules
import os
import ConfigParser
import time
import tarfile
import logging
import glob
from boto.s3.connection import S3Connection
from boto.s3.key import Key

logging.basicConfig(level=logging.INFO)

# Get DB details from the Debian maintenance account
config = ConfigParser.ConfigParser()
config.read("/etc/mysql/debian.cnf")
username = config.get('client', 'user')
password = config.get('client', 'password')
hostname = config.get('client', 'host')
filestamp = time.strftime('%d-%m-%Y')

# Get config
config = ConfigParser.ConfigParser()
config.read("/root/.backuprc")
accessid = config.get('Config', 'aws_access_key_id')
accesskey = config.get('Config', 'aws_secret_access_key')
bucketname = config.get('Config', 'aws_bucket')
webroot = config.get('Config', 'webroot')

# Make sure the local staging directories exist
for directory in ('/backups/mysql', '/backups/websites'):
    if not os.path.isdir(directory):
        os.makedirs(directory)

# Connect to S3 bucket
s3conn = S3Connection(aws_access_key_id=accessid, aws_secret_access_key=accesskey)
bucket = s3conn.get_bucket(bucketname)

# Back up databases
print 'Backing up Databases'
database_list_command = "mysql -u %s -p%s -h %s --silent -N -e 'show databases'" % (username, password, hostname)
for d in os.popen(database_list_command).readlines():
    d = d.strip()
    if d in ('information_schema', 'performance_schema'):
        continue
    filename = "/backups/mysql/%s-%s.sql" % (d, filestamp)
    # os.system waits for the dump to finish before we upload it
    os.system("mysqldump --single-transaction -u %s -p%s -h %s %s | gzip -c > %s.gz" % (username, password, hostname, d, filename))
    print 'Uploading ' + d + ' to s3'
    k = Key(bucket)
    k.key = "%s-%s.sql.gz" % (d, filestamp)
    k.set_contents_from_filename(filename + '.gz')
    os.remove(filename + '.gz')

# Back up files
dirs = glob.glob(webroot + '/*')
print dirs
for d in dirs:
    site = os.path.basename(d)
    print 'Backing up ' + site
    tarball = os.path.join('/backups/websites/', site + '-' + filestamp + '.tar.bz2')
    tar = tarfile.open(tarball, 'w:bz2')
    tar.add(d)
    tar.close()
    print 'Uploading ' + site + ' to s3'
    k = Key(bucket)
    k.key = site + '-' + filestamp + '.tar.bz2'
    k.set_contents_from_filename(tarball)
    os.remove(tarball)
Run the script and it will create a separate backup for each database and for each website under the directory you specified.
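To run it automatically, make the script executable and schedule it with cron; a sketch (the 02:30 run time and the /etc/cron.d/backup path are just examples):

```shell
chmod 700 /usr/local/bin/backup

# /etc/cron.d/backup -- run the backup every night at 02:30
30 2 * * * root /usr/local/bin/backup
```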