Environment Setup
Install the package on both your local machine and the target machine (note: both need it):

sudo easy_install fabric

The current version is 1.8 (pip install works just as well).

After installing, you can check that it succeeded:

[ken@~$] which fab
/usr/local/bin/fab

Once it is installed, it is worth browsing the official documentation.
Running Local Operations
from fabric.api import local

def lsfab():
    local('cd ~/tmp/fab')  # note: each local() call runs in its own shell, so this cd does not carry over
    local('ls')
Result:
[ken@~/tmp/fab$] pwd;ls
/Users/ken/tmp/fab
fabfile.py fabfile.pyc test.py test.pyc
[ken@~/tmp/fab$] fab -f test.py lsfab
[localhost] local: cd ~/tmp/fab
[localhost] local: ls
fabfile.py fabfile.pyc test.py test.pyc
Done.
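As the comment above notes, each local() spawns its own shell, so the cd does not affect the following ls (the listing works only because fab was launched from that directory). A sketch of how the two steps could be tied together with Fabric 1.x's lcd context manager, with capture=True used to grab the output (the directory is the same one assumed above):

```python
from fabric.api import lcd, local

def lsfab():
    # lcd() keeps the directory change in effect for the whole block
    with lcd('~/tmp/fab'):
        # capture=True makes local() return the command's stdout as a string
        output = local('ls', capture=True)
        print output
```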
Remote Operations
Now suppose you need to go to the project directory /home/ken/project on machine A and pull down updated config files.
#!/usr/bin/env python
# encoding: utf-8
from fabric.api import local, cd, run, env

env.hosts = ['user@ip:port', ]  # the parameters SSH will use
env.password = 'pwd'

def setting_ci():
    local('echo "add and commit settings in local"')
    # the local steps from earlier go here

def update_setting_remote():
    print "remote update"
    with cd('~/temp'):        # cd() enters a directory on the remote machine
        run('ls -l | wc -l')  # run() executes a command remotely

def update():
    setting_ci()
    update_setting_remote()
Then run it:
[ken@~/tmp/fab$] fab -f deploy.py update
[user@ip:port] Executing task 'update'
[localhost] local: echo "add and commit settings in local"
add and commit settings in local
remote update
[user@ip:port] run: ls -l | wc -l
[user@ip:port] out: 12
[user@ip:port] out:
Note: if env.password is not declared, Fabric will interactively prompt for the password when it connects to each machine.
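Rather than hardcoding env.password, Fabric 1.x can also authenticate with an SSH private key via env.key_filename; a minimal sketch (the key path is an assumption, adjust it to your setup):

```python
from fabric.api import env

env.hosts = ['user@ip:port']
env.key_filename = '~/.ssh/id_rsa'  # assumed path to your private key
```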
Setting the Remote Server Hosts
from fabric.api import run, env

env.user = 'root'
env.hosts = ['host1', 'jazywoo@host2']  # global host list
env.password = '123456'
env.passwords = {                       # per-host passwords
    'host1': '123456',
    'host2': '123456789',
}

def taskA():
    run('ls')

def taskB():
    run('whoami')

Execution order:

taskA executed on host1
taskA executed on host2
taskB executed on host1
taskB executed on host2
from fabric.api import env

env.roledefs['webservers'] = ['www1', 'www2', 'www3']

Or define all the roles at once:

from fabric.api import env

env.roledefs = {
    'web': ['www1', 'www2', 'www3'],
    'dns': ['ns1', 'ns2']
}
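Once roles are defined, a task can be bound to one with the @roles decorator, or a role can be selected on the command line with -R; a sketch reusing the roledefs above:

```python
from fabric.api import env, roles, run

env.roledefs = {
    'web': ['www1', 'www2', 'www3'],
    'dns': ['ns1', 'ns2'],
}

@roles('web')  # runs on all three web hosts
def mytask():
    run('ls /var/www')
```

$ fab -R web mytask  # -R selects the role on the command line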
from fabric.api import env, run

def set_hosts():  # set hosts inside a task
    env.hosts = ['host1', 'host2']

def mytask():
    run('ls /var/www')
$ fab -H host1,host2 mytask         # set hosts on the command line
$ fab mytask:hosts="host1;host2"    # set hosts per task
from fabric.api import env, run

env.hosts.extend(['host3', 'host4'])  # merged with hosts given on the command line

def mytask():
    run('ls /var/www')
from fabric.api import hosts, run

@hosts('host1', 'host2')  # specify hosts with the hosts decorator
def mytask():
    run('ls /var/www')

my_hosts = ('host1', 'host2')

@hosts(my_hosts)
def mytask():
    # ...
    pass
from fabric.api import env, hosts, roles, run

env.roledefs = {'role1': ['b', 'c']}

@hosts('a', 'b')
@roles('role1')  # specify hosts with the roles decorator; hosts and roles are merged
def mytask():
    run('ls /var/www')
Executing Tasks: execute and runs_once
from fabric.api import env, run, roles, execute

env.roledefs = {
    'db': ['db1', 'db2'],
    'web': ['web1', 'web2', 'web3'],
}

@roles('db')
def migrate():
    # Database stuff here.
    pass

@roles('web')
def update():
    # Code updates here.
    pass

def deploy():
    execute(migrate)  # execute() runs a task; calling execute() again runs it again,
    execute(update)   # while @runs_once guarantees a task runs only once

execute() also collects the per-host results:
from fabric.api import task, execute, run, runs_once

@task
def workhorse():
    return run("get my infos")

@task
@runs_once
def go():
    results = execute(workhorse)
    print results
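The results value above is a dictionary mapping each host string to that host's return value from the task; a sketch of consuming it (the command and host names are assumptions for illustration):

```python
from fabric.api import task, execute, run, runs_once

@task
def workhorse():
    return run("uname -r")  # the captured output becomes the return value

@task
@runs_once
def go():
    # execute() returns {host_string: return_value} for every host it ran on
    results = execute(workhorse)
    for host, value in sorted(results.items()):
        print host, value
```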
Using the task Decorator
from fabric.api import task, run

@task(alias='dwm')  # task alias
def deploy_with_migrations():
    pass

@task
def mytask():
    run("a command")

With this, running fab --list in the terminal shows the available tasks. You can also wrap all the operations in a Task class:
from fabric.api import run, sudo, task
from fabric.tasks import Task

class MyTask(Task):
    name = "deploy"
    def run(self, environment, domain="whatever.com"):
        run("git clone foo")
        sudo("service apache2 restart")

instance = MyTask()

This is equivalent to:

@task
def deploy(environment, domain="whatever.com"):
    run("git clone foo")
    sudo("service apache2 restart")
from fabric.api import task
from fabric.tasks import Task

class CustomTask(Task):
    def __init__(self, func, myarg, *args, **kwargs):
        super(CustomTask, self).__init__(*args, **kwargs)
        self.func = func
        self.myarg = myarg

    def run(self, *args, **kwargs):
        return self.func(*args, **kwargs)

@task(task_class=CustomTask, myarg='value', alias='at')
def actual_task():
    pass
When the fabfile runs, what actually executes is:

task_obj = CustomTask(actual_task, myarg='value')
Parallel Task Execution
The tasks above execute serially.
from fabric.api import *

def update():
    with cd("/srv/django/myapp"):
        run("git pull")

def reload():
    sudo("service apache2 reload")
Run from the terminal:

$ fab -H web1,web2,web3 update reload

The result looks like this:

update on web1
update on web2
update on web3
reload on web1
reload on web2
reload on web3

Whereas the parallel result you would want is:

update on web1, web2, and web3
reload on web1, web2, and web3
Parallel execution shortens the waiting time:

from fabric.api import *

@parallel  # use the parallel decorator
def runs_in_parallel():
    pass

def runs_serially():
    pass
Terminal:

$ fab -H host1,host2,host3 runs_in_parallel runs_serially

Result:

runs_in_parallel on host1, host2, and host3
runs_serially on host1
runs_serially on host2
runs_serially on host3

Alternatively, pass the -P flag when running from the terminal:

$ fab -H host1,host2,host3 -P runs_in_parallel runs_serially
You can also cap the degree of parallelism:

from fabric.api import *

@parallel(pool_size=5)  # limit concurrency with pool_size
def heavy_task():
    # lots of heavy local lifting or lots of IO here
    pass

$ fab -P -z 5 heavy_task  # or set the pool size on the command line
Closing Remote Connections
If the connections are not closed, the Python process will keep waiting for more data to arrive.
from fabric.state import connections

for key in connections.keys():
    connections[key].close()
    del connections[key]
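Fabric 1.x also ships a helper that performs this same cleanup in one call; a minimal sketch:

```python
from fabric.network import disconnect_all

# closes every cached connection in fabric.state.connections
disconnect_all()
```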
Example
from fabric.api import abort, cd, env, get, hide, hosts, local, prompt, \
    put, require, roles, run, runs_once, settings, show, sudo, warn  # Fabric's operations

@hosts('host1')
def clean_and_upload():
    local('find assets/ -name "*.DS_Store" -exec rm {} \;')
    local('tar czf /tmp/assets.tgz assets/')
    put('/tmp/assets.tgz', '/tmp/assets.tgz')
    with cd('/var/www/myapp/'):
        run('tar xzf /tmp/assets.tgz')
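Since settings and warn are already in the import list above, here is a sketch (an addition, not from the original) of using warn_only so a failed remote command does not abort the whole run; the file path is just the one from the example:

```python
from fabric.api import run, settings, warn

def cleanup_tarball():
    # with warn_only=True, a non-zero exit code produces a warning
    # instead of aborting the fab run
    with settings(warn_only=True):
        result = run('rm /tmp/assets.tgz')
    if result.failed:
        warn('could not remove /tmp/assets.tgz; continuing anyway')
```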