Distributing data in batch over SSH

1、Installed SSH packages
[hadoop@hadoop ~]$ sudo rpm -qa |grep -Ei openss
openssh-clients-5.3p1-104.el6_6.1.x86_64
openssl-1.0.1e-30.el6_6.4.x86_64
openssh-askpass-5.3p1-104.el6_6.1.x86_64
openssh-5.3p1-104.el6_6.1.x86_64
openssh-server-5.3p1-104.el6_6.1.x86_64
openssl-devel-1.0.1e-30.el6_6.4.x86_64
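
If any of these packages were missing, they could be installed with yum; a minimal sketch for CentOS/RHEL 6 (which the el6 package names above imply):

# Install the OpenSSH client/server packages on CentOS/RHEL 6
sudo yum install -y openssh openssh-clients openssh-server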


2、The hardened (non-default) SSH port
[hadoop@hadoop ~]$ netstat -lnt |grep 52113
tcp        0      0 0.0.0.0:52113               0.0.0.0:*                   LISTEN      
tcp        0      0 :::52113                    :::*                        LISTEN  


[hadoop@hadoop ~]$ sudo netstat -lntup|grep ssh
tcp        0      0 0.0.0.0:52113               0.0.0.0:*                   LISTEN      1266/sshd           
tcp        0      0 :::52113                    :::*                        LISTEN      1266/sshd 
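
The non-default port comes from the Port directive in /etc/ssh/sshd_config. A sketch of how it could be set (the restart command assumes SysV init, as on CentOS 6):

# Point sshd at the hardened port, then restart the daemon
sudo sed -i 's/^#\?Port .*/Port 52113/' /etc/ssh/sshd_config
sudo /etc/init.d/sshd restart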


3、Find the service behind a given port
[root@hadoop local]# lsof -i tcp:52113
COMMAND  PID   USER   FD   TYPE DEVICE SIZE/OFF NODE NAME
sshd    1266   root    3u  IPv4  10275      0t0  TCP *:52113 (LISTEN)
sshd    1266   root    4u  IPv6  10277      0t0  TCP *:52113 (LISTEN)
sshd    1313   root    3r  IPv4  10434      0t0  TCP hadoop:52113->192.168.2.100:62936 (ESTABLISHED)
sshd    1347 hadoop    3u  IPv4  10434      0t0  TCP hadoop:52113->192.168.2.100:62936 (ESTABLISHED)
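
On distributions that ship iproute2, ss gives roughly the same answer as the lsof query above; a sketch:

# Show which process listens on TCP port 52113 (-p needs root)
sudo ss -lntp | grep 52113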


4、Batch-check host IP addresses
[root@hadoop local]# ssh -p52113 hadoop@192.168.2.124 /sbin/ifconfig eth0
hadoop@192.168.2.124's password: 
eth0      Link encap:Ethernet  HWaddr 00:0C:29:96:2F:9E  
         inet addr:192.168.2.124  Bcast:192.168.2.255  Mask:255.255.255.0
         inet6 addr: fe80::20c:29ff:fe96:2f9e/64 Scope:Link
         UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
         RX packets:978 errors:0 dropped:0 overruns:0 frame:0
         TX packets:637 errors:0 dropped:0 overruns:0 carrier:0
         collisions:0 txqueuelen:1000 
         RX bytes:83714 (81.7 KiB)  TX bytes:73794 (72.0 KiB)
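
To make this genuinely batch, the same remote command can be looped over a host list; a minimal sketch, where iplist.txt is a hypothetical file holding one IP per line:

# Print the eth0 address of every host in iplist.txt
for ip in $(cat iplist.txt); do
    echo "==== $ip ===="
    ssh -p 52113 hadoop@$ip /sbin/ifconfig eth0 | grep 'inet addr'
done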


5、Check host load remotely
[root@hadoop local]# ssh -p52113 hadoop@192.168.2.124 uptime
hadoop@192.168.2.124's password: 
00:17:49 up 16 min,  1 user,  load average: 0.00, 0.00, 0.00      [1-, 5-, and 15-minute load averages]


6、Copy files with scp
scp -P52113 -rp a.txt hadoop:/usr/local
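
Here -r copies directories recursively and -p preserves modification times and modes; note that scp takes the port as uppercase -P, unlike ssh. The copy also works in the other direction; a sketch reusing this lab's host and port:

# Pull a remote file back to the current directory
scp -P52113 -rp hadoop@192.168.2.124:/usr/local/a.txt .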




7、sftp [the sftp client built into SecureCRT is commonly used to transfer files from Windows]
[hadoop@hadoop ~]$ sftp -oPort=52113 hadoop@192.168.2.124
Connecting to 192.168.2.124...
The authenticity of host '[192.168.2.124]:52113 ([192.168.2.124]:52113)' can't be established.
RSA key fingerprint is 3d:56:ae:31:73:66:9c:21:02:02:bc:5a:6b:bd:bf:75.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[192.168.2.124]:52113' (RSA) to the list of known hosts.
hadoop@192.168.2.124's password: 
sftp> pwd
Remote working directory: /home/hadoop
sftp> ls
hadoopfile            hadoopfile_hard_link  hadoopfile_soft_link  mbr.bin               tachyon               
sftp> get /tmp/a  /home/hadoop
Fetching /tmp/a to a
/tmp/a                                                                                              100%    4     0.0KB/s   00:00    
sftp> ls
a                     hadoopfile            hadoopfile_hard_link  hadoopfile_soft_link  mbr.bin               tachyon               
sftp> rm ./a
Removing /home/hadoop/./a
sftp> ls
hadoopfile            hadoopfile_hard_link  hadoopfile_soft_link  mbr.bin               tachyon               
sftp> put /tmp/a
Uploading /tmp/a to /home/hadoop/a
/tmp/a                                                                                              100%    4     0.0KB/s   00:00    
sftp> ls
a                     hadoopfile            hadoopfile_hard_link  hadoopfile_soft_link  mbr.bin               tachyon               
sftp> ls -l
-rw-rw-r--    1 hadoop   hadoop          4 Jan 17 01:00 a
-rw-rw-r--    1 hadoop   hadoop         46 Aug 19 01:16 hadoopfile
-rw-rw-r--    1 hadoop   hadoop         51 Aug 19 01:09 hadoopfile_hard_link
lrwxrwxrwx    1 hadoop   hadoop         10 Dec 27 23:40 hadoopfile_soft_link
-rw-r--r--    1 root     root          512 Jan  7 20:21 mbr.bin
drwxrwxr-x    3 hadoop   hadoop       4096 Dec 28 11:19 tachyon
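
sftp can also run unattended from a batch file of commands, which suits scripted transfers; a minimal sketch (the batch file path is hypothetical):

# Write the sftp commands to a batch file, then run them non-interactively
cat > /tmp/sftp_batch.txt <<'EOF'
put /tmp/a
ls -l
EOF
sftp -oPort=52113 -b /tmp/sftp_batch.txt hadoop@192.168.2.124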


8、Passwordless SSH login
[hadoop@hadoop ~]$ ssh-keygen  -t rsa 
Enter file in which to save the key (/home/hadoop/.ssh/id_rsa): 
Your identification has been saved in /home/hadoop/.ssh/id_rsa.
Your public key has been saved in /home/hadoop/.ssh/id_rsa.pub.
[hadoop@hadoop ~]$ cd .ssh/
[hadoop@hadoop .ssh]$ ll
total 12
-rw------- 1 hadoop hadoop 1675 Jan 17 08:56 id_rsa        (private key)
-rw-r--r-- 1 hadoop hadoop  395 Jan 17 08:56 id_rsa.pub    (public key)
-rw-r--r-- 1 hadoop hadoop  403 Jan 17 00:54 known_hosts


[hadoop@hadoop .ssh]$ ll -ld ../.ssh/
drwx------ 2 hadoop hadoop 4096 Jan 17 08:56 ../.ssh/    (mode 700)


[hadoop@hadoop .ssh]$ ssh -p52113 localhost
The authenticity of host '[localhost]:52113 ([::1]:52113)' can't be established.
RSA key fingerprint is 3d:56:ae:31:73:66:9c:21:02:02:bc:5a:6b:bd:bf:75.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[localhost]:52113' (RSA) to the list of known hosts.
hadoop@localhost's password: 
Last login: Sat Jan 17 00:53:55 2015 from 192.168.2.100
Welcome to hadoop training Compute Service!
[hadoop@hadoop ~]$ exit


[hadoop@hadoop .ssh]$ cp id_rsa.pub authorized_keys
[hadoop@hadoop .ssh]$ ssh -p 52113 hadoop    # passwordless login
The authenticity of host '[hadoop]:52113 ([192.168.2.124]:52113)' can't be established.
RSA key fingerprint is 3d:56:ae:31:73:66:9c:21:02:02:bc:5a:6b:bd:bf:75.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[hadoop]:52113' (RSA) to the list of known hosts.
Last login: Sat Jan 17 09:27:58 2015 from ::1
Welcome to hadoop training Compute Service!
[hadoop@hadoop ~]$ 
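
One caveat: with StrictModes enabled (the default), sshd silently ignores the key if the directory or file permissions are too loose, so it is worth enforcing the expected modes:

# sshd rejects keys whose files are group/world writable
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys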


9、Using ssh-copy-id
[hadoop@hadoop .ssh]$ ssh-copy-id -i id_rsa.pub " -p 52113 hadoop@192.168.2.124" 
hadoop@192.168.2.124's password: 
Now try logging into the machine, with "ssh ' -p 52113 hadoop@192.168.2.124'", and check in:
 .ssh/authorized_keys
to make sure we haven't added extra keys that you weren't expecting.


[hadoop@hadoop .ssh]$ ll
total 16
-rw------- 1 hadoop hadoop  395 Jan 17 09:41 authorized_keys
-rw------- 1 hadoop hadoop 1675 Jan 17 08:56 id_rsa
-rw-r--r-- 1 hadoop hadoop  395 Jan 17 08:56 id_rsa.pub
-rw-r--r-- 1 hadoop hadoop 1198 Jan 17 09:31 known_hosts


[hadoop@hadoop .ssh]$ ssh -p 52113 hadoop    # passwordless login
Last login: Sat Jan 17 09:31:03 2015 from 192.168.2.124
Welcome to hadoop training Compute Service!
[hadoop@hadoop ~]$ 


[hadoop@hadoop ~]$ tree /home/hadoop/.ssh/
/home/hadoop/.ssh/
|-- authorized_keys
|-- id_rsa
|-- id_rsa.pub
`-- known_hosts
0 directories, 4 files


# Why the file must be named authorized_keys: it is sshd's default AuthorizedKeysFile
[hadoop@hadoop ~]$ sudo grep authorized_keys /etc/ssh/sshd_config
#AuthorizedKeysFile     .ssh/authorized_keys
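
When ssh-copy-id is not available, the same result can be had by appending the public key by hand; a sketch:

# Manual equivalent of ssh-copy-id: append the public key on the remote host
cat ~/.ssh/id_rsa.pub | ssh -p 52113 hadoop@192.168.2.124 \
    'mkdir -p ~/.ssh && chmod 700 ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'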


10、List hidden files and directories
[hadoop@hadoop ~]$ ls -al      
total 88
drwx------.  8 hadoop hadoop  4096 Jan 17 08:56 .
drwxr-xr-x. 21 root   root    4096 Jan  3 22:02 ..
drwx------   2 hadoop hadoop  4096 Jan 17 09:41 .ssh


11、Passwordless remote inspection of other hosts
[hadoop@hadoop ~]$ ssh -p 52113 hadoop free -m
             total       used       free     shared    buffers     cached
Mem:          1869        707       1162          0         81        473
-/+ buffers/cache:        152       1717
Swap:         2047          0       2047
[hadoop@hadoop ~]$ ssh -p 52113 localhost free -m      
             total       used       free     shared    buffers     cached
Mem:          1869        707       1162          0         81        473
-/+ buffers/cache:        152       1717
Swap:         2047          0       2047


[root@hadoop ~]# crontab -e    # edit scheduled tasks
crontab: installing new crontab
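
With passwordless login in place, the distribution script can be scheduled from cron; a hedged sketch of an entry (the script path is hypothetical):

# minute hour dom month dow command: distribute every day at 00:30
30 00 * * * /bin/sh /home/hadoop/scripts/fenfa.sh >/dev/null 2>&1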


12、Distribute data to all nodes [rsync, scp]
[hadoop@hadoop ~]$ touch tachyon/test/dirs  
[hadoop@hadoop ~]$ ls tachyon/test/dirs 
tachyon/test/dirs


# Distribute data manually
[hadoop@hadoop ~]$ scp -P52113 -rq tachyon/ hadoop@192.168.2.124:/home/hadoop/scp_dir
[hadoop@hadoop ~]$ ls scp_dir/
tachyon
[hadoop@hadoop ~]$ scp -P52113 -r -p tachyon/ hadoop@192.168.2.124:/home/hadoop/scp_dir  
dirs                                                                                                100%    0     0.0KB/s   00:00   


# Automated distribution [scp]
[hadoop@hadoop scripts]$ cat fenfa.sh 
#fenfa hadoop by 2015-1-17
scp -P52113 -r -p /home/hadoop/tachyon/ hadoop@192.168.2.124:/home/hadoop/scp_dir
cd /home/hadoop
tree scp_dir/
[hadoop@hadoop scripts]$ sh fenfa.sh     
dirs                                                                                                100%    0     0.0KB/s   00:00    
scp_dir/
`-- tachyon
   `-- test
       `-- dirs
2 directories, 1 file


# Distribute with rsync
[hadoop@hadoop scripts]$ /usr/bin/rsync -avz --progress -e 'ssh -p 52113' /tmp/servers-2015011710.tag.gz  hadoop:/home/hadoop/
sending incremental file list
servers-2015011710.tag.gz
     127304 100%   45.08MB/s    0:00:00 (xfer#1, to-check=0/1)


sent 127438 bytes  received 31 bytes  84979.33 bytes/sec
total size is 127304  speedup is 1.00
[hadoop@hadoop scripts]$ ll -ls /home/hadoop/servers-2015011710.tag.gz     
128 -rw-r--r-- 1 hadoop hadoop 127304 Jan 17 10:00 /home/hadoop/servers-2015011710.tag.gz


With --delete, files that no longer exist locally are removed on the remote end during the sync:
[hadoop@hadoop scripts]$ /usr/bin/rsync -avz --delete --progress -e 'ssh -p 52113' /tmp/servers-2015011710.tag.gz  hadoop:/home/hadoop/
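
Because --delete mirrors deletions, a dry run (-n) is a prudent first step; also remember rsync's trailing-slash rule: src/ syncs the directory's contents, while src syncs the directory itself. A cautious sketch:

# Preview what --delete would remove before running it for real
/usr/bin/rsync -avzn --delete --progress -e 'ssh -p 52113' /tmp/servers-2015011710.tag.gz  hadoop:/home/hadoop/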


13、Distribution script
[hadoop@hadoop scripts]$ echo "192.168.2.124" > all_iplist.txt
[hadoop@hadoop scripts]$ cat all_iplist.txt 
192.168.2.124


[hadoop@hadoop scripts]$ sudo vi fenfa02.sh
[hadoop@hadoop scripts]$ cat fenfa02.sh 
#!/bin/sh
. /etc/init.d/functions
file="$1"
remote_dir="$2"

if [ $# -ne 2 ];then
    echo "usage: $0 file remote_dir"
    echo "must have two args."
    exit 1
fi

for ip in `cat ./all_iplist.txt`
do
    scp -P52113 -r -p "$file" hadoop@$ip:"$remote_dir" >/dev/null 2>&1
    if [ $? -eq 0 ];then
        action "target $ip: $file copied successfully." /bin/true
    else
        action "target $ip: $file copy failed." /bin/false
    fi
done
[hadoop@hadoop scripts]$ sh fenfa02.sh  
usage: fenfa02.sh file remote_dir
must have two args.


# Test
[hadoop@hadoop scripts]$ sh fenfa02.sh all_iplist.txt /tmp/
target 192.168.2.124: all_iplist.txt copied successfully.  [  OK  ]
[hadoop@hadoop scripts]$ ll -ls /tmp/all_iplist.txt 
4 -rw-rw-r-- 1 hadoop hadoop 14 Jan 17 10:58 /tmp/all_iplist.txt
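
For a large node list the serial loop becomes the bottleneck; one common variation backgrounds each scp and waits for them all, a sketch under the same assumptions as fenfa02.sh:

#!/bin/sh
# Parallel variant: start all copies concurrently, then wait for completion
file="$1"
remote_dir="$2"
for ip in `cat ./all_iplist.txt`
do
    scp -P52113 -r -p "$file" hadoop@$ip:"$remote_dir" >/dev/null 2>&1 &
done
wait
echo "all copies finished."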
