1. Writing the cluster distribution script xsync
Definition of scp:
scp copies data between servers over SSH.
Basic syntax
scp  -r  $pdir/$fname  $user@hadoop$host:$pdir/$fname
scp: the command
-r: copy recursively
$pdir/$fname: path/name of the file to copy
$user@hadoop$host:$pdir/$fname: destination user@host:destination path/name
Hands-on example
(1) Copy the software under /opt/hadoop/module/hadoop-2.7.2 on hadoop101 into the /opt/hadoop/ directory on hadoop100
[root@hadoop100 ~]# scp -r hadoop101:/opt/hadoop/module/hadoop-2.7.2 hadoop100:/opt/hadoop/
root@hadoop100's password:
NOTICE.txt 100% 101 54.4KB/s 00:00
kms-log4j.properties 100% 1631 517.3KB/s 00:00
httpfs-signature.secret 100% 21 11.8KB/s 00:00
mapred-site.xml.template 100% 758 364.7KB/s 00:00
mapred-env.cmd 100% 951 304.1KB/s 00:00
ssl-client.xml.example 100% 2316 764.8KB/s 00:00
httpfs-env.sh
......
.....
VERSION 100% 229 42.3KB/s 00:00
VERSION 100% 130 32.2KB/s 00:00
dfsUsed 100% 19 3.9KB/s 00:00
scanner.cursor 100% 166 54.4KB/s 00:00
fsimage_0000000000000000001.md5 100% 62 25.6KB/s 00:00
fsimage_0000000000000000001 100% 350 58.9KB/s 00:00
edits_0000000000000000002-0000000000000000003 100% 42 10.1KB/s 00:00
fsimage_0000000000000000003.md5 100% 62 10.7KB/s 00:00
fsimage_0000000000000000003 100% 350 57.1KB/s 00:00
VERSION 100% 203 81.7KB/s 00:00
Connection to hadoop101 closed.
The files were copied over, as shown below.
Note: scp copies only the file contents; nothing else (such as file attributes) is carried over.
[root@hadoop100 ~]# cd /opt/hadoop/
[root@hadoop100 hadoop]# ls
hadoop-2.7.2 module software
[root@hadoop100 hadoop]#
rsync: remote synchronization tool
rsync is mainly used for backup and mirroring. Its advantages are speed, avoiding re-copying identical content, and support for symbolic links.
Difference between rsync and scp:
rsync is faster than scp when copying files, because rsync updates only the files that differ, while scp copies every file over again.
When we copy the JDK below with rsync, it is an archive copy (-a); on this first run everything is copied:
[root@hadoop100 hadoop]# rsync -av /usr/local/java/jdk/jdk1.8.0_221/ /opt/hadoop/
...........
...........
man/man1/rmiregistry.1
man/man1/schemagen.1
man/man1/serialver.1
man/man1/servertool.1
man/man1/tnameserv.1
man/man1/unpack200.1
man/man1/wsgen.1
man/man1/wsimport.1
man/man1/xjc.1
sent 406,940,699 bytes received 31,280 bytes 38,759,236.10 bytes/sec
total size is 406,719,945 speedup is 1.00
[root@hadoop100 hadoop]# ll
total 26016
drwxr-xr-x 2 10 143 4096 Jul 4 11:35 bin
-r--r--r-- 1 10 143 3244 Jul 4 11:35 COPYRIGHT
drwxr-xr-x 16 root root 259 Nov 19 13:53 hadoop-2.7.2
drwxr-xr-x 3 10 143 132 Jul 4 11:35 include
-rw-r--r-- 1 10 143 5216468 Jun 12 11:07 javafx-src.zip
drwxr-xr-x 5 10 143 185 Jul 4 11:35 jre
drwxr-xr-x 5 10 143 245 Jul 4 11:35 lib
-r--r--r-- 1 10 143 44 Jul 4 11:35 LICENSE
drwxr-xr-x 4 10 143 47 Jul 4 11:35 man
drwxr-xr-x. 3 MissZhou MissZhou 53 Nov 14 22:39 module
-r--r--r-- 1 10 143 159 Jul 4 11:35 README.html
-rw-r--r-- 1 10 143 424 Jul 4 11:35 release
drwxr-xr-x. 3 MissZhou MissZhou 20 Nov 14 17:16 software
-rw-r--r-- 1 10 143 21107447 Jul 4 11:35 src.zip
-rw-r--r-- 1 10 143 116468 Jun 12 11:07 THIRDPARTYLICENSEREADME-JAVAFX.txt
-r--r--r-- 1 10 143 169691 Jul 4 11:35 THIRDPARTYLICENSEREADME.txt
[root@hadoop100 hadoop]#
Next, go into the /home/MissZhou directory and create the xsync file
[root@hadoop101 ~]# cd /home/MissZhou
[root@hadoop101 MissZhou]# vim xsync
Write the following code in that file
#!/bin/bash
# 1. Get the number of arguments; exit immediately if none were given
pcount=$#
if ((pcount==0)); then
echo no args;
exit;
fi
# 2. Get the file name
p1=$1
fname=$(basename "$p1")
echo fname=$fname
# 3. Get the absolute path of the parent directory (resolving symlinks)
pdir=$(cd -P "$(dirname "$p1")"; pwd)
echo pdir=$pdir
# 4. Get the current user name
user=$(whoami)
# 5. Loop over the target hosts
for((host=103; host<105; host++)); do
echo ------------------- hadoop$host --------------
rsync -av "$pdir/$fname" $user@hadoop$host:"$pdir"
done
Save and exit.
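The two path commands in the script can be checked in isolation. A minimal sketch (the temp directory and file names are made up) showing that basename extracts the file name, and that cd -P resolves symlinks so pdir is the real parent directory:

```shell
# Create a throwaway layout: a real directory plus a symlink pointing at it
base=$(mktemp -d)
mkdir "$base/real"
ln -s "$base/real" "$base/link"
touch "$base/real/file.txt"

p1="$base/link/file.txt"               # path given through the symlink
fname=$(basename "$p1")                # just the file name
pdir=$(cd -P "$(dirname "$p1")"; pwd)  # parent dir with symlinks resolved
echo "fname=$fname"
echo "pdir=$pdir"                      # ends in /real, not /link
```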
Next, give the xsync script execute permission
[root@hadoop101 MissZhou]# chmod +x xsync
[root@hadoop101 MissZhou]#
[root@hadoop101 MissZhou]# ll
total 8
-rw-rw-r--. 1 MissZhou MissZhou 89 Nov 14 17:38 test.sh
-rwxr-xr-x 1 root root 499 Nov 19 14:52 xsync
[root@hadoop101 MissZhou]#
Invocation form: xsync filename
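Run from its own directory the script is invoked as ./xsync; to call it by name from any directory, the directory containing it must be on $PATH. A minimal sketch with a throwaway script (the directory and script contents are illustrative, not the real xsync):

```shell
# Create a stand-in script in a scratch directory
mkdir -p /tmp/xsync_demo
printf '#!/bin/bash\necho hello-from-xsync\n' > /tmp/xsync_demo/xsync
chmod +x /tmp/xsync_demo/xsync

# Append that directory to PATH, then the script is callable by bare name
export PATH="$PATH:/tmp/xsync_demo"
out=$(xsync)
echo "$out"
```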
[root@hadoop101 MissZhou]# ./xsync xsync
fname=xsync
pdir=/home/MissZhou
------------------- hadoop103 --------------
ssh: Could not resolve hostname hadoop103: Name or service not known
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: unexplained error (code 255) at io.c(226) [sender=3.1.2]
------------------- hadoop104 --------------
ssh: Could not resolve hostname hadoop104: Name or service not known
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: unexplained error (code 255) at io.c(226) [sender=3.1.2]
[root@hadoop101 MissZhou]#
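The "Could not resolve hostname" errors above mean hadoop103 and hadoop104 are not known to this machine. A common fix is to map each hostname to its IP address in /etc/hosts on every node; the addresses below are made-up examples, so substitute your cluster's real IPs:

```
192.168.1.103   hadoop103
192.168.1.104   hadoop104
```

After the mapping is in place (and SSH access to those hosts works), rerunning ./xsync xsync should distribute the file instead of failing.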