1. Purpose
Use a shell script to distribute data from one virtual machine to all the others.
2. Things to note
The part that must be changed: hadoop250 hadoop251 hadoop252 hadoop253 hadoop254
These are hostnames; replace them with your own, separated by single spaces.
After saving, change the permissions to 777 (read, write, and execute for everyone).
3. About shell scripts
A shell script is a file into which executable commands are written in advance; the interpreter reads the file line by line and runs the commands in order.
Common scripting languages include shell, Python, PHP, and so on
(whichever interpreter runs the file determines the script type; here it is the shell, hence a shell script).
Like other scripting languages, shell has its own syntax.
A script usually begins with the line "#!/bin/bash".
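As a concrete illustration, here is a minimal script (the file name hello.sh is just an example, not part of this article's setup):

```shell
# Create a minimal script; quoting 'EOF' keeps $name from expanding here
cat > hello.sh <<'EOF'
#!/bin/bash
# commands are executed in order, top to bottom
name="world"
echo "hello, $name"
EOF
chmod +x hello.sh   # make it executable
./hello.sh          # prints: hello, world
```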
4. Usage
xsync <files or directories to distribute>
5. Script contents
cd /home/yiduoyun/bin
vim xsync
#!/bin/bash
# 1. Check the argument count
if [ $# -lt 1 ]
then
    echo "Not Enough Arguments!"
    exit
fi
# 2. Loop over every machine in the cluster
for host in hadoop250 hadoop251 hadoop252 hadoop253 hadoop254
do
    echo ==================== $host ====================
    # 3. Loop over every file/directory given and send each one
    for file in "$@"
    do
        # 4. Check that the file exists
        if [ -e "$file" ]
        then
            # 5. Get the absolute parent directory
            pdir=$(cd -P "$(dirname "$file")"; pwd)
            # 6. Get the file name
            fname=$(basename "$file")
            ssh $host "mkdir -p $pdir"
            rsync -av "$pdir/$fname" $host:"$pdir"
        else
            echo "$file does not exist!"
        fi
    done
done
Change the permissions to 777:
chmod 777 xsync
ls -l
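Calling the script as plain `xsync` also requires its directory to be on PATH; `/home/yiduoyun/bin` (i.e. `~/bin` for that user) usually is for login shells, but it is worth checking. A small sketch:

```shell
# Ensure ~/bin is on PATH (prepending is harmless if it already is)
PATH="$HOME/bin:$PATH"
# Verify by pattern-matching PATH with colons padded on both sides
case ":$PATH:" in
    *":$HOME/bin:"*) on_path=yes ;;
    *)               on_path=no  ;;
esac
echo "on_path=$on_path"   # prints: on_path=yes
```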
Once distribution finishes, the files appear on the target virtual machines: one-command distribution.
(It works like copying remotely with scp, but this script sends to multiple machines with a single command, and rsync only transfers the differences.)
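The path handling in steps 5 and 6 of the script can be tried locally without any remote hosts. This sketch (directory and file names are made up for the demo) shows how a messy relative argument becomes the absolute source path that rsync receives:

```shell
# Set up a throwaway directory and file for the demo
mkdir -p demo/sub
touch demo/sub/data.txt
file="demo/sub/../sub/data.txt"            # a messy relative path a user might pass
pdir=$(cd -P "$(dirname "$file")"; pwd)    # absolute parent dir, with .. resolved
fname=$(basename "$file")                  # bare file name
echo "$pdir/$fname"
```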
6. If you still get the error "ssh: Could not resolve hostname hadoop1: Name or service not known",
check whether the /etc/hosts file is configured.
Error output:
[root@localhost bin]# xsync /root/cloudera-manager-centos7-cm5.14.0_x86_64.tar.gz
==================== hadoop1 ====================
ssh: Could not resolve hostname hadoop1: Name or service not known
ssh: Could not resolve hostname hadoop1: Name or service not known
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: unexplained error (code 255) at io.c(605) [sender=3.0.9]
==================== hadoop2 ====================
ssh: Could not resolve hostname hadoop2: Name or service not known
ssh: Could not resolve hostname hadoop2: Name or service not known
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: unexplained error (code 255) at io.c(605) [sender=3.0.9]
==================== hadoop3 ====================
ssh: Could not resolve hostname hadoop3: Name or service not known
ssh: Could not resolve hostname hadoop3: Name or service not known
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: unexplained error (code 255) at io.c(605) [sender=3.0.9]
[root@localhost bin]# hostname
localhost.localdomain
hadoop1
Edit the /etc/hosts file:
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.11.101 hadoop1
192.168.11.102 hadoop2
192.168.11.103 hadoop3
7. If you get the error "bash: rsync: command not found",
rsync is probably not installed.
[root@master bin]# xsync /opt/cloudera-manager/
==================== master ====================
sending incremental file list
sent 230 bytes received 17 bytes 164.67 bytes/sec
total size is 1,214,327,520 speedup is 4,916,305.75
==================== slave1 ====================
bash: rsync: command not found
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: remote command not found (code 127) at io.c(226) [sender=3.1.2]
==================== slave2 ====================
bash: rsync: command not found
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: remote command not found (code 127) at io.c(226) [sender=3.1.2]
Solution
yum -y install rsync. Note that installing rsync on the sending machine alone is not enough: both the sending machine and every receiving virtual machine need rsync installed (run yum -y install rsync on each).
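Installing on every node one by one is tedious; assuming passwordless ssh is already set up (the xsync script itself requires it), the installs can be driven from the sending machine in one loop. `run` below is a dry-run stub so this sketch prints the commands instead of executing them; replace its body with `"$@"` to actually run them:

```shell
# Dry-run stub: print the command instead of executing it
run() { echo "+ $*"; }

# Hostnames below are the article's examples; substitute your own
for host in hadoop250 hadoop251 hadoop252; do
    run ssh "$host" yum -y install rsync
done
run yum -y install rsync   # the sending machine needs rsync as well
```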
8. An alternative version
#!/bin/bash
# 1. Get the argument count; exit immediately if there are no arguments
pcount=$#
if ((pcount == 0)); then
    echo "no args"
    exit
fi
# 2. Get the file name (note: unlike the first version, this one only
#    distributes the first argument)
p1=$1
fname=$(basename "$p1")
echo fname=$fname
# 3. Get the absolute path of the parent directory
pdir=$(cd -P "$(dirname "$p1")"; pwd)
echo pdir=$pdir
# 4. Get the current user name
user=$(whoami)
# 5. Loop over the hosts (hadoop102 through hadoop105)
for ((host = 102; host < 106; host++)); do
    echo ------------------- hadoop$host --------------
    rsync -av "$pdir/$fname" $user@hadoop$host:"$pdir"
done