Overall workflow:
Pull the centos7 image
Install SSH in the centos7 image
Assign container IPs with pipework
Install Java and Hadoop in the image
Configure Hadoop
1. Pull the centos7 image
$ docker pull centos:7
//List the Docker images already downloaded
$ docker image ls -a
2. Write the Dockerfile
$ vim Dockerfile
Write the following content:
FROM centos:7
MAINTAINER dys
RUN yum install -y openssh-server sudo
RUN sed -i 's/UsePAM yes/UsePAM no/g' /etc/ssh/sshd_config
RUN yum install -y openssh-clients
RUN echo "root:0225" | chpasswd
RUN echo "root ALL=(ALL) ALL" >> /etc/sudoers
RUN ssh-keygen -t dsa -f /etc/ssh/ssh_host_dsa_key -N ''
RUN ssh-keygen -t rsa -f /etc/ssh/ssh_host_rsa_key -N ''
RUN mkdir /var/run/sshd
EXPOSE 22
CMD ["/usr/sbin/sshd","-D"]
Build the image, naming it centos-ssh
$ docker build -t centos-ssh .
$ docker images
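The key line in the Dockerfile above is the `sed` edit that turns off PAM, so `sshd` can run inside a thin container. A minimal sketch of what that substitution does, run against a throwaway sample file rather than the real `/etc/ssh/sshd_config`:

```shell
#!/bin/sh
# Reproduce the Dockerfile's sed edit on a throwaway sample file
# (inside the image the target is /etc/ssh/sshd_config).
sample=$(mktemp)
printf 'Port 22\nUsePAM yes\n' > "$sample"

# Same substitution as the RUN sed -i line in the Dockerfile
sed -i 's/UsePAM yes/UsePAM no/g' "$sample"

grep 'UsePAM' "$sample"   # now reads: UsePAM no
rm -f "$sample"
```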
3. Assign container IPs with pipework
- Install the git client first
$ yum -y install git
$ git --version
- Clone pipework
$ git clone https://github.com/jpetazzo/pipework
$ cd pipework
$ cp pipework /usr/local/bin
- Install bridge-utils
$ yum -y install bridge-utils
//Create the bridge network
$ brctl addbr br1
$ ip link set dev br1 up
$ ip addr add 192.168.3.1/24 dev br1
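The container addresses assigned later (192.168.3.20 through 192.168.3.23) must fall inside br1's 192.168.3.1/24 subnet, or the host cannot reach them over the bridge. A small sketch (hypothetical helper, plain shell) that checks an address against the bridge's /24:

```shell
#!/bin/sh
# Hypothetical helper: check whether an IP shares the /24 prefix of
# the bridge address 192.168.3.1/24 configured on br1 above.
in_bridge_subnet() {
    # For a /24, the first three octets must match the bridge's.
    case "$1" in
        192.168.3.*) echo yes ;;
        *)           echo no ;;
    esac
}

in_bridge_subnet 192.168.3.20   # yes -> reachable via br1
in_bridge_subnet 10.0.0.5       # no  -> would need routing or another bridge
```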
//Start a container from the centos-ssh image
$ docker run -d --name=centos.ssh centos-ssh
//Assign the IP
$ pipework br1 centos.ssh 192.168.3.20/24
$ ping 192.168.3.20
$ ssh 192.168.3.20
//Create two more identical containers; with that, all three servers are in place
$ docker run -d --name=centos7.ssh2 centos-ssh
$ docker run -d --name=centos7.ssh3 centos-ssh
$ pipework br1 centos7.ssh2 192.168.3.22/24
$ pipework br1 centos7.ssh3 192.168.3.23/24
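The container/IP pairs above follow an obvious pattern. A hedged sketch that prints, rather than executes, the equivalent commands, so they can be reviewed before running on a Docker host (drop the `echo`s to go live):

```shell
#!/bin/sh
# Dry run: print the docker run + pipework commands for the extra
# containers instead of executing them.
for spec in "centos7.ssh2 192.168.3.22" "centos7.ssh3 192.168.3.23"; do
    set -- $spec                 # $1 = container name, $2 = IP
    echo "docker run -d --name=$1 centos-ssh"
    echo "pipework br1 $1 $2/24"
done
```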
4. Install Java and Hadoop in the image
Prerequisite: place jdk-8u101-linux-x64.tar.gz and hadoop-2.7.3.tar.gz in the directory containing the Dockerfile
$ vim Dockerfile
Write the following content:
FROM centos-ssh
ADD jdk-8u101-linux-x64.tar.gz /usr/local/
RUN mv /usr/local/jdk1.8.0_101 /usr/local/jdk1.8
ENV JAVA_HOME /usr/local/jdk1.8
ENV PATH $JAVA_HOME/bin:$PATH
ADD hadoop-2.7.3.tar.gz /usr/local
RUN mv /usr/local/hadoop-2.7.3 /usr/local/hadoop
ENV HADOOP_HOME /usr/local/hadoop
ENV PATH $HADOOP_HOME/bin:$PATH
RUN yum install -y which sudo
$ docker build -t hadoop .
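The two `ENV PATH` lines in the Dockerfile each prepend a bin directory, so the later one ends up first in lookup order. A sketch of how the final PATH composes, evaluated in plain shell (the starting PATH here is a hypothetical stand-in):

```shell
#!/bin/sh
# Simulate the ENV lines from the Dockerfile to see the resulting PATH.
PATH=/usr/bin                  # hypothetical starting PATH
JAVA_HOME=/usr/local/jdk1.8
PATH=$JAVA_HOME/bin:$PATH      # ENV PATH $JAVA_HOME/bin:$PATH
HADOOP_HOME=/usr/local/hadoop
PATH=$HADOOP_HOME/bin:$PATH    # ENV PATH $HADOOP_HOME/bin:$PATH
echo "$PATH"
# -> /usr/local/hadoop/bin:/usr/local/jdk1.8/bin:/usr/bin
```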
//Container hadoop0 maps ports 50070 and 8088 so the Hadoop web UIs can be reached from a browser
$ docker run --name hadoop0 --hostname hadoop0 -d -P -p 50070:50070 -p 8088:8088 hadoop
$ docker run --name hadoop1 --hostname hadoop1 -d -P hadoop
$ docker run --name hadoop2 --hostname hadoop2 -d -P hadoop
//Assign IPs
$ pipework br1 hadoop0 192.168.3.30/24
$ pipework br1 hadoop1 192.168.3.31/24
$ pipework br1 hadoop2 192.168.3.32/24
//Attach to the containers
$ docker exec -it hadoop0 /bin/bash
$ docker exec -it hadoop1 /bin/bash
$ docker exec -it hadoop2 /bin/bash
//In each container, add the following to /etc/hosts
192.168.3.30 master
192.168.3.31 slave1
192.168.3.32 slave2
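The three entries above are identical on every node, so they can be appended in one shot. A small sketch, writing to a temp file here; inside each container the target would be /etc/hosts:

```shell
#!/bin/sh
# Append the cluster name mappings to a hosts file in one go.
# Uses a temp file here; inside each container the target is /etc/hosts.
hosts=$(mktemp)
cat >> "$hosts" <<'EOF'
192.168.3.30 master
192.168.3.31 slave1
192.168.3.32 slave2
EOF
grep -c '^192\.168\.3\.' "$hosts"   # prints 3: all entries written
rm -f "$hosts"
```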