1 Base environment
1.1.1 Disable SELinux
# Check current status: /usr/sbin/sestatus
# Set SELINUX=disabled: vim /etc/selinux/config
# Reboot for the change to take effect: reboot
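The config edit above can also be done non-interactively with sed. A dry-run sketch on a sample copy (the path /tmp/selinux-config is a stand-in, not the real /etc/selinux/config):

```shell
# Sample file mimicking the stock /etc/selinux/config (stand-in path for a dry run)
cat > /tmp/selinux-config <<'EOF'
SELINUX=enforcing
SELINUXTYPE=targeted
EOF

# The same change the vim step makes: enforcing -> disabled
sed -i 's/^SELINUX=enforcing$/SELINUX=disabled/' /tmp/selinux-config

grep '^SELINUX=' /tmp/selinux-config
```

Once verified, run the same sed as root against /etc/selinux/config and reboot.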
1.1.2 Stop the firewall
# List firewall rules: firewall-cmd --list-all
# Check whether a port is open: firewall-cmd --query-port=8080/tcp
# Open port 80: firewall-cmd --permanent --add-port=80/tcp
# Remove a port: firewall-cmd --permanent --remove-port=8080/tcp
# Reload the firewall: firewall-cmd --reload
# Start: service firewalld start
# Restart: service firewalld restart
# Stop: service firewalld stop
# Disable firewalld at boot: systemctl disable firewalld.service
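To confirm a rule actually took effect after `firewall-cmd --reload`, a quick local reachability check can be done with bash's /dev/tcp. A sketch, assuming bash and coreutils `timeout` are available (the function name `port_open` is mine, not part of firewalld):

```shell
# Prints "open" if something accepts TCP connections on host:port, else "closed".
port_open() {
  if timeout 1 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null; then
    echo open
  else
    echo closed
  fi
}

port_open 127.0.0.1 80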
1.1.3 Switch to a local yum mirror (163)
mv /etc/yum.repos.d/CentOS-Base.repo /etc/yum.repos.d/CentOS-Base.repo.backup
cd /etc/yum.repos.d/
wget http://mirrors.163.com/.help/CentOS7-Base-163.repo
yum clean all
yum makecache
yum -y update
1.1.4创建hadoop用户组
#root groupadd -g 1001 hadoop
#root useradd -u 2000 -g hadoop hadoop
#root mkdir -p /data/app/hadoop
#root chown -R hadoop.hadoop /data/app -
1.1.5 Passwordless SSH between the two hosts
#root cd /home/hadoop
#root ssh-keygen -t rsa
#root scp -r root@hadoop01:/home/hadoop/.ssh/id_rsa.pub authorized_keys_hadoop01
#root scp -r root@hadoop02:/home/hadoop/.ssh/id_rsa.pub authorized_keys_hadoop02
#root cat authorized_keys_hadoop01 authorized_keys_hadoop02 >> authorized_keys
#hadoop ssh hadoop01
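The key-generation and key-merging steps above can be run non-interactively. A dry-run sketch in a throwaway directory (everything under $tmp is a stand-in; on the real hosts this happens in /home/hadoop/.ssh as the hadoop user):

```shell
# Dry run in a temp dir, not the real /home/hadoop/.ssh
tmp=$(mktemp -d)

# Non-interactive equivalent of `ssh-keygen -t rsa` (empty passphrase)
ssh-keygen -t rsa -b 2048 -N '' -f "$tmp/id_rsa" -q

# Merge public keys into authorized_keys, as the scp/cat steps do across hosts
cat "$tmp/id_rsa.pub" >> "$tmp/authorized_keys"
chmod 600 "$tmp/authorized_keys"
```

Note that authorized_keys must be mode 600 (and ~/.ssh mode 700), or sshd will ignore it.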
1.1.6 Install Python 3.6.6
$ whereis python
$ ll python*    # python is a link to python2, and python2 a link to python2.7
# Build dependencies for Python 3
#root yum install zlib-devel bzip2-devel openssl-devel ncurses-devel sqlite-devel readline-devel tk-devel gcc make
# Python 3 source
#root wget https://www.python.org/ftp/python/3.6.4/Python-3.6.4.tar.xz
# Unpack
#root xz -d Python-3.6.4.tar.xz
#root tar -xf Python-3.6.4.tar
#root cd ./Python-3.6.4
#root ./configure --prefix=/usr/local/python3
#root make && make install
# Symlink the new interpreter
#root mv /usr/bin/python /usr/bin/python.bak
#root ln -s /usr/local/python3/bin/python3.6 /usr/bin/python
#root python -V
# Fix the yum python interpreter
#root vim /usr/bin/yum
#     change "#! /usr/bin/python" to "#! /usr/bin/python2"
#root vim /usr/libexec/urlgrabber-ext-down
#     change "#! /usr/bin/python" to "#! /usr/bin/python2"
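The two shebang edits above can also be scripted instead of done in vim. A dry run on a sample file (/tmp/yum-shebang is a stand-in for /usr/bin/yum and /usr/libexec/urlgrabber-ext-down):

```shell
# Stand-in file with the original shebang
printf '#! /usr/bin/python\nimport sys\n' > /tmp/yum-shebang

# Rewrite line 1 only, and only if it is exactly the python shebang
sed -i '1s|^#! /usr/bin/python$|#! /usr/bin/python2|' /tmp/yum-shebang

head -1 /tmp/yum-shebang
```

Run the same sed as root against both real files after replacing /usr/bin/python.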
1.1.7 Install htop
#root yum install epel-release -y
#root yum install htop -y

1.1 Base environment
Item                Version          Mark
CentOS              7.3
JDK                 1.8
Python              3.6.6
Hadoop              2.7              version pairing recommended for use with Spark
Spark               2.4

Hadoop01            192.168.0.101
Hadoop02            192.168.0.102

/data/app/hadoop    owner hadoop:hadoop
/data/software
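The hostname/IP pairs listed above are typically also mapped in /etc/hosts on both nodes so that hadoop01/hadoop02 resolve without DNS (a sketch matching the addresses in the table):

```
192.168.0.101  hadoop01
192.168.0.102  hadoop02
```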