1. Run ifconfig to check the network configuration and change the IP if needed
2. Change the hostname:
sudo vim /etc/hostname
sudo vim /etc/hosts
In /etc/hosts you can also add a mapping such as 192.168.. itcast
Reboot so the new hostname takes effect:
sudo reboot
Check the hostname:
hostname
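The /etc/hosts mapping mentioned above uses the "<ip> <hostname>" line format. The sketch below demonstrates it against a temporary file so it is safe to run; the IP address is a made-up placeholder, not the one elided in these notes:

```shell
# Demonstrates the "<ip> <hostname>" entry format used in /etc/hosts.
# 192.168.1.100 is a hypothetical placeholder IP; cr-hadoop01 is the
# example hostname used throughout this guide.
HOSTS_DEMO=$(mktemp)                        # temp file stands in for /etc/hosts
echo "192.168.1.100 cr-hadoop01" >> "$HOSTS_DEMO"
grep "cr-hadoop01" "$HOSTS_DEMO"            # verify the entry was added
```

On the real machine you would append the same line to /etc/hosts itself (with sudo) using your host's actual address.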
3. Install Hadoop (extract it under /home/cr/app/):
Extract:
tar -zxvf hadoop-3.2.0.tar.gz
Go into the etc directory of the Hadoop installation:
cd etc
cd hadoop/
Run echo $JAVA_HOME to find the Java installation path.
1) Edit hadoop-env.sh:
export JAVA_HOME=/usr/lib/jdk/jdk1.8.0_211
2) Edit core-site.xml:
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://cr-hadoop01:9000</value> the hostname of this machine
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/home/cr/app/hadoop-3.2.0/data/</value> Hadoop data directory
</property>
</configuration>
3) Edit hdfs-site.xml (HDFS runtime parameters, e.g. the replication factor):
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value> replication factor is 1 in pseudo-distributed mode
</property>
</configuration>
4) Edit mapred-site.xml (Hadoop 3.x needs more configuration here than Hadoop 2.x):
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
<property>
<name>yarn.app.mapreduce.am.env</name>
<value>HADOOP_MAPRED_HOME=/home/cr/app/hadoop-3.2.0/</value>
</property>
<property>
<name>mapreduce.map.env</name>
<value>HADOOP_MAPRED_HOME=/home/cr/app/hadoop-3.2.0/</value>
</property>
<property>
<name>mapreduce.reduce.env</name>
<value>HADOOP_MAPRED_HOME=/home/cr/app/hadoop-3.2.0/</value>
</property>
<property>
<name>mapreduce.application.classpath</name>
<value>
/home/cr/app/hadoop-3.2.0/etc/hadoop,
/home/cr/app/hadoop-3.2.0/share/hadoop/common/*,
/home/cr/app/hadoop-3.2.0/share/hadoop/common/lib/*,
/home/cr/app/hadoop-3.2.0/share/hadoop/hdfs/*,
/home/cr/app/hadoop-3.2.0/share/hadoop/hdfs/lib/*,
/home/cr/app/hadoop-3.2.0/share/hadoop/mapreduce/*,
/home/cr/app/hadoop-3.2.0/share/hadoop/mapreduce/lib/*,
/home/cr/app/hadoop-3.2.0/share/hadoop/yarn/*,
/home/cr/app/hadoop-3.2.0/share/hadoop/yarn/lib/*
</value>
</property>
</configuration>
5) Edit yarn-site.xml:
<configuration>
<!-- Site specific YARN configuration properties -->
<property>
<name>yarn.resourcemanager.hostname</name>
<value>cr-hadoop01</value>
</property>
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.application.classpath</name>
<value>/home/cr/app/hadoop-3.2.0/etc/hadoop:/home/cr/app/hadoop-3.2.0/share/hadoop/common/lib/*:/home/cr/app/hadoop-3.2.0/share/hadoop/common/*:/home/cr/app/hadoop-3.2.0/share/hadoop/hdfs:/home/cr/app/hadoop-3.2.0/share/hadoop/hdfs/lib/*:/home/cr/app/hadoop-3.2.0/share/hadoop/hdfs/*:/home/cr/app/hadoop-3.2.0/share/hadoop/mapreduce/lib/*:/home/cr/app/hadoop-3.2.0/share/hadoop/mapreduce/*:/home/cr/app/hadoop-3.2.0/share/hadoop/yarn:/home/cr/app/hadoop-3.2.0/share/hadoop/yarn/lib/*:/home/cr/app/hadoop-3.2.0/share/hadoop/yarn/*</value>
</property>
</configuration>
4. Configure the environment variables:
sudo vim /etc/profile
source /etc/profile
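A minimal sketch of the entries appended to /etc/profile, assuming the JDK and Hadoop paths used earlier in these notes; adjust them to your own layout:

```shell
# Appended to /etc/profile (paths assume this guide's install locations)
export JAVA_HOME=/usr/lib/jdk/jdk1.8.0_211
export HADOOP_HOME=/home/cr/app/hadoop-3.2.0
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
```

Adding $HADOOP_HOME/bin and $HADOOP_HOME/sbin to PATH is what lets hadoop, hdfs, and the start-*.sh scripts below be run from any directory.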
Format the NameNode (first run only; in Hadoop 3.x the preferred form is hdfs namenode -format):
hadoop namenode -format
Start the daemons:
cd /home/cr/app/hadoop-3.2.0/sbin/
start-dfs.sh
start-yarn.sh
NameNode web UI port in Hadoop 2.x: 50070
NameNode web UI port in Hadoop 3.x: 9870
After starting, run jps to confirm that NameNode, DataNode, ResourceManager, and NodeManager are running.
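The version-dependent web UI port can be captured in a small sketch (cr-hadoop01 is the example hostname from this guide; set HADOOP_MAJOR to your installed major version):

```shell
# Pick the NameNode web UI port by Hadoop major version:
# 50070 for Hadoop 2.x, 9870 for Hadoop 3.x.
HADOOP_MAJOR=3
if [ "$HADOOP_MAJOR" -ge 3 ]; then PORT=9870; else PORT=50070; fi
echo "http://cr-hadoop01:$PORT/"   # open this URL in a browser to check the NameNode
```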