1. Hadoop Installation and Usage
- Linux distribution: CentOS 7, 64-bit
- A host machine with at least 16 GB of RAM is recommended
- Start three virtual machines
Host \ Hadoop role | NameNode | SecondaryNameNode | DataNode |
---|---|---|---|
node01 | * | | * |
node02 | | * | * |
node03 | | | * |
1.1 Prepare the Installation Environment
```shell
[root@node01 ~]# tar -zxvf hadoop-3.1.2.tar.gz
[root@node01 ~]# mv hadoop-3.1.2 /opt/yjx/
[root@node01 ~]# cd /opt/yjx/hadoop-3.1.2/etc/hadoop/
```
1.2 Modify the Cluster Environment
```shell
[root@node01 hadoop]# vim hadoop-env.sh
```

Append at the end of the file:

```shell
export JAVA_HOME=/usr/java/jdk1.8.0_231-amd64
export HDFS_NAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
```
1.3 Modify the Configuration Files
```shell
[root@node01 hadoop]# vim core-site.xml
```

```xml
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://node01:9000</value>
</property>
<property>
    <name>hadoop.tmp.dir</name>
    <value>/var/yjx/hadoop/full</value>
</property>
```
```shell
[root@node01 hadoop]# vim hdfs-site.xml
```

```xml
<property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>node02:50090</value>
</property>
<property>
    <name>dfs.namenode.secondary.https-address</name>
    <value>node02:50091</value>
</property>
<property>
    <name>dfs.replication</name>
    <value>2</value>
</property>
```
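Note that in `core-site.xml` and `hdfs-site.xml` the `<property>` elements must sit inside the root `<configuration>` element that the stock files already contain; pasting them outside it produces malformed XML and the daemons will refuse to start. As a sketch, the complete `core-site.xml` for this setup would look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://node01:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/var/yjx/hadoop/full</value>
    </property>
</configuration>
```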
```shell
[root@node01 hadoop]# vim workers
```

One worker hostname per line:

```
node01
node02
node03
```
1.4 Distribute the Software
Distribute the configured software to the other hosts:

```shell
[root@node01 ~]# cd /opt/yjx/
[root@node01 yjx]# scp -r hadoop-3.1.2 root@node02:`pwd`
[root@node01 yjx]# scp -r hadoop-3.1.2 root@node03:`pwd`
```
1.5 Modify the Environment Variables
```shell
[root@node01 hadoop]# vim /etc/profile
```

```shell
export HADOOP_HOME=/opt/yjx/hadoop-3.1.2
export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH
```
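To sanity-check the export before relying on it, you can inspect how the `PATH` expands; this sketch only exercises the variable expansion from the two lines above and needs no Hadoop installation:

```shell
# The exports from /etc/profile (values taken from this guide)
export HADOOP_HOME=/opt/yjx/hadoop-3.1.2
export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH

# The first two PATH entries should now be Hadoop's bin and sbin
echo "$PATH" | cut -d: -f1-2
# → /opt/yjx/hadoop-3.1.2/bin:/opt/yjx/hadoop-3.1.2/sbin
```

Putting `$HADOOP_HOME/bin` and `$HADOOP_HOME/sbin` in front of the existing `PATH` is what makes `hdfs`, `start-dfs.sh`, and `stop-dfs.sh` resolvable from any directory.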
Copy the environment variables to the other hosts:

```shell
[root@node01 yjx]# scp /etc/profile root@node02:/etc/profile
[root@node01 yjx]# scp /etc/profile root@node03:/etc/profile
```
Reload the environment variables on all three servers:

```shell
# Run on all three nodes (node01, node02, node03)
source /etc/profile
```
1.6 Format the NameNode
```shell
[root@node01 yjx]# hdfs namenode -format
[root@node01 yjx]# start-dfs.sh
```
```
Starting namenodes on [node01]
Last login: Fri Oct 30 21:32:11 CST 2020 from 192.168.58.1 on pts/0
node01: Warning: Permanently added 'node01,192.168.58.101' (ECDSA) to the list of known hosts.
Starting datanodes
Last login: Fri Oct 30 22:06:12 CST 2020 on pts/0
node03: Warning: Permanently added 'node03,192.168.58.103' (ECDSA) to the list of known hosts.
node02: Warning: Permanently added 'node02,192.168.58.102' (ECDSA) to the list of known hosts.
node01: Warning: Permanently added 'node01,192.168.58.101' (ECDSA) to the list of known hosts.
node03: WARNING: /opt/yjx/hadoop-3.1.2/logs does not exist. Creating.
node02: WARNING: /opt/yjx/hadoop-3.1.2/logs does not exist. Creating.
Starting secondary namenodes [node02]
Last login: Fri Oct 30 22:06:14 CST 2020 on pts/0
node02: Warning: Permanently added 'node02,192.168.58.102' (ECDSA) to the list of known hosts.
```
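A quick way to confirm the daemons actually came up is `jps` on each node: node01 should show NameNode and DataNode, node02 SecondaryNameNode and DataNode, node03 only DataNode. The sketch below checks `jps`-style output with `grep`; the here-doc and its PIDs are illustrative stand-ins, and on a real node you would use `jps_output=$(jps)` instead:

```shell
# Simulated `jps` output for node01 (PIDs are made up);
# on a real node, replace the here-doc with: jps_output=$(jps)
jps_output=$(cat <<'EOF'
1201 NameNode
1389 DataNode
1666 Jps
EOF
)

# Daemons expected on node01 in this guide's layout
for daemon in NameNode DataNode; do
  if echo "$jps_output" | grep -qw "$daemon"; then
    echo "$daemon is running"
  else
    echo "$daemon is MISSING"
  fi
done
# → NameNode is running
# → DataNode is running
```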
1.7 Test the Cluster
- Open the HDFS web UI: http://node01:9870

```shell
[root@node01 ~]# hdfs dfs -mkdir -p /yjx
[root@node01 ~]# hdfs dfs -put zookeeper-3.4.5.tar.gz /yjx/
# Upload again with a 1 MB block size (remove the existing copy first,
# e.g. hdfs dfs -rm /yjx/zookeeper-3.4.5.tar.gz, or this put will fail)
[root@node01 ~]# hdfs dfs -D dfs.blocksize=1048576 -put zookeeper-3.4.5.tar.gz /yjx/
```

- Hadoop shell command references:
  - http://hadoop.apache.org/docs/r1.0.4/cn/hdfs_shell.html
  - https://www.cnblogs.com/duanxz/p/3799467.html
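`-D dfs.blocksize=1048576` uploads the file with a 1 MB block size instead of the 128 MB default, so even a small tarball splits into many blocks, which makes block placement easy to observe in the web UI. Assuming a roughly 16 MB archive (an illustrative figure, not a measured size), the block count works out as:

```shell
# HDFS stores a file as ceil(file_size / block_size) blocks
file_size=$((16 * 1024 * 1024))   # assumed ~16 MB archive
block_size=1048576                # 1 MB, as passed via -D dfs.blocksize
blocks=$(( (file_size + block_size - 1) / block_size ))
echo "$blocks blocks"
# → 16 blocks
```

With `dfs.replication` set to 2 above, each of those blocks is additionally stored on two DataNodes.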
1.8 Shut Down the Cluster
```shell
[root@node01 ~]# stop-dfs.sh
```

```
Stopping namenodes on [node01]
Last login: Fri Oct 30 22:06:20 CST 2020 on pts/0
node01: Warning: Permanently added 'node01,192.168.58.101' (ECDSA) to the list of known hosts.
Stopping datanodes
Last login: Fri Oct 30 22:16:34 CST 2020 on pts/0
node03: Warning: Permanently added 'node03,192.168.58.103' (ECDSA) to the list of known hosts.
node02: Warning: Permanently added 'node02,192.168.58.102' (ECDSA) to the list of known hosts.
node01: Warning: Permanently added 'node01,192.168.58.101' (ECDSA) to the list of known hosts.
Stopping secondary namenodes [node02]
Last login: Fri Oct 30 22:16:35 CST 2020 on pts/0
node02: Warning: Permanently added 'node02,192.168.58.102' (ECDSA) to the list of known hosts.
```
Shut the machines down and take a VM snapshot:

```shell
# Run on all three nodes (node01, node02, node03)
shutdown -h now
```