Base environment
Three CentOS 7.7 machines
Build requirements for Hadoop
Requirements:
* Unix System
* JDK 1.8
* Maven 3.3 or later
* ProtocolBuffer 2.5.0
* CMake 3.1 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* Cyrus SASL devel (if compiling native code)
* One of the compilers that support thread_local storage: GCC 4.8.1 or later, Visual Studio,
Clang (community version), Clang (version for iOS 9 and later) (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Doxygen (if compiling libhdfspp and generating the documents)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* python (for releasedocs)
* bats (for shell code testing)
* Node.js / bower / Ember-cli (for YARN UI v2 building)
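Before starting, the requirement list above can be checked quickly. A minimal sketch that only confirms the tools are on PATH (version numbers still need manual review):

```shell
# Quick presence check for the build prerequisites listed above.
# Only checks that each tool is on PATH; versions must still be eyeballed.
missing=0
for tool in java mvn protoc cmake gcc; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
    missing=$((missing + 1))
  fi
done
echo "missing tools: $missing"
```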
Edit /etc/hosts
192.168.1.26 master01 v26
192.168.1.43 master02 v43
192.168.1.187 node01 v187
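The three entries above must be appended to /etc/hosts on every node. A sketch that stages them in a temp file first (the IPs and names are the ones from this document; the actual append needs root):

```shell
# Stage the cluster host entries, then append them to /etc/hosts on each node.
hosts_frag=$(mktemp)
cat > "$hosts_frag" <<'EOF'
192.168.1.26 master01 v26
192.168.1.43 master02 v43
192.168.1.187 node01 v187
EOF
# In practice (as root, on every node):  cat "$hosts_frag" >> /etc/hosts
entries=$(wc -l < "$hosts_frag")
echo "staged $entries host entries"
```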
Upgrade the kernel to 4.4
yum update -y
yum install -y gcc gcc-c++
Note: on CentOS 7, `yum update` stays within the stock 3.10 kernel line; getting a 4.4 kernel requires an LTS kernel from a third-party repository such as ELRepo (kernel-lt).
Download JDK 1.8
mv jdk1.8.0_231 /usr/local/
alternatives --install /usr/bin/java java /usr/local/jdk1.8.0_231/bin/java 18000
update-alternatives --config java
Install Maven
tar -zxvf apache-maven-3.6.1-bin.tar.gz
mv apache-maven-3.6.1 /opt/
ln -s /opt/apache-maven-3.6.1 /opt/apache-maven   # create a symlink
01 Update environment variables
vi /etc/profile
export JAVA_HOME=/usr/local/jdk1.8.0_231
export MAVEN_HOME=/opt/apache-maven
export PATH=.:$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin
source /etc/profile
mvn -version   # verify the installation
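The exports can be sanity-checked without logging out. A minimal sketch using a throwaway profile fragment; the paths assume the /opt/apache-maven symlink created above:

```shell
# Write the exports to a temp fragment, source it, and confirm PATH picked up
# the Maven bin directory. Mirrors what /etc/profile will do on login.
profile_frag=$(mktemp)
cat > "$profile_frag" <<'EOF'
export JAVA_HOME=/usr/local/jdk1.8.0_231
export MAVEN_HOME=/opt/apache-maven
export PATH=.:$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin
EOF
. "$profile_frag"
case ":$PATH:" in
  *":$MAVEN_HOME/bin:"*) path_ok=yes ;;
  *) path_ok=no ;;
esac
echo "maven bin on PATH: $path_ok"
```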
Build Hadoop
tar -zxvf hadoop-xxx-src.tar.gz
cd hadoop-xxx-src
mvn package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true
(Note: the compilation takes quite a while.)
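Once the build finishes, the binary tarball can be located. A sketch, assuming the standard hadoop-dist/target output directory of the Hadoop source tree; it degrades gracefully when run outside a build tree:

```shell
# After a successful `mvn package -Pdist -Dtar`, the distribution tarball
# lands under hadoop-dist/target/ inside the source tree.
dist_dir=hadoop-dist/target
if [ -d "$dist_dir" ]; then
  tarball=$(find "$dist_dir" -maxdepth 1 -name 'hadoop-*.tar.gz' | head -n 1)
  echo "built tarball: $tarball"
else
  tarball=""
  echo "no build tree here; run the mvn package step first"
fi
```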
02 Edit the configuration files
vi core-site.xml
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://master01:9000</value>
    </property>
</configuration>
vi hdfs-site.xml
<configuration>
    <property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>master02:50090</value>
    </property>
    <property>
        <name>dfs.replication</name>
        <value>2</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:/usr/local/data/hadoop/hdfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:/usr/local/data/hadoop/hdfs/data</value>
    </property>
</configuration>
vi yarn-site.xml
<configuration>
    <!-- Site specific YARN configuration properties -->
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.log-aggregation-enable</name>
        <value>true</value>
    </property>
    <property>
        <name>yarn.nodemanager.localizer.address</name>
        <value>0.0.0.0:8140</value>
    </property>
    <property>
        <name>yarn.log-aggregation.retain-seconds</name>
        <value>106800</value>
    </property>
    <property>
        <name>yarn.log.server.url</name>
        <value>http://master02:19888/jobhistory/logs</value>
    </property>
</configuration>
vi mapred-site.xml
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.address</name>
        <value>master02:10020</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.webapp.address</name>
        <value>master02:19888</value>
    </property>
</configuration>
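With the configuration files in place, the remaining steps are a one-time HDFS format followed by starting the daemons. A sketch, assuming the built distribution was unpacked to /opt/hadoop (a hypothetical path; adjust HADOOP_HOME to your actual install):

```shell
# Format HDFS once on the NameNode, then start the DFS and YARN daemons.
# HADOOP_HOME=/opt/hadoop is an assumed install location, not from this doc.
# WARNING: `namenode -format` wipes any existing NameNode metadata.
HADOOP_HOME=${HADOOP_HOME:-/opt/hadoop}
if [ -x "$HADOOP_HOME/bin/hdfs" ]; then
  "$HADOOP_HOME/bin/hdfs" namenode -format
  "$HADOOP_HOME/sbin/start-dfs.sh"
  "$HADOOP_HOME/sbin/start-yarn.sh"
  started=yes
else
  echo "no hadoop install at $HADOOP_HOME; set HADOOP_HOME first"
  started=no
fi
echo "started: $started"
```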
Summary: this installation guide is for reference only and works best read alongside the official documentation. This was my second time setting up Hadoop; the more clusters you build, the easier it gets.