Modify the hosts configuration
1. Add the following line to /etc/hosts:
127.0.0.1 master
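The step above can be done idempotently so repeated runs do not duplicate the entry. A minimal sketch; HOSTS points at a scratch file for the demo, set HOSTS=/etc/hosts (as root) to apply it for real:

```shell
# Scratch file for the demo; use HOSTS=/etc/hosts on the real machine.
HOSTS=$(mktemp)
# Append only if the entry is not already present.
grep -q '^127\.0\.0\.1[[:space:]]*master' "$HOSTS" || echo '127.0.0.1 master' >> "$HOSTS"
# A second run adds nothing, since the entry already exists.
grep -q '^127\.0\.0\.1[[:space:]]*master' "$HOSTS" || echo '127.0.0.1 master' >> "$HOSTS"
cat "$HOSTS"
```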
2. Edit core-site.xml:
<property>
<name>fs.defaultFS</name>
<value>hdfs://master:9000</value> <!-- the hostname defined in /etc/hosts -->
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/mytool/hadoop/tmp</value>
</property>
3. Edit hdfs-site.xml:
<!--
<property>
<name>dfs.namenode.secondary.http-address</name>
<value>master:9001</value>
</property>
-->
<property>
<name>dfs.namenode.name.dir</name>
<value>/mytool/hadoop/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>/mytool/hadoop/data</value>
</property>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.http.address</name>
<value>0.0.0.0:50070</value>
</property>
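The directories named in core-site.xml and hdfs-site.xml must exist before the first start. A sketch; BASE is a scratch directory for the demo, on the real machine BASE=/mytool/hadoop:

```shell
# Scratch base for the demo; BASE=/mytool/hadoop on the real host.
BASE=$(mktemp -d)
# hadoop.tmp.dir, dfs.namenode.name.dir and dfs.datanode.data.dir
mkdir -p "$BASE/tmp" "$BASE/namenode" "$BASE/data"
# Then format the NameNode once (real cluster only):
#   hdfs namenode -format
ls "$BASE"
```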
4. Edit yarn-site.xml, mapred-site.xml, and hadoop-env.sh as needed.
5. NoClassDefFoundError: javax/activation/DataSource (the javax.activation module was removed from the JDK in Java 11, so add the jar manually):
cd ${HADOOP_HOME}/share/hadoop/yarn/lib
wget https://repo1.maven.org/maven2/javax/activation/activation/1.1.1/activation-1.1.1.jar
6. org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool service to master/127.0.0.1:9000. Exiting
# For an environment that is already set up:
Find the VERSION file in the current directory under the NameNode's storage directory,
and copy its clusterID value into the DataNode's VERSION file.
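The clusterID sync can be scripted. A sketch using scratch directories and a made-up CID value; per the configs above, the real files would be /mytool/hadoop/namenode/current/VERSION and /mytool/hadoop/data/current/VERSION:

```shell
# Scratch stand-ins for the NameNode and DataNode storage directories.
NN=$(mktemp -d); DN=$(mktemp -d)
mkdir -p "$NN/current" "$DN/current"
echo 'clusterID=CID-demo-1234'  > "$NN/current/VERSION"   # demo value
echo 'clusterID=CID-stale-0000' > "$DN/current/VERSION"   # mismatched
# Pull the NameNode's clusterID line and overwrite the DataNode's.
CID=$(grep '^clusterID=' "$NN/current/VERSION")
sed -i "s/^clusterID=.*/$CID/" "$DN/current/VERSION"
cat "$DN/current/VERSION"
```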
7. Hostname/address issues in the configuration files
If you write master:port, the service is only reachable from the local machine, because master resolves to 127.0.0.1 in /etc/hosts.
For external access, write 0.0.0.0:port instead.
8. Starting Hadoop 3.2 as the root user reportedly causes errors (unverified).
My /etc/profile configuration:
# java environment
export JAVA_HOME=/mytool/java/jdk-15.0.1
# rt.jar, dt.jar and tools.jar no longer exist in JDK 9+, so "." suffices
export CLASSPATH=.
export PATH=$PATH:${JAVA_HOME}/bin
# hadoop environment
export HADOOP_HOME=/mytool/hadoop/hadoop-3.2.1/
export PATH=$PATH:$HADOOP_HOME/bin
# add user
export HDFS_NAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root
# hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS=-Djava.library.path=$HADOOP_HOME/lib
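A quick sanity check that the profile took effect. The values here mirror the /etc/profile above (set inline so the snippet is self-contained):

```shell
# Mirror the values from /etc/profile for a self-contained check.
HADOOP_HOME=/mytool/hadoop/hadoop-3.2.1
PATH=$PATH:$HADOOP_HOME/bin
# Confirm the Hadoop bin directory made it onto PATH.
echo "$PATH" | tr ':' '\n' | grep -q 'hadoop-3.2.1/bin' && echo 'PATH OK'
# On the real host:  source /etc/profile && hadoop version
```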