1. JDK Installation
Create the target directory
sudo mkdir /usr/lib/jvm
Extract the downloaded JDK tarball
sudo tar -zxvf jdk-8u291-linux-x64.tar.gz -C /usr/lib/jvm/
Set the environment variables
sudo vim /etc/profile
Append the following:
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_291
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH
Apply the changes
source /etc/profile
Verify that the JDK is installed correctly
java -version
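The PATH export above works by prepending the JDK's bin directory, so this JDK takes precedence over any system-provided Java. A minimal sketch (reusing the JAVA_HOME value from the profile edits above) that shows the lookup order:

```shell
#!/bin/sh
# Sketch: after the exports above, the JDK's bin directory is the first
# PATH entry, so `java` resolves to this JDK before any system default.
JAVA_HOME=/usr/lib/jvm/jdk1.8.0_291
PATH=${JAVA_HOME}/bin:$PATH
echo "$PATH" | cut -d: -f1   # first entry: /usr/lib/jvm/jdk1.8.0_291/bin
```

If `java -version` still reports an older version, checking the first PATH entry this way is a quick way to confirm whether the profile changes actually took effect in the current shell.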
2. Hadoop Installation (single-node, pseudo-distributed deployment)
After downloading the Hadoop tarball
sudo mkdir /usr/local/hadoop
sudo tar -zxvf hadoop-3.3.0.tar.gz -C /usr/local/hadoop/
Set the environment variables
sudo vi /etc/profile
Append the following:
export HADOOP_HOME=/usr/local/hadoop/hadoop-3.3.0
export PATH=${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin:${PATH}
# If starting DFS produces this error:
# util.NativeCodeLoader: Unable to load native-hadoop library for your platform...
# using builtin-java classes where applicable
# then also add this line:
export JAVA_LIBRARY_PATH=${HADOOP_HOME}/lib/native
Apply the changes
source /etc/profile
In hadoop-env.sh, additionally add:
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_291
export HADOOP_HOME=/usr/local/hadoop/hadoop-3.3.0
For editing the Hadoop configuration files, refer to https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html#Pseudo-Distributed_Operation
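As a quick reference, the pseudo-distributed setup in that guide boils down to two small overrides in Hadoop's etc/hadoop directory (values taken from the official guide; adjust if your layout differs):

```xml
<!-- etc/hadoop/core-site.xml -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- etc/hadoop/hdfs-site.xml -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```

The guide then formats the filesystem once with `bin/hdfs namenode -format` and starts HDFS with `sbin/start-dfs.sh`; note that passphraseless SSH to localhost must be set up first, and `start-dfs.sh` is where the NativeCodeLoader warning mentioned above would appear.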