Hadoop 2.6.5 Single-Node Installation
Installing the JDK
Configure the JDK environment variables:
[root@spark1 soft]# vim /etc/profile
# JDK environment variables
#export JAVA_HOME=/application/jdk1.7.0_79
export JAVA_HOME=/application/jdk1.8.0_172
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib/rt.jar
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin
Reload the profile so the variables take effect:
[root@spark1 soft]# source /etc/profile
[root@spark1 soft]# java -version
openjdk version "1.8.0_121"
OpenJDK Runtime Environment (build 1.8.0_121-b13)
OpenJDK 64-Bit Server VM (build 25.121-b13, mixed mode)
[root@spark1 soft]#
Configure passwordless SSH login
$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
Verify SSH: # ssh localhost
You should be logged in without being prompted for a password.
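If `ssh localhost` still prompts for a password, the most common cause is permissions on `~/.ssh` that are too open, which makes sshd ignore the key files. Also note that OpenSSH 7.0 and later disable DSA keys by default, so on newer systems generate an RSA key (`ssh-keygen -t rsa`) instead. A minimal permission fix:

```shell
# sshd silently rejects keys when the directory or key file is
# readable/writable by others, so tighten both and retry the login.
mkdir -p ~/.ssh                   # no-op if the directory already exists
touch ~/.ssh/authorized_keys      # no-op if the file already exists
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
```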
Installing Hadoop
Download
Download links:
https://www.apache.org/dyn/closer.cgi/hadoop/common/
https://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common/hadoop-2.6.5/hadoop-2.6.5.tar.gz
Extract and install
[root@spark1 soft]# tar -zxvf hadoop-2.6.5.tar.gz -C /application/
Create the directories Hadoop needs
Under the /root/hadoop/ directory, create tmp, hdfs/name, and hdfs/data by running:
#mkdir /root/hadoop/tmp
#mkdir /root/hadoop/hdfs
#mkdir /root/hadoop/hdfs/data
#mkdir /root/hadoop/hdfs/name
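The four commands above can be collapsed into one. `mkdir -p` creates any missing parent directories and is a no-op for ones that already exist, so the step is safe to re-run:

```shell
# tmp, hdfs/name and hdfs/data are typically wired up later as
# hadoop.tmp.dir, dfs.namenode.name.dir and dfs.datanode.data.dir.
mkdir -p /root/hadoop/tmp /root/hadoop/hdfs/name /root/hadoop/hdfs/data
```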
Set the Hadoop environment variables
Append the following to /etc/profile:
# Hadoop environment variables
export HADOOP_HOME=/application/hadoop-2.6.5
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
[root@spark1 soft]# source /etc/profile
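As a quick sanity check (assuming the tarball was unpacked to /application/hadoop-2.6.5 as above), the hadoop launcher should now resolve from the new PATH entry:

```shell
# With $HADOOP_HOME/bin on PATH, `hadoop version` prints the release
# line ("Hadoop 2.6.5"); if the command is not found, the profile was
# not sourced in this shell.
if command -v hadoop >/dev/null 2>&1; then
    hadoop version
else
    echo "hadoop not on PATH - re-run: source /etc/profile"
fi
```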
Configuring Hadoop
Go to the $HADOOP_HOME/etc/hadoop directory and configure hadoop-env.sh and the related files. The files involved
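For hadoop-env.sh specifically, the shipped file reads JAVA_HOME from the environment, which can fail when the daemons are launched over ssh, so the path is usually hard-coded. A minimal sketch, assuming the JDK location configured in /etc/profile earlier:

```shell
# In $HADOOP_HOME/etc/hadoop/hadoop-env.sh, replace the default
# "export JAVA_HOME=${JAVA_HOME}" with the explicit JDK path:
export JAVA_HOME=/application/jdk1.8.0_172
```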