CentOS 7.0 setup
IP address and hostname assignment
192.168.56.101 hadoop101 #master
192.168.56.102 hadoop102 #slave
192.168.56.103 hadoop103 #slave
- JDK already installed
Passwordless SSH login setup
- Generate the key pair:
ssh-keygen -t rsa -P ""
- Append the public key to authorized_keys:
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
- Copy the public key to the other nodes, then test the remote login:
ssh-copy-id -i ~/.ssh/id_rsa.pub -p 22 root@192.168.56.101
ssh -p 22 root@192.168.56.102
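The key-generation and copy steps above can be combined into one idempotent snippet (a sketch: it skips generation when a key already exists and sets the permission bits sshd requires before accepting key logins):

```shell
# Generate an RSA key pair with an empty passphrase if none exists,
# then authorize it for login. The chmod calls matter: sshd rejects
# keys whose files are group/world-accessible.
mkdir -p ~/.ssh && chmod 700 ~/.ssh
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa -q
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```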
Hadoop configuration
Config file directory: /opt/hadoop/etc/hadoop
- hadoop-env.sh
# The java implementation to use.
export JAVA_HOME=/opt/java8  # JAVA_HOME environment variable
- core-site.xml
<configuration>
<property>
<name>fs.defaultFS</name> <!-- name of the default file system -->
<value>hdfs://192.168.56.101:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name> <!-- base for temporary directories -->
<value>/opt/hadoop/tmp</value>
</property>
<property>
<name>hadoop.proxyuser.root.hosts</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.root.groups</name>
<value>*</value>
</property>
</configuration>
- hdfs-site.xml
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.secondary.http-address</name>
<value>hadoop101:50090</value>
</property>
</configuration>
- mapred-site.xml
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
<property>
<name>mapreduce.jobhistory.address</name>
<value>hadoop101:10020</value>
</property>
<property>
<name>mapreduce.jobhistory.webapp.address</name>
<value>hadoop101:19888</value>
</property>
</configuration>
- yarn-site.xml
<configuration>
<!-- How reducers fetch data -->
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
<!-- Address of the YARN ResourceManager -->
<property>
<name>yarn.resourcemanager.hostname</name>
<value>hadoop101</value>
</property>
<!-- Enable log aggregation -->
<property>
<name>yarn.log-aggregation-enable</name>
<value>true</value>
</property>
<!-- Keep aggregated logs for 7 days (604800 seconds) -->
<property>
<name>yarn.log-aggregation.retain-seconds</name>
<value>604800</value>
</property>
</configuration>
- vi ./slaves (one worker hostname per line; the file is named workers in Hadoop 3.x)
hadoop101
hadoop102
hadoop103
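The worker list above can also be written in one step instead of editing by hand (run from /opt/hadoop/etc/hadoop; the snippet writes to the current directory):

```shell
# Write the worker node list for this cluster, one hostname per line.
cat > slaves <<'EOF'
hadoop101
hadoop102
hadoop103
EOF
```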
Hadoop environment variables
- vi /etc/profile
export HADOOP_HOME=/opt/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
- source /etc/profile
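A quick way to confirm the PATH additions took effect after sourcing the profile (a standalone check: it re-exports the same two entries so it can run on its own):

```shell
# Re-create the PATH additions from /etc/profile and verify that
# hadoop's bin directory is actually on the search path.
export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
case ":$PATH:" in
  *":$HADOOP_HOME/bin:"*) echo "PATH ok" ;;
  *) echo "PATH missing $HADOOP_HOME/bin" ;;
esac
```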
Starting Hadoop
- Format HDFS (first start only; reformatting generates a new cluster ID and orphans existing DataNodes):
hdfs namenode -format
- Start Hadoop:
start-all.sh
- Verify with jps: hadoop101 should show NameNode, SecondaryNameNode, DataNode, ResourceManager and NodeManager (it acts as both master and worker here).
- Access Hadoop:
http://192.168.56.101:50070 HDFS web UI
http://192.168.56.101:8088 YARN management UI
Hadoop cluster
- Clone the virtual machine, then regenerate the SSH keys on each clone
- Generate a new key pair:
ssh-keygen -t rsa -P ""
- Overwrite the old authorized_keys (the single > replaces the file, unlike the >> append used earlier):
cat ~/.ssh/id_rsa.pub > ~/.ssh/authorized_keys
- Update the IP addresses in core-site.xml and hdfs-site.xml
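After editing the configs on the master, the same files can be pushed to each clone in one loop (a sketch assuming passwordless root SSH from the earlier steps; the echo prints the commands as a dry run, drop it to actually copy):

```shell
# Dry run: show the rsync command that would sync the Hadoop config
# directory from the master to each worker node.
for host in hadoop102 hadoop103; do
  echo rsync -a /opt/hadoop/etc/hadoop/ root@"$host":/opt/hadoop/etc/hadoop/
done
```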