Single-node installation and configuration of Hadoop 3 on CentOS

Reference: www.cnblogs.com/forbeat/p/8179877.html


Change the hostname:
    1. In /etc/hosts, add: 192.168.56.102 hadoopmaster
    2. In /etc/sysconfig/network, set: HOSTNAME=hadoopmaster

Reboot CentOS: reboot
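On CentOS 7 and later, the hostname can also be set with hostnamectl instead of editing /etc/sysconfig/network (the /etc/hosts entry from step 1 is still needed); a minimal sketch:

    hostnamectl set-hostname hadoopmaster    # sets the static hostname immediately, no reboot required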

Create a user: useradd hadoop
Set the user's password: passwd hadoop, then enter a new password at the prompts

Create a directory: mkdir /home/hadoop/program
Change ownership: chown -R hadoop:hadoop /home/hadoop

Configure passwordless SSH login:
ssh-keygen -t rsa
cd ~/.ssh
cp id_rsa.pub authorized_keys
Test passwordless login: ssh localhost — if no password is requested, it works
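If ssh localhost still prompts for a password, the usual cause is overly permissive file modes on the key files; a quick fix, run as the hadoop user:

    chmod 700 ~/.ssh                    # sshd ignores authorized_keys if the directory is group/world writable
    chmod 600 ~/.ssh/authorized_keys    # the key file must be readable only by its owner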



1. JDK installation and configuration:

Download: wget http://download.oracle.com/otn-pub/java/jdk/8u151-b12/e758a0de34e24606bca991d704f6dcbf/jdk-8u151-linux-x64.tar.gz

Extract into /home/hadoop/program: tar -zxvf jdk-8u151-linux-x64.tar.gz

Configure environment variables (as root): append to the end of /etc/profile:
#set JAVA_HOME
export JAVA_HOME=/home/hadoop/program/jdk1.8.0_151   ## replace with the directory you extracted the JDK into
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH
Apply the changes: source /etc/profile
Verify: java -version

2. Hadoop installation and configuration:

Download: wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-3.0.0/hadoop-3.0.0.tar.gz
Extract into /home/hadoop/program: tar -zxvf hadoop-3.0.0.tar.gz
Configure environment variables: add to /etc/profile:
# set HADOOP_HOME
export HADOOP_HOME=/home/hadoop/program/hadoop-3.0.0
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib"
Apply the changes: source /etc/profile
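A quick sanity check that the new variables are in effect (assuming the paths above):

    hadoop version    # should report Hadoop 3.0.0
    which hadoop      # should resolve to /home/hadoop/program/hadoop-3.0.0/bin/hadoop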
Configure the Hadoop configuration files:
Go to /home/hadoop/program/hadoop-3.0.0/etc/hadoop (the configuration files live here, not under sbin)

In hadoop-env.sh, set JAVA_HOME:

    export JAVA_HOME=/home/hadoop/program/jdk1.8.0_151   ## replace with the directory you extracted the JDK into
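One way to append that line without opening an editor (same JDK path assumption as above):

    echo 'export JAVA_HOME=/home/hadoop/program/jdk1.8.0_151' >> /home/hadoop/program/hadoop-3.0.0/etc/hadoop/hadoop-env.sh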

In core-site.xml, configure the HDFS address and port and the temporary file directory:
<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://hadoopmaster:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/home/hadoop/hadoop/tmp</value>
    </property>
</configuration>
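The hadoop.tmp.dir directory is not necessarily created for you; creating it up front as the hadoop user avoids permission problems later (same path as configured above):

    mkdir -p /home/hadoop/hadoop/tmp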
Edit hdfs-site.xml to configure the replication factor and the NameNode/DataNode storage paths:
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/home/hadoop/hadoop/hdfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/home/hadoop/hadoop/hdfs/data</value>
    </property>
</configuration>
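As with the tmp directory, pre-creating the directories referenced above avoids surprises (same paths assumed):

    mkdir -p /home/hadoop/hadoop/hdfs/name
    mkdir -p /home/hadoop/hadoop/hdfs/data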
Edit mapred-site.xml to make MapReduce jobs run on the YARN framework. Compared with earlier versions, the mapreduce.application.classpath property below is new; if it is not set, MapReduce jobs fail with:
Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapreduce.application.classpath</name>
        <value>
            /home/hadoop/program/hadoop-3.0.0/etc/hadoop,
            /home/hadoop/program/hadoop-3.0.0/share/hadoop/common/*,
            /home/hadoop/program/hadoop-3.0.0/share/hadoop/common/lib/*,
            /home/hadoop/program/hadoop-3.0.0/share/hadoop/hdfs/*,
            /home/hadoop/program/hadoop-3.0.0/share/hadoop/hdfs/lib/*,
            /home/hadoop/program/hadoop-3.0.0/share/hadoop/mapreduce/*,
            /home/hadoop/program/hadoop-3.0.0/share/hadoop/mapreduce/lib/*,
            /home/hadoop/program/hadoop-3.0.0/share/hadoop/yarn/*,
            /home/hadoop/program/hadoop-3.0.0/share/hadoop/yarn/lib/*
        </value>
    </property>
</configuration>
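If you are unsure which directories to list, the hadoop classpath command prints the full classpath of the current installation, and its output can be pasted into the property above (a convenience, not a requirement):

    hadoop classpath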

Edit yarn-site.xml:
<configuration>

<!-- Site specific YARN configuration properties -->
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>hadoopmaster</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>

Add the hostname to the workers file: hadoopmaster
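One way to do that, assuming the installation directory used throughout this guide:

    echo hadoopmaster > /home/hadoop/program/hadoop-3.0.0/etc/hadoop/workers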


Verify Hadoop:
1. Format the NameNode: hadoop namenode -format
2. Start Hadoop: start-all.sh
3. Run jps; it should show NameNode, DataNode, SecondaryNameNode, NodeManager and ResourceManager
4. If any daemon is missing, check its log under /home/hadoop/program/hadoop-3.0.0/logs
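For an end-to-end smoke test, the examples jar shipped with the release can run a small MapReduce job on YARN (the jar name assumes the 3.0.0 tarball installed above). The NameNode web UI should also be reachable at http://hadoopmaster:9870 and the ResourceManager UI at http://hadoopmaster:8088.

    hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.0.0.jar pi 2 5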
