Hadoop 3.0 Single-Node Installation Guide for CentOS 7.4

Quick notes on installing Hadoop 3.0 in single-node mode on CentOS 7.4.

Downloads

Hive download:

http://mirrors.hust.edu.cn/apache/hive/stable-2/apache-hive-2.3.2-bin.tar.gz

Hadoop download:

https://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common/hadoop-3.0.0/hadoop-3.0.0.tar.gz

1. Hive install directory: /opt/hive/apache-hive-2.3.2-bin

2. Hadoop install directory: /opt/hadoop/hadoop-3.0.0

3. JDK install directory: /usr/java/jdk1.8.0_65
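A minimal sketch of fetching and unpacking the tarballs above into those directories (the mirror links may have gone stale; substitute a current Apache mirror if so):

wget https://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common/hadoop-3.0.0/hadoop-3.0.0.tar.gz
wget http://mirrors.hust.edu.cn/apache/hive/stable-2/apache-hive-2.3.2-bin.tar.gz
mkdir -p /opt/hadoop /opt/hive
tar -xzf hadoop-3.0.0.tar.gz -C /opt/hadoop            # creates /opt/hadoop/hadoop-3.0.0
tar -xzf apache-hive-2.3.2-bin.tar.gz -C /opt/hive     # creates /opt/hive/apache-hive-2.3.2-bin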

4. Configure environment variables (append the following to /etc/profile):

export JAVA_HOME=/usr/java/jdk1.8.0_65
export HADOOP_HOME=/opt/hadoop/hadoop-3.0.0
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"
export HIVE_HOME=/opt/hive/apache-hive-2.3.2-bin
export HIVE_CONF_DIR=${HIVE_HOME}/conf
export CLASSPATH=.:${JAVA_HOME}/lib:${HIVE_HOME}/lib:$CLASSPATH
export PATH=.:${JAVA_HOME}/bin:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin:${HIVE_HOME}/bin:$PATH

5. Apply the environment variables:

source /etc/profile
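A quick sanity check that the variables took effect (my addition, not in the original post):

java -version          # should report 1.8.0_65
hadoop version         # should report Hadoop 3.0.0
echo $HADOOP_CONF_DIR  # should print /opt/hadoop/hadoop-3.0.0/etc/hadoop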

6. Edit /opt/hadoop/hadoop-3.0.0/etc/hadoop/core-site.xml and add the following inside the <configuration> element:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>
<property>
  <name>hadoop.tmp.dir</name>
  <value>/opt/hadoop/tmp</value>
</property>
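Creating the temp directory up front avoids permission surprises at startup (my addition, assuming the path above):

mkdir -p /opt/hadoop/tmp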

7. Edit /opt/hadoop/hadoop-3.0.0/etc/hadoop/hdfs-site.xml and add the following inside the <configuration> element:

<property>
  <name>dfs.name.dir</name>
  <value>/opt/hadoop/hdfs/name</value>
  <description>Where the NameNode stores the HDFS namespace metadata</description>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>/opt/hadoop/hdfs/data</value>
  <description>Physical storage location of data blocks on the DataNode</description>
</property>
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
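Note that dfs.name.dir and dfs.data.dir are the deprecated (but still accepted) aliases for dfs.namenode.name.dir and dfs.datanode.data.dir in Hadoop 3. As with the temp directory, pre-creating the storage directories is worthwhile (my addition):

mkdir -p /opt/hadoop/hdfs/name /opt/hadoop/hdfs/data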

8. Set up passwordless SSH login:

ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa

cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys

chmod 0600 ~/.ssh/authorized_keys
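Recent OpenSSH releases disable DSA keys by default, so on a newer system an RSA key is the safer choice (a substitution, not from the original post); either way, verify before moving on:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
ssh localhost    # should log in without a password prompt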

9. Start and stop commands

9.1 Format the NameNode (first run only):

cd /opt/hadoop/hadoop-3.0.0

./bin/hdfs namenode -format

9.2 Start:

./sbin/start-dfs.sh

9.3 Stop:

./sbin/stop-dfs.sh
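After start-dfs.sh, a quick way to confirm the daemons are running (my addition):

jps                               # expect NameNode, DataNode, and SecondaryNameNode in the list
hdfs dfs -mkdir -p /smoke-test    # simple HDFS write...
hdfs dfs -ls /                    # ...and read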

Troubleshooting

Running start-dfs.sh as root aborts with:

Starting namenodes on [localhost]

ERROR: Attempting to operate on hdfs namenode as root

ERROR: but there is no HDFS_NAMENODE_USER defined. Aborting operation.

Starting datanodes

ERROR: Attempting to operate on hdfs datanode as root

ERROR: but there is no HDFS_DATANODE_USER defined. Aborting operation.

Starting secondary namenodes [bogon]

ERROR: Attempting to operate on hdfs secondarynamenode as root

ERROR: but there is no HDFS_SECONDARYNAMENODE_USER defined. Aborting operation.

Fix 1

$ vim sbin/start-dfs.sh

$ vim sbin/stop-dfs.sh

Add the following near the top of both scripts:

HDFS_DATANODE_USER=root
HADOOP_SECURE_DN_USER=hdfs
HDFS_NAMENODE_USER=root
HDFS_SECONDARYNAMENODE_USER=root

Fix 2

$ vim sbin/start-yarn.sh

$ vim sbin/stop-yarn.sh

Add the following near the top of both scripts:

YARN_RESOURCEMANAGER_USER=root
HADOOP_SECURE_DN_USER=yarn
YARN_NODEMANAGER_USER=root
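Instead of patching the sbin scripts, the same variables can be exported once in etc/hadoop/hadoop-env.sh, which Hadoop 3 sources on every daemon launch (an alternative to the fixes above; running daemons as root is acceptable for a test box but discouraged in production):

# appended to /opt/hadoop/hadoop-3.0.0/etc/hadoop/hadoop-env.sh
export HDFS_NAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root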

10. Verify the installation

http://192.168.50.48:9870/dfshealth.html#tab-overview
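In Hadoop 3 the NameNode web UI moved from port 50070 to 9870; replace 192.168.50.48 with your own host's IP. A headless check works too (my addition):

curl -s http://localhost:9870/ >/dev/null && echo "web UI up"
hdfs dfsadmin -report    # prints cluster capacity and live datanodes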
