[Configuration] Setting Up a Hadoop Single-Node Cluster

Prerequisites (the host machine runs 64-bit Windows)

  • VMware Workstation (with the CentOS-6.5-x86_64-bin-DVD1.iso image);
  • Xshell;
  • jdk-7u79-linux-x64.tar.gz;
  • hadoop-2.6.0.tar.gz;

Disable the firewall


After the reboot, check that the firewall is still disabled. (screenshots omitted)
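
Since the original screenshots are not available, here is a minimal sketch of the usual CentOS 6 commands for this step (run as root); the service name iptables is the CentOS 6.5 default:

[root@node4 ~]# service iptables stop      # stop the firewall immediately
[root@node4 ~]# chkconfig iptables off     # keep it disabled across reboots
[root@node4 ~]# service iptables status    # verify it is not running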

Bind the hostname to the IP address

[root@node4 ~]# vi /etc/hosts 

127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.70.104 node4    # IP  hostname

Create the hadoop user and group

[root@node4 ~]# useradd -m hadoop    # create the hadoop user and its home directory
[root@node4 ~]# cd /home
[root@node4 home]# ls
hadoop
[root@node4 home]# su root
[root@node4 home]# passwd hadoop    # set a password for the hadoop user
Changing password for user hadoop.
New password:
BAD PASSWORD: it is based on a dictionary word
Retype new password:
passwd: all authentication tokens updated successfully.
[root@node4 home]#

Configure passwordless SSH login


Switch to the hadoop user's home directory and create the .ssh folder.
ssh-keygen generates, manages, and converts authentication keys; -t specifies the key type.
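
The screenshot showing the key-generation command is missing; a sketch of the typical invocation, assuming an RSA key with an empty passphrase as implied by the transcript below:

[hadoop@node4 ~]$ mkdir -p ~/.ssh
[hadoop@node4 ~]$ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa    # -t key type, -P passphrase, -f output file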

[hadoop@node4 ~]$ cd .ssh
[hadoop@node4 .ssh]$ ls
id_rsa  id_rsa.pub
[hadoop@node4 .ssh]$ cat id_rsa.pub >> authorized_keys    # append the public key to authorized_keys
[hadoop@node4 .ssh]$ ls
authorized_keys  id_rsa  id_rsa.pub
[hadoop@node4 .ssh]$ cat authorized_keys    # view the result
ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA7t8/VWOdBRq7y2r5iiorTB4WWYit+MCI3FeMtyhD9txbJf0n1rFcoOkszEO2JPFBGyPdXmd7hjOGGhs9aHvj6XT92WsdmrHBoJeZ+XPtxr2OPqLjK1AdSbLwY+kKh22Ko+UVOJbO3AChilQI4ziWrpJ5+r3IvwKFNGFnuNsTcg4Q3+so6TsPX5Mkh2NAoxBG6F982BAet7cKBgA9YcrhXEzyr8daFrEo8Hpf7103PJjqJvKXvFVpwgbpa6m1suRzEu3KU6ZihiV65nU6WqV/78ZKLP1nqqLtdNNNaqbhFlToCHKdhgM1uLC6QMy5eVEVhm/Hw/EPxg+OT5jBpHM7WQ== hadoop@node4
[hadoop@node4 .ssh]$ cd ..
[hadoop@node4 ~]$ chmod 700 .ssh      # give the owner full access to the .ssh directory
[hadoop@node4 ~]$ chmod 600 .ssh/*    # restrict the key files to owner read/write
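
To confirm the setup (not shown in the original transcript), an SSH login to the node itself should now succeed without a password prompt:

[hadoop@node4 ~]$ ssh node4    # should log in without asking for a password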


Install the JDK

[hadoop@node4 ~]$ pwd
/home/hadoop
[hadoop@node4 ~]$ mkdir app    # create the app directory
[hadoop@node4 ~]$ cd app
[hadoop@node4 app]$ rz    # upload the file via Xshell

[hadoop@node4 app]$ ls
jdk-7u79-linux-x64.tar.gz
[hadoop@node4 app]$ tar -zxvf jdk-7u79-linux-x64.tar.gz    # extract
[hadoop@node4 app]$ ln -s jdk1.7.0_79 jdk                  # create a symbolic link
[hadoop@node4 app]$ ls
jdk  jdk1.7.0_79
[hadoop@node4 app]$

Configure environment variables

[hadoop@node4 app]$  vi ~/.bashrc

# .bashrc

# Source global definitions
if [ -f /etc/bashrc ]; then
        . /etc/bashrc
fi

# User specific aliases and functions
JAVA_HOME=/home/hadoop/app/jdk
CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
PATH=$JAVA_HOME/bin:$PATH
export JAVA_HOME CLASSPATH PATH
[hadoop@node4 app]$ source ~/.bashrc    # apply the changes
[hadoop@node4 app]$ java -version       # the following version output confirms a successful install
java version "1.7.0_79"
Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
[hadoop@node4 app]$

Install Hadoop
Upload the Hadoop tarball to the /home/hadoop/app directory and extract it, the same way as above.
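
A sketch of those steps, mirroring the JDK installation:

[hadoop@node4 app]$ rz                               # upload hadoop-2.6.0.tar.gz
[hadoop@node4 app]$ tar -zxvf hadoop-2.6.0.tar.gz    # extract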

[hadoop@node4 app]$ ls
hadoop-2.6.0  jdk  jdk1.7.0_79
[hadoop@node4 app]$ cd hadoop-2.6.0
[hadoop@node4 hadoop-2.6.0]$ ls
bin  etc  include  lib  libexec  LICENSE.txt  NOTICE.txt  README.txt  sbin  share
[hadoop@node4 hadoop-2.6.0]$ bin/hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/hadoop/app/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
[hadoop@node4 hadoop-2.6.0]$ vi hello.txt    # create a text file of your own
[hadoop@node4 hadoop-2.6.0]$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar wordcount hello.txt output    # run the bundled WordCount example

hello.txt contents and the WordCount output (screenshots omitted)
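
The result screenshots are not available; with the local run above, the counts end up in the output directory and can be inspected with a command like the following (part-r-00000 is Hadoop's default reducer output file name):

[hadoop@node4 hadoop-2.6.0]$ cat output/part-r-00000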

Edit (or replace) the following files under etc/hadoop: core-site.xml, hdfs-site.xml, hadoop-env.sh, mapred-site.xml, yarn-site.xml, and slaves.
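
The screenshots of the edited files are not available. Below is a minimal sketch of typical single-node (pseudo-distributed) settings for Hadoop 2.6, reusing the hostname, JDK path, and data directories from this guide; treat the exact values as assumptions and adjust them to your environment. The XML properties go inside each file's <configuration> element.

core-site.xml:
  <property><name>fs.defaultFS</name><value>hdfs://node4:9000</value></property>
  <property><name>hadoop.tmp.dir</name><value>/home/hadoop/data/tmp</value></property>

hdfs-site.xml:
  <property><name>dfs.replication</name><value>1</value></property>
  <property><name>dfs.namenode.name.dir</name><value>/home/hadoop/data/dfs/name</value></property>
  <property><name>dfs.datanode.data.dir</name><value>/home/hadoop/data/dfs/data</value></property>

mapred-site.xml (copied from mapred-site.xml.template):
  <property><name>mapreduce.framework.name</name><value>yarn</value></property>

yarn-site.xml:
  <property><name>yarn.nodemanager.aux-services</name><value>mapreduce_shuffle</value></property>

hadoop-env.sh:
  export JAVA_HOME=/home/hadoop/app/jdk

slaves:
  node4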

Configure the Hadoop environment variables


[hadoop@node4 ~]$ vi ~/.bashrc

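The screenshot of the added lines is missing; the usual additions look like the following, assuming HADOOP_HOME points at the extracted directory (or at a hadoop symlink to it, mirroring the jdk symlink used earlier):

HADOOP_HOME=/home/hadoop/app/hadoop-2.6.0
PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH
export HADOOP_HOME PATH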

[hadoop@node4 ~]$ source ~/.bashrc

Create the Hadoop data directories

[hadoop@node4 ~]$ mkdir -p /home/hadoop/data/tmp
[hadoop@node4 ~]$ mkdir -p /home/hadoop/data/dfs/name
[hadoop@node4 ~]$ mkdir -p /home/hadoop/data/dfs/data

Format the NameNode

[hadoop@node4 hadoop]$ ls
bin  etc  hello.txt  include  lib  libexec  LICENSE.txt  NOTICE.txt  output  README.txt  sbin  share
[hadoop@node4 hadoop]$ bin/hdfs namenode -format    # format the NameNode
[hadoop@node4 hadoop]$ sbin/start-all.sh            # start the cluster
[hadoop@node4 hadoop]$ jps
1685 NameNode
2204 Jps
2174 NodeManager
1769 DataNode
2088 ResourceManager
1947 SecondaryNameNode

Part of the cluster start-up output (screenshot omitted)
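
If any of the daemons listed by jps is missing, the corresponding log under the logs directory usually explains why; for example (the log file name follows Hadoop's hadoop-<user>-<daemon>-<hostname>.log pattern):

[hadoop@node4 hadoop]$ ls logs/
[hadoop@node4 hadoop]$ tail -n 50 logs/hadoop-hadoop-namenode-node4.log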

View NameNode and YARN information through the web UI
On the Windows host, edit the hosts file (C:\Windows\System32\drivers\etc\hosts) and add the IP-to-hostname mapping.
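
A sketch of the mapping and the default Hadoop 2.x web UI addresses (ports assumed from the stock configuration):

192.168.70.104 node4

http://node4:50070    # NameNode web UI
http://node4:8088     # YARN ResourceManager web UI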


Test the Hadoop pseudo-distributed cluster

[hadoop@node4 hadoop]$ bin/hdfs dfs -ls /
[hadoop@node4 hadoop]$ bin/hdfs dfs -mkdir /abc
[hadoop@node4 hadoop]$ bin/hdfs dfs -ls /
Found 1 items
drwxr-xr-x   - hadoop supergroup         0 2017-11-22 15:41 /abc
[hadoop@node4 hadoop]$ ls
bin  etc  hello.txt  include  lib  libexec  LICENSE.txt  logs  NOTICE.txt  output  README.txt  sbin  share
[hadoop@node4 hadoop]$ bin/hdfs dfs -put hello.txt /abc/
[hadoop@node4 hadoop]$ bin/hdfs dfs -ls /abc/
Found 1 items
-rw-r--r--   1 hadoop supergroup         23 2017-11-22 15:43 /abc/hello.txt
[hadoop@node4 hadoop]$ bin/hdfs dfs -cat /abc/hello.txt
hello hello iiii



[hadoop@node4 hadoop]$
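
As a further check (not part of the original transcript), the bundled WordCount example can also be run against the file now stored in HDFS; a sketch using the paths from above:

[hadoop@node4 hadoop]$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar wordcount /abc/hello.txt /abc/out
[hadoop@node4 hadoop]$ bin/hdfs dfs -cat /abc/out/part-r-00000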

The above confirms that the single-node Hadoop cluster has been set up successfully.
