Hadoop 2.4.1 Pseudo-Distributed Installation Notes

1. Prepare the environment
a) Install the JDK
[root@localhost java]# ll
total 4
lrwxrwxrwx. 1 root root   11 Aug 28 02:53 jdk -> jdk1.7.0_45
drwxr-xr-x. 8 uucp  143 4096 Oct  8  2013 jdk1.7.0_45
b) Disable the firewall
[root@localhost java]# chkconfig iptables off
[root@localhost java]# service iptables stop
c) Set the hostname
[root@localhost java]# vim /etc/sysconfig/network
[root@localhost java]# hostname localhost
[root@localhost java]# hostname
localhost
[root@localhost java]# hostname -f
localhost
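The edit to /etc/sysconfig/network referenced above typically just sets the HOSTNAME field; a minimal sketch (the hostname value here matches this walkthrough, adjust for your machine):

```
NETWORKING=yes
HOSTNAME=localhost
```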
d) Raise the nofile and nproc limits
[root@localhost java]# vim /etc/security/limits.conf
* soft nofile 65535
* hard nofile 65535
* soft nproc  65535
* hard nproc  65535
[root@localhost security]# ulimit -n
65535
[root@localhost security]# ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 32767
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 65535
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 10240
cpu time               (seconds, -t) unlimited
max user processes              (-u) 1024
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
Note: `max user processes` still shows 1024 above. On CentOS 6 the per-user process limit is also capped by /etc/security/limits.d/90-nproc.conf, which likely needs the same nproc edit (followed by a re-login) before the 65535 value takes effect.
e) Create the hadp user
[root@localhost security]# groupadd hadp
[root@localhost security]# useradd -g hadp hadp
[root@localhost security]# passwd hadp
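One step this walkthrough skips: start-dfs.sh and start-yarn.sh later use ssh to launch the daemons, so the hadp user needs passwordless ssh to localhost. A sketch, to be run as hadp (assumes the openssh client and server are installed):

```shell
# Run as the hadp user. Generate a key pair if one does not exist yet,
# then authorize it for ssh logins to this machine.
mkdir -p ~/.ssh && chmod 700 ~/.ssh
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa -q
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# Verify: this should log in without a password prompt.
# ssh localhost hostname
```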




2. Install Hadoop
a) Download the distribution
http://archive.apache.org/dist/hadoop/common/hadoop-2.4.1/
Place the tarball in /usr/local/src on the machine to be installed.


b) Unpack
[root@localhost src]# ll
total 323244
-rw-r--r--. 1 root root  54246778 Aug 28 02:51 apache-hive-0.13.1-bin.tar.gz
-rw-r--r--. 1 root root 138656756 Aug 28 02:51 hadoop-2.4.1.tar.gz
-rw-r--r--. 1 root root 138094686 Aug 28 02:51 jdk-7u45-linux-x64.tar.gz
[root@localhost src]# pwd
/usr/local/src
[root@localhost src]# tar xvf hadoop-2.4.1.tar.gz -C /home/hadp/
[root@localhost hadp]# cd /home/hadp
[root@localhost hadp]# ln -s hadoop-2.4.1 hadoop
[root@localhost hadp]# ll
total 4
lrwxrwxrwx.  1 hadp hadp   12 Aug 28 03:13 hadoop -> hadoop-2.4.1
drwxr-xr-x. 10 hadp hadp 4096 Aug 28 03:21 hadoop-2.4.1
[root@localhost hadp]# chown -R hadp:hadp /home/hadp
[root@localhost hadp]# mkdir /export
[root@localhost hadp]# chown -R hadp:hadp /export




c) Set environment variables
[root@localhost hadp]# su - root
[root@localhost ~]# vim /etc/profile    # append the export lines below
[root@localhost ~]# source /etc/profile
export JAVA_HOME=/usr/java/jdk
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$JAVA_HOME/bin:$PATH
export HADOOP_HOME=/home/hadp/hadoop
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/lib
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"


d) Edit the Hadoop configuration files
[root@localhost ~]# su hadp
[hadp@localhost root]$ cd ~/hadoop/etc/hadoop/
[hadp@localhost hadoop]$ vim hadoop-env.sh   # mainly set JAVA_HOME to an absolute path
[hadp@localhost hadoop]$ vim core-site.xml
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://127.0.0.1:9000</value>
</property>
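For reference, this and the following property snippets go inside the file's <configuration> element, so the full core-site.xml would look roughly like:

```xml
<?xml version="1.0"?>
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://127.0.0.1:9000</value>
    </property>
</configuration>
```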


[hadp@localhost hadoop]$ vim hdfs-site.xml
<property>
    <name>dfs.replication</name>
    <value>1</value>
</property>
<property>
    <name>dfs.namenode.name.dir</name>
    <value>/export/hadoop/hdfs/name</value>
</property>
<property>
    <name>dfs.datanode.data.dir</name>
    <value>/export/hadoop/hdfs/data</value>
</property>


[hadp@localhost hadoop]$ cp mapred-site.xml.template mapred-site.xml   # no mapred-site.xml by default
[hadp@localhost hadoop]$ vim mapred-site.xml
<property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
</property>
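One file worth adding here: with mapreduce.framework.name set to yarn, yarn-site.xml normally also needs the MapReduce shuffle auxiliary service declared, otherwise MapReduce jobs fail at the shuffle stage:

```xml
<property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
</property>
```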


e) Format and start
[hadp@localhost hadoop]$ cd ~/hadoop
[hadp@localhost hadoop]$ bin/hdfs namenode -format
[hadp@localhost hadoop]$ sbin/start-dfs.sh
[hadp@localhost hadoop]$ sbin/start-yarn.sh
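A quick sanity check after starting: running jps as hadp should list the five daemons of a pseudo-distributed install. A sketch of the expected output (the pids below are examples, not taken from this install):

```
[hadp@localhost hadoop]$ jps
2481 NameNode
2601 DataNode
2759 SecondaryNameNode
2921 ResourceManager
3015 NodeManager
3250 Jps
```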
The HDFS and YARN/job web UIs can then be viewed in a browser; replace the IP with your machine's address:
http://192.168.20.64:50070
http://192.168.20.64:8088




