3-Node Hadoop Cluster Setup

### 1 Install the JDK on Linux
### 2 Check that the clocks on the three nodes agree
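The clock check can be run from the master in one loop. A minimal sketch (the node IPs and root user come from this guide; the `ssh` options are my addition so unreachable hosts fail fast):

```shell
# Print each node's clock as epoch seconds; more than a few seconds of
# drift should be corrected (e.g. with ntpd/ntpdate) before starting HDFS.
for host in 192.168.1.8 192.168.1.9 192.168.1.10; do
    printf '%s: ' "$host"
    ssh -o BatchMode=yes -o ConnectTimeout=2 "root@$host" date +%s 2>/dev/null \
        || echo "unreachable"
done
```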
### 3 Passwordless SSH from the master to the slaves
On each of the three nodes, generate a key pair and append the public key to authorized_keys so that every node can SSH to itself (skipping the append step makes the later steps fail). Then copy the master's public key to each slave and append it there as well. Commands:

Generate the key pair
ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
Append the public key
cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
Copy the public key to a slave
scp ./id_dsa.pub root@192.168.1.9:/opt/
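On each slave, the copied key still has to be appended before the master can log in without a password. A minimal sketch (the `/opt/id_dsa.pub` path comes from the `scp` step above; the `KEY` variable is my addition):

```shell
# Run on the slave node (e.g. 192.168.1.9):
KEY=${KEY:-/opt/id_dsa.pub}
mkdir -p ~/.ssh && chmod 700 ~/.ssh
touch ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys
# sshd ignores authorized_keys with loose permissions, hence the chmods.
# Append the master's key if it has arrived:
[ -f "$KEY" ] && cat "$KEY" >> ~/.ssh/authorized_keys || true
```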

### 4 Configuration files

4.1 core-site.xml (these two properties belong in core-site.xml; hadoop-env.sh only needs JAVA_HOME)
	NameNode host and port for the data upload/download (RPC) endpoint:
	<property>
		<name>fs.defaultFS</name>
		<value>hdfs://192.168.1.8:9000</value>
	</property>
	By default Hadoop keeps its working data under /tmp, which Linux clears
	on reboot, so that data would be lost; point hadoop.tmp.dir elsewhere:
	<property>
		<name>hadoop.tmp.dir</name>
		<value>/opt/hadoop2.5</value>
	</property>
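For reference, the `<property>` elements above all sit inside a single `<configuration>` root element (hdfs-site.xml in 4.2 uses the same skeleton). A sketch that writes the complete file, here to the current directory; the real file lives under `$HADOOP_HOME/etc/hadoop/`:

```shell
cat > core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://192.168.1.8:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/hadoop2.5</value>
    </property>
</configuration>
EOF
```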
4.2 hdfs-site.xml
	SecondaryNameNode web addresses (for browser access):
	<property>
		<name>dfs.namenode.secondary.http-address</name>
		<value>192.168.1.9:50090</value>
	</property>
	<property>
		<name>dfs.namenode.secondary.https-address</name>
		<value>192.168.1.9:50091</value>
	</property>
4.3 slaves: DataNode hosts, one per line

192.168.1.8
192.168.1.9
192.168.1.10

4.4 masters: SecondaryNameNode host

192.168.1.9
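The two host-list files above can be created in one go. A sketch (written to the current directory here; on the NameNode they belong in `$HADOOP_HOME/etc/hadoop/`):

```shell
# slaves: one DataNode per line; masters: the SecondaryNameNode host.
cat > slaves <<'EOF'
192.168.1.8
192.168.1.9
192.168.1.10
EOF
cat > masters <<'EOF'
192.168.1.9
EOF
```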
### 5 Copy to the other nodes

5.1 Copy the Hadoop directory to each of the other nodes
 scp -r hadoop-2.5.1/ root@192.168.1.9:/home/   
5.2 Set the Hadoop environment variables, then copy the profile to the other nodes
 vi ~/.bash_profile 
export HADOOP_HOME=/home/hadoop-2.5.1
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

scp ~/.bash_profile root@192.168.1.9:/root/
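The copied profile only takes effect in new shells; on each node, `source ~/.bash_profile` and confirm that `sbin` (where `start-dfs.sh` lives, which is why it is on PATH alongside `bin`) is actually there. A minimal self-contained sketch that re-creates the two exports:

```shell
# On a real node, `source ~/.bash_profile` sets these instead.
export HADOOP_HOME=/home/hadoop-2.5.1
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
case ":$PATH:" in
    *":$HADOOP_HOME/sbin:"*) echo "sbin on PATH" ;;
    *)                       echo "sbin missing"  ;;
esac
```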

### 6 Initialize the fsimage (run once, on the NameNode)
hdfs namenode -format
### 7 Start

start-dfs.sh

[root@localhost current]# start-dfs.sh
Starting namenodes on [192.168.1.8]
192.168.1.8: starting namenode, logging to /home/hadoop-2.5.1/logs/hadoop-root-namenode-localhost.localdomain.out
192.168.1.10: starting datanode, logging to /home/hadoop-2.5.1/logs/hadoop-root-datanode-node3.out
192.168.1.9: starting datanode, logging to /home/hadoop-2.5.1/logs/hadoop-root-datanode-node2.out
192.168.1.8: starting datanode, logging to /home/hadoop-2.5.1/logs/hadoop-root-datanode-localhost.localdomain.out
Starting secondary namenodes [192.168.1.9]
192.168.1.9: starting secondarynamenode, logging to /home/hadoop-2.5.1/logs/hadoop-root-secondarynamenode-node2.out

### 8 Stop the firewall so the monitoring web pages are reachable
service iptables stop
(On this CentOS-era setup, `chkconfig iptables off` additionally keeps it disabled across reboots.)
### 9 Stop

[root@localhost current]# stop-dfs.sh
Stopping namenodes on [node1]
The authenticity of host 'node1 (192.168.1.8)' can't be established.
RSA key fingerprint is b9:4a:f5:3f:90:b4:77:67:dd:24:1f:e4:5b:ef:65:b2.
Are you sure you want to continue connecting (yes/no)? yes
node1: Warning: Permanently added 'node1' (RSA) to the list of known hosts.
node1: stopping namenode
192.168.1.10: no datanode to stop
192.168.1.8: stopping datanode
192.168.1.9: no datanode to stop
Stopping secondary namenodes [node2]
The authenticity of host 'node2 (192.168.1.9)' can't be established.
RSA key fingerprint is b9:4a:f5:3f:90:b4:77:67:dd:24:1f:e4:5b:ef:65:b2.
Are you sure you want to continue connecting (yes/no)? yes
node2: Warning: Permanently added 'node2' (RSA) to the list of known hosts.
node2: stopping secondarynamenode

Note: the DataNodes are misbehaving (two of them report "no datanode to stop" above); still to be debugged.
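One common cause of missing DataNodes on Hadoop 2.x: after (re-)running `hdfs namenode -format`, the NameNode gets a fresh clusterID, and DataNodes whose data directories still carry the old ID refuse to start. A hedged sketch for checking (the paths derive from `hadoop.tmp.dir=/opt/hadoop2.5` set in 4.1):

```shell
# Compare the clusterID recorded by the NameNode and by a DataNode.
for f in /opt/hadoop2.5/dfs/name/current/VERSION \
         /opt/hadoop2.5/dfs/data/current/VERSION; do
    [ -f "$f" ] && grep clusterID "$f" || echo "missing: $f"
done
# If the two IDs differ on a brand-new cluster, wiping the DataNode
# directory and restarting forces a clean re-registration (destructive,
# only acceptable before any real data is stored):
#   rm -rf /opt/hadoop2.5/dfs/data && start-dfs.sh
```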
