Download Hadoop:
http://www.apache.org/dyn/closer.cgi/hadoop/common/
I chose the latest release, hadoop-2.6.0.tar.gz.
Install the JDK on all three virtual machines (search: installing the JDK on Linux).
See: http://jingyan.baidu.com/article/d621e8dae805272865913fa7.html
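After the JDK is installed, its environment variables need to be visible to all users. A minimal sketch of the lines to append to /etc/profile, assuming the JDK was unpacked to /usr/java/jdk1.7.0 (a hypothetical path — adjust it to wherever your JDK actually lives):

```shell
# Hypothetical JDK install path -- replace with your actual directory
export JAVA_HOME=/usr/java/jdk1.7.0
export PATH=$JAVA_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
```

Run `source /etc/profile` and verify with `java -version` on each of the three machines.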
Note: Hadoop 0.20 had been installed before, so first delete its user and that user's files (search: deleting a user on Linux, deleting a directory on Linux).
[skx1@skx1 Desktop]$ su
Password:
[root@skx1 Desktop]# userdel -r grid
The -r flag deletes grid's home directory and files along with the account;
do the same on the other two nodes.
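Since userdel exits with an error when the account is already gone, the cleanup on each node can be guarded. A sketch, run as root (grid being the old Hadoop 0.20 user):

```shell
# Remove the old 'grid' account and its home directory only if it still exists
if id grid >/dev/null 2>&1; then
    userdel -r grid
else
    echo "user grid not found, nothing to remove"
fi
```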
1. Configure the hosts file (so each hostname resolves to its IP address)
cd /etc
vi hosts
192.168.159.131 skx1
192.168.159.132 skx2
192.168.159.133 skx3
Test it (e.g. ping skx2 from skx1 and confirm the name resolves).
2. Create a hadoop account (running everything as root is a security risk)
[root@skx1 Desktop]# useradd hadoop
[root@skx1 Desktop]# passwd hadoop
Changing password for user hadoop.
New password:
BAD PASSWORD: it does not contain enough DIFFERENT characters
BAD PASSWORD: is too simple
Retype new password:
passwd: all authentication tokens updated successfully.
(The password set here was qw1122.)
3. Log in as hadoop and generate an SSH key pair (the public key is shared with other hosts; the private key stays secret on this one)
[root@skx1 Desktop]# su hadoop
[hadoop@skx1 Desktop]$ cd
[hadoop@skx1 ~]$ ssh-keygen -t rsa
Generating public/private rsa key pair.
Enter file in which to save the key (/home/hadoop/.ssh/id_rsa): /home/hadoop/.ssh/id_rsa
Created directory '/home/hadoop/.ssh'.
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
****22
[hadoop@skx1 ~]$ cd .ssh
[hadoop@skx1 .ssh]$ cp id_rsa.pub authorized_keys
Check the contents of authorized_keys:
[hadoop@skx1 .ssh]$ cat authorized_keys
4. On the other two nodes, configure the hosts file, create the hadoop user, and generate SSH keys in the same way. Then merge the three public key files into a single authorized_keys file and place it on all three nodes, so that every node holds the other nodes' public keys.
Test: ssh hadoop@skx2 prompts for the password ****22 the first time; after entering it once, no password is required on subsequent logins, which shows the SSH trust was added successfully.
Test every node this way.
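The merge in step 4 amounts to concatenating the three id_rsa.pub files and dropping duplicates. A locally simulated sketch (the key strings below are fake placeholders; in practice you would scp each node's real id_rsa.pub to one machine, merge, and copy the result back to every node's ~/.ssh/authorized_keys):

```shell
# Simulate the three nodes' public keys with placeholder files
mkdir -p /tmp/keymerge
echo "ssh-rsa AAAAkey1 hadoop@skx1" > /tmp/keymerge/skx1.pub
echo "ssh-rsa AAAAkey2 hadoop@skx2" > /tmp/keymerge/skx2.pub
echo "ssh-rsa AAAAkey3 hadoop@skx3" > /tmp/keymerge/skx3.pub

# Merge the keys, removing any duplicate lines
cat /tmp/keymerge/skx1.pub /tmp/keymerge/skx2.pub /tmp/keymerge/skx3.pub \
    | sort -u > /tmp/keymerge/authorized_keys

# authorized_keys must be readable only by its owner, or sshd will ignore it
chmod 600 /tmp/keymerge/authorized_keys
wc -l /tmp/keymerge/authorized_keys
```

The `chmod 600` matters: OpenSSH silently refuses key authentication if authorized_keys (or ~/.ssh itself) is writable by anyone else.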
5. Download Hadoop and copy it into /home/hadoop:
[root@skx1 tmp]# cp hadoop-2.6.0.tar.gz /home/hadoop
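The next step would be extracting the tarball as the hadoop user. A sketch using a small stand-in archive, since the real hadoop-2.6.0.tar.gz may not be on the machine running this (the flags are the same either way: -x extract, -z gunzip, -f file):

```shell
# Build a tiny stand-in archive to demonstrate the extraction flags
mkdir -p /tmp/hadoop-demo/hadoop-2.6.0
echo "placeholder" > /tmp/hadoop-demo/hadoop-2.6.0/README.txt
tar -czf /tmp/hadoop-demo/hadoop-2.6.0.tar.gz -C /tmp/hadoop-demo hadoop-2.6.0

# Extract it, as you would the real tarball inside /home/hadoop
cd /tmp/hadoop-demo
rm -rf hadoop-2.6.0          # remove the source dir so the extraction is visible
tar -xzf hadoop-2.6.0.tar.gz
ls hadoop-2.6.0
```

For the real archive: `su hadoop`, `cd /home/hadoop`, then `tar -xzf hadoop-2.6.0.tar.gz` (extracting as hadoop, not root, keeps the file ownership correct).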