System: Ubuntu 15
1. Set up the Java environment (install JDK 7):
(1) Type java in a terminal: if Java is not installed, Ubuntu will suggest packages/versions to install.
(2) Install the JDK: apt-get install openjdk-7-jdk (all commands in this guide are run as root)
(3) Edit the configuration file /etc/profile: gedit /etc/profile (add the Java environment-variable lines)
source /etc/profile (to make the changes take effect). This step is important; do not forget it.
(4) Check that the installation succeeded: java -version
If the Java version information is printed, the installation succeeded.
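The highlighted /etc/profile lines referred to in step (3) are not reproduced in this text; a typical set for this step, assuming the default OpenJDK 7 install path on Ubuntu (verify yours with update-alternatives --list java), looks like:

```shell
# Assumed /etc/profile additions for OpenJDK 7 on Ubuntu
# (this path matches the JAVA_HOME used later in hadoop-env.sh)
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib
export PATH=$JAVA_HOME/bin:$PATH
```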
2. Passwordless SSH login
(1) Install ssh: apt-get install ssh
(2) Check whether a ".ssh" directory exists in your home directory: ls -a
If it does not, create it yourself: mkdir .ssh
(3) Run: ssh-keygen -t rsa (press Enter at every prompt)
(4) Run: cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
(5) Check that ssh is installed: ssh -V
If version information is printed, ssh is installed.
(6) Run: ssh localhost
If you are logged in without being asked for a password, passwordless login works.
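One gotcha not covered above: OpenSSH ignores authorized_keys when ~/.ssh or the key file is group- or world-writable, so ssh localhost keeps prompting for a password. If that happens, tightening the permissions usually fixes it:

```shell
# sshd requires strict permissions before it will honor authorized_keys
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
```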
3. Install Hadoop
(1) Move the downloaded tarball to /opt (so it is easy to find):
mv hadoop-2.7.3.tar.gz /opt/
(2) Extract it: change to /opt with cd /opt, then
tar -zxvf hadoop-2.7.3.tar.gz
(3) Change into the configuration directory (run ls at each level to inspect the layout):
cd hadoop-2.7.3/etc/hadoop
(5) Edit the configuration files (five in total):
core-site.xml: gedit core-site.xml
Add:
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<!-- Make sure this directory exists and is writable; adjust the path if your Hadoop tree lives elsewhere (this guide unpacks it under /opt). -->
<value>/home/work/hadoop-2.7.3/data/tmp</value>
</property>
</configuration>
hdfs-site.xml: gedit hdfs-site.xml
Add:
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.block.size</name>
<value>10485760</value>
</property>
</configuration>
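For reference, the dfs.block.size above works out to 10 MB, far below the 128 MB default in Hadoop 2.7.x but convenient for experimenting with small files:

```shell
# 10 MB expressed in bytes -- the dfs.block.size value used above
echo $((10 * 1024 * 1024))   # prints 10485760
```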
mapred-site.xml: Hadoop ships only a template for this file and ignores the template itself, so copy it first with cp mapred-site.xml.template mapred-site.xml, then gedit mapred-site.xml
Add:
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
<property>
<name>mapreduce.jobhistory.address</name>
<value>127.0.0.1:10020</value>
</property>
<property>
<name>mapreduce.jobhistory.webapp.address</name>
<value>127.0.0.1:19888</value>
</property>
<property>
<!-- The bare names "maxsize"/"minsize" are not real Hadoop keys; the input-split-size properties must be spelled out in full: -->
<name>mapreduce.input.fileinputformat.split.maxsize</name>
<value>10485760</value>
</property>
<property>
<name>mapreduce.input.fileinputformat.split.minsize</name>
<value>0</value>
</property>
<property>
<name>mapred.child.java.opts</name>
<!-- Note: suspend=y makes every task JVM wait for a debugger to attach on port 8788 before running; change it to suspend=n (or drop the -Xdebug/-Xrunjdwp flags) for normal jobs. -->
<value>-Xmx800m -Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8788</value>
</property>
</configuration>
yarn-site.xml: gedit yarn-site.xml
Add:
<configuration>
<!-- Site specific YARN configuration properties -->
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
</configuration>
hadoop-env.sh: gedit hadoop-env.sh
Add: export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
(6) Format the NameNode:
cd /opt/hadoop-2.7.3 (change to the /opt/hadoop-2.7.3 directory)
bin/hdfs namenode -format
(7) Edit the configuration file /etc/profile again: gedit /etc/profile (add the Hadoop environment-variable lines)
source /etc/profile (to make them take effect)
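As with the JDK step, the highlighted lines are not shown in this text; a typical set, assuming the /opt/hadoop-2.7.3 install location used throughout this guide, would be:

```shell
# Assumed /etc/profile additions so the hadoop/hdfs commands work from any directory
export HADOOP_HOME=/opt/hadoop-2.7.3
export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH
```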
(8) Installation and configuration are done; start Hadoop from the /opt/hadoop-2.7.3 directory: sbin/start-all.sh
Check the running processes with jps. If the following six appear, everything is up: Jps, NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager.
(9) Stop Hadoop: sbin/stop-all.sh