Download Hadoop and the JDK
Hadoop download URL: http://archive.cloudera.com/cdh5/cdh/5/hadoop-2.6.0-cdh5.7.0.tar.gz
JDK 1.7 is recommended.
Install the JDK
- Extract the JDK tarball
tar -zxvf /home/hadoop/software/jdk-7u80-linux-x64.tar.gz -C /usr/java
- Configure the JDK environment variables
hadoop:root:/usr/java:>vi /etc/profile

# /etc/profile

# System wide environment and startup programs, for login setup
# Functions and aliases go in /etc/bashrc

# It's NOT a good idea to change this file unless you know what you
# are doing. It's much better to create a custom.sh shell script in
# /etc/profile.d/ to make custom changes to your environment, as this
# will prevent the need for merging in future updates.

# add path
export JAVA_HOME=/usr/java/jdk1.7.0_80
export PATH=$JAVA_HOME/bin:$PATH

# show path

hadoop:root:/usr/java:>source /etc/profile
hadoop:root:/usr/java:>java -version
java version "1.7.0_80"
Java(TM) SE Runtime Environment (build 1.7.0_80-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)
Configure SSH
hadoop:hadoop:/home/hadoop:>ssh-keygen -t rsa
Generating public/private rsa key pair.
Enter file in which to save the key (/home/hadoop/.ssh/id_rsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/hadoop/.ssh/id_rsa.
Your public key has been saved in /home/hadoop/.ssh/id_rsa.pub.
The key fingerprint is:
ca:20:e2:68:64:46:e0:f2:62:63:b9:60:71:a5:75:4a hadoop@hadoop
The key's randomart image is:
+--[ RSA 2048]----+
|. E .            |
|o = o            |
|.+ o .           |
|o.+              |
|+Xo . S          |
|@oo.  o .        |
|.+   o           |
|.                |
|                 |
+-----------------+

hadoop:hadoop:/home/hadoop:>cp .ssh/id_rsa.pub ~/.ssh/authorized_keys
hadoop:hadoop:/home/hadoop:>cd .ssh/
hadoop:hadoop:/home/hadoop/.ssh:>ll
total 12
-rw-r--r-- 1 hadoop hadoop  395 Jan  2 02:16 authorized_keys
-rw------- 1 hadoop hadoop 1675 Jan  2 02:16 id_rsa
-rw-r--r-- 1 hadoop hadoop  395 Jan  2 02:16 id_rsa.pub
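One common snag worth checking before moving on: sshd silently ignores `authorized_keys` when `~/.ssh` or the file itself is writable by group or others. A quick sketch of the usual fix, assuming the default paths created by `ssh-keygen` above:

```shell
# sshd refuses key-based login if these permissions are looser than this.
mkdir -p ~/.ssh
chmod 700 ~/.ssh
touch ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```

After this, `ssh hadoop-01 date` should return without prompting for a password.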
Install Hadoop
Extract Hadoop
hadoop:hadoop:/home/hadoop/app:>tar -zxvf /home/hadoop/software/hadoop-2.6.0-cdh5.7.0.tar.gz -C /home/hadoop/app/
Configure the environment
hadoop:hadoop:/home/hadoop:>vi .bash_profile

# .bash_profile

# Get the aliases and functions
if [ -f ~/.bashrc ]; then
        . ~/.bashrc
fi

# User specific environment and startup programs

export HADOOP_HOME=/home/hadoop/app/hadoop-2.6.0-cdh5.7.0
export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH

hadoop:hadoop:/home/hadoop:>source .bash_profile
hadoop:hadoop:/home/hadoop:>echo $HADOOP_HOME
/home/hadoop/app/hadoop-2.6.0-cdh5.7.0
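A quick sanity check that the exports above actually put the Hadoop binaries on PATH (a sketch; the HADOOP_HOME value matches what `.bash_profile` sets above):

```shell
# Re-state the exports from .bash_profile, then confirm bin/ is on PATH.
export HADOOP_HOME=/home/hadoop/app/hadoop-2.6.0-cdh5.7.0
export PATH="$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH"
case ":$PATH:" in
  *":$HADOOP_HOME/bin:"*) echo "PATH OK" ;;
  *) echo "PATH is missing $HADOOP_HOME/bin" ;;
esac
```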
Edit the configuration files
- hadoop-env.sh
export JAVA_HOME=/usr/java/jdk1.7.0_80
- core-site.xml

<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://hadoop-01:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/home/hadoop/app/tmp</value>
    </property>
</configuration>
- hdfs-site.xml

<configuration>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/home/hadoop/app/tmp/dfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/home/hadoop/app/tmp/dfs/data</value>
    </property>
    <property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>hadoop-01:50090</value>
    </property>
    <property>
        <name>dfs.namenode.secondary.https-address</name>
        <value>hadoop-01:50091</value>
    </property>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
- slaves
echo "hadoop-01" > ./etc/hadoop/slaves
- mapred-site.xml (normally this file does not exist; create it from the template: cp mapred-site.xml.template mapred-site.xml)
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
- yarn-site.xml

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>
Format the NameNode
hdfs namenode -format
Start Hadoop
hadoop:hadoop:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/etc/hadoop:>start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
18/01/02 02:49:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [hadoop]
hadoop: starting namenode, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/hadoop-hadoop-namenode-hadoop.out
hadoop: starting datanode, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/hadoop-hadoop-datanode-hadoop.out
Starting secondary namenodes [0.0.0.0]
hadoop-01: starting secondarynamenode, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/hadoop-hadoop-secondarynamenode-hadoop.out
18/01/02 02:50:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
starting yarn daemons
starting resourcemanager, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/yarn-hadoop-resourcemanager-hadoop.out
hadoop-01: starting nodemanager, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/yarn-hadoop-nodemanager-hadoop.out

hadoop:hadoop:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/etc/hadoop:>jps
8345 NodeManager
8066 SecondaryNameNode
7820 NameNode
7914 DataNode
8249 ResourceManager
8613 Jps
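With all five daemons showing in `jps`, a short smoke test confirms HDFS accepts writes. This is a hedged sketch, not part of the original walkthrough: the paths assume the layout configured earlier, and it is guarded so it does nothing on a machine without the hdfs client on PATH.

```shell
# Hypothetical smoke test: create a directory in HDFS, upload a file, list it back.
smoke_test() {
  hdfs dfs -mkdir -p /user/hadoop/smoke &&
  hdfs dfs -put "$HADOOP_HOME/etc/hadoop/core-site.xml" /user/hadoop/smoke/ &&
  hdfs dfs -ls /user/hadoop/smoke
}
# Guard: only attempt it where the hdfs client is actually installed.
if command -v hdfs >/dev/null 2>&1; then smoke_test; else echo "hdfs not on PATH; skipping"; fi
```

The same files are visible in the NameNode web UI, which in Hadoop 2.x listens on port 50070 by default (http://hadoop-01:50070).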