18 machines: 1 namenode + 17 datanodes
!!! When configuring Hadoop, deploy the namenode first, then use rsync to push the hadoop directory to all of the datanodes !!!
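A minimal sketch of that rsync fan-out, assuming the hostnames set up in step 3 (datanode-0 through datanode-16) and the HADOOP_HOME path used below; the `-az --delete` flags are one reasonable choice, not from the original notes:

```shell
#!/bin/sh
# Dry run: print the rsync command that would push the Hadoop tree from the
# namenode to each datanode. Pipe the output to `sh` to actually run it.
HADOOP_DIR=/data/hadoop/hadoop-2.6.0

sync_cmds() {
    i=0
    while [ $i -le 16 ]; do
        echo "rsync -az --delete ${HADOOP_DIR}/ datanode-${i}:${HADOOP_DIR}/"
        i=$((i + 1))
    done
}

sync_cmds
```

Printing the commands first makes it easy to eyeball the host list before touching 17 machines.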
1. Install the JDK:
- mkdir -p /usr/local/java;
- wget http://100.100.144.187/jdk-7u51-linux-x64.gz;
- tar xzvf jdk-7u51-linux-x64.gz -C /usr/local/java;
- rm -f jdk-7u51-linux-x64.gz;
Finally, append the following to /etc/profile:
export JAVA_HOME=/usr/local/java/jdk1.7.0_51
export CLASSPATH=${JAVA_HOME}/lib/dt.jar:${JAVA_HOME}/lib/tools.jar:${JAVA_HOME}/jre/lib/rt.jar:.
export HADOOP_HOME=/data/hadoop/hadoop-2.6.0
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
export YARN_HOME=${HADOOP_HOME}
# Native Path
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_COMMON_LIB_NATIVE_DIR"
export MAVEN_HOME=/usr/local/apache-maven-3.2.1
export ANT_HOME=/usr/local/apache-ant-1.9.4
export PATH=${JAVA_HOME}/bin:$ANT_HOME/bin:$MAVEN_HOME/bin:${HADOOP_HOME}/bin/:${HADOOP_HOME}/sbin/:$PATH
2. Set up passwordless SSH:
- Run ssh-keygen -t rsa and press Enter at every prompt (generates the key pair)
- Append id_rsa.pub to the authorized keys: cat id_rsa.pub >> authorized_keys
- Restart the SSH service to apply the change: service sshd restart
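The steps above only authorize login to the local machine; the namenode's public key must also reach every datanode. A sketch using ssh-copy-id (it prompts for each node's password once), again assuming the datanode-0..datanode-16 hostnames from step 3:

```shell
#!/bin/sh
# Dry run: print the ssh-copy-id command for each datanode.
# Pipe the output to `sh` to actually run it (one password prompt per node).
copy_key_cmds() {
    i=0
    while [ $i -le 16 ]; do
        echo "ssh-copy-id -i ~/.ssh/id_rsa.pub datanode-${i}"
        i=$((i + 1))
    done
}

copy_key_cmds
```

Afterwards, `ssh datanode-0 hostname` should print the remote hostname without asking for a password.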
3. Set up hosts: edit the /etc/hosts file
ip1 namenode-0
ip2 datanode-0
...
ip18 datanode-16
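Note the zero-based numbering: with ip2 mapped to datanode-0, the 17th datanode on ip18 is datanode-16. A small sketch that prints the expected hostname column, handy for pasting next to the IP list:

```shell
#!/bin/sh
# Print the hostname column for /etc/hosts:
# namenode-0, then datanode-0 .. datanode-16 (17 datanodes, zero-based).
gen_hostnames() {
    echo "namenode-0"
    i=0
    while [ $i -le 16 ]; do
        echo "datanode-${i}"
        i=$((i + 1))
    done
}

gen_hostnames
```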
4. Disable the firewall:
- service iptables stop;
- chkconfig iptables off;
5. Edit the Hadoop configuration files (/data/hadoop/hadoop-2.6.0/etc/hadoop):
5.1 Edit hadoop-env.sh(