1. Installing Hadoop:
vim /etc/hosts
172.25.38.7 server7
useradd hadoop
su - hadoop
pwd
/home/hadoop
tar zxf hadoop-2.7.3.tar.gz
tar zxf jdk-7u79-linux-x64.tar.gz
ln -s jdk1.7.0_79/ jdk
ln -s hadoop-2.7.3 hadoop
vim /home/hadoop/.bash_profile
10 PATH=$PATH:$HOME/bin:/home/hadoop/jdk/bin
vim hadoop/etc/hadoop/hadoop-env.sh
25 export JAVA_HOME=/home/hadoop/jdk
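The `jdk` symlink above decouples `.bash_profile` and `hadoop-env.sh` from the JDK version: a later upgrade only re-points the link. A minimal sketch of that trick, using throwaway paths invented for the demo:

```shell
# Demo of the symlink trick: configs point at 'jdk', so upgrading
# means swapping the link, not editing every config file.
# /tmp/jdkdemo and the version directories are made-up examples.
rm -rf /tmp/jdkdemo
mkdir -p /tmp/jdkdemo/jdk1.7.0_79/bin /tmp/jdkdemo/jdk1.8.0_60/bin
cd /tmp/jdkdemo
ln -s jdk1.7.0_79 jdk
readlink jdk                # -> jdk1.7.0_79
ln -sfn jdk1.8.0_60 jdk     # swap the link in place (-n: replace the link itself)
readlink jdk                # -> jdk1.8.0_60
```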
Test using Hadoop's bundled wordcount example:
cd /home/hadoop/hadoop
mkdir input
cp etc/hadoop/* input/
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar wordcount input output
cd output
[hadoop@server7 output]$ ls
part-r-00000  _SUCCESS          #the result of running MapReduce on input/
cat part-r-00000                #wordcount has completed the MapReduce job:
!= 3
"" 6
"". 4
"$HADOOP_CLASSPATH" 1
"$JAVA_HOME" 2
"$YARN_HEAPSIZE" 1
"$YARN_LOGFILE" 1
"$YARN_LOG_DIR" 1
"$YARN_POLICYFILE" 1
"*" 18
"AS 24
"Error: 1
"License"); 24
"alice,bob 18
"console" 1
"dfs" 3
"hadoop.root.logger". 1
"jks". 4
"jvm" 3
"mapred" 3
"rpc" 3
"run 1
"ugi" 3
"x" 1
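For intuition, the count that the wordcount job produces can be emulated on a tiny throwaway input with plain coreutils; this only illustrates the result, not how MapReduce actually executes:

```shell
# Emulate wordcount's map (tokenize) and reduce (count) phases
# with coreutils on a throwaway file.
printf 'alice bob\nalice\n' > /tmp/wc_demo.txt
tr -s ' ' '\n' < /tmp/wc_demo.txt | sort | uniq -c \
    | awk '{printf "%s\t%s\n", $2, $1}'
# alice   2
# bob     1
```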
Single-node cluster: NameNode (NN), DataNode (DN) and SecondaryNameNode (SNN) all on one host:
cd /home/hadoop/hadoop/etc/hadoop
vim core-site.xml
19 <configuration>
20 <property>
21 <name>fs.defaultFS</name>
22 <value>hdfs://172.25.38.7:9000</value>
23 </property>
24 </configuration>
vim hdfs-site.xml
19 <configuration>
20 <property>
21 <name>dfs.replication</name>
22 <value>1</value> #store only one replica of each block (the default is three)
23 </property>
24 </configuration>
vim /home/hadoop/hadoop/etc/hadoop/slaves
172.25.38.7
ssh-keygen
ssh-copy-id 172.25.38.7
ssh-copy-id server7
bin/hdfs namenode -format
/home/hadoop/hadoop/sbin/start-dfs.sh
jps
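After `start-dfs.sh`, `jps` should list NameNode, DataNode and SecondaryNameNode on this single-node layout. A hedged helper to script that check (the function name and the sample jps output below are invented for the sketch):

```shell
# check_daemons: verify each required daemon name appears in
# jps-style output; prints the first missing one and fails.
check_daemons() {
    out="$1"; shift
    for d in "$@"; do
        echo "$out" | grep -q "$d" || { echo "missing: $d"; return 1; }
    done
    echo "all daemons up"
}

# On the real node you would run:
#   check_daemons "$(jps)" NameNode DataNode SecondaryNameNode
# Demo with a fabricated jps-style listing:
check_daemons "1881 NameNode
1973 DataNode
2126 SecondaryNameNode
2250 Jps" NameNode DataNode SecondaryNameNode
# all daemons up
```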
Hadoop cluster setup:
server7:namenode,SecondaryNameNode
server8,server9:datanode
1. Synchronize time across the cluster nodes (each node syncs its clock to the admin host):
On the host: vim /etc/chrony.conf
7 server time1.aliyun.com iburst #the host syncs with Aliyun's time server
23 allow 172.25.38/24 #network segment allowed to sync from this host
On server7, server8 and server9:
yum install -y ntp
vim /etc/ntp.conf
22 server 172.25.38.250 iburst #sync the cluster nodes to the host
/etc/init.d/ntpd start
2. Configure the NFS share:
On server7:
yum install -y nfs-utils
/etc/init.d/rpcbind start
/etc/init.d/nfs start
vim /etc/exports
/home/hadoop *(rw,anonuid=500,anongid=500)
exportfs -rv
On server8 and server9:
useradd hadoop
yum install -y nfs-utils
/etc/init.d/rpcbind start
/etc/init.d/nfs start
showmount -e 172.25.38.7
mount 172.25.38.7:/home/hadoop/ /home/hadoop/
vim /home/hadoop/.bash_profile
PATH=$PATH:$HOME/bin:/home/hadoop/jdk/bin
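A quick way to confirm the share really is mounted on server8/server9 is to scan the mount table. Sketched below as a tiny helper; the function name and the sample mount line are invented for this example:

```shell
# nfs_mounted: check a mount-table listing ($1) for an nfs entry
# at the given mount point ($2).
nfs_mounted() {
    echo "$1" | grep -q "on $2 type nfs"
}

# On a real node you would run: nfs_mounted "$(mount)" /home/hadoop
# Demo with a fabricated mount line:
nfs_mounted "172.25.38.7:/home/hadoop on /home/hadoop type nfs (rw,addr=172.25.38.7)" \
    /home/hadoop && echo mounted
# mounted
```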