CentOS 7 + Hadoop 2.7.3

This article walks through installing Hadoop 2.7.3 on CentOS 7: installing and configuring the JDK, deploying ZooKeeper and Maven, building and installing protobuf, setting up SSH, changing the hostname, configuring the firewall, and compiling and installing Hadoop itself. It also covers the exceptions hit during compilation, time synchronization, and the relevant configuration files.

Install the JDK

Version: jdk-8u131-linux-x64.tar.gz

The OpenJDK packages that ship with the system must be removed first. Query the installed Java packages, then remove them:

[hadoop@localhost ~]$ rpm -qa|grep java
java-1.7.0-openjdk-1.7.0.111-2.6.7.8.el7.x86_64
python-javapackages-3.4.1-11.el7.noarch
tzdata-java-2016g-2.el7.noarch
javapackages-tools-3.4.1-11.el7.noarch
java-1.8.0-openjdk-1.8.0.102-4.b14.el7.x86_64
java-1.8.0-openjdk-headless-1.8.0.102-4.b14.el7.x86_64
java-1.7.0-openjdk-headless-1.7.0.111-2.6.7.8.el7.x86_64
[hadoop@localhost ~]$ yum remove java-1.7.0-openjdk-1.7.0.111-2.6.7.8.el7.x86_64
[root@localhost hadoop]# yum remove java-1.8.0-openjdk-1.8.0.102-4.b14.el7.x86_64
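The two removals above can also be done in a single pass; a hedged sketch (run as root) that removes every installed OpenJDK package, whatever the minor version:

```shell
# Sketch: build the removal list dynamically so it works regardless of
# which OpenJDK minor versions happen to be installed (run as root).
rpm -qa | grep -i openjdk | xargs -r yum -y remove
```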


Copy the files from Windows to CentOS:

[hadoop@localhost java]$ cp jdk-8u131-linux-x64.tar.gz /home/hadoop/java
[hadoop@localhost java]$ cp protobuf-2.5.0.tar.gz /home/hadoop/java
[hadoop@localhost java]$ cp zookeeper-3.4.6.tar.gz /home/hadoop/java
[hadoop@localhost java]$ cp hadoop-2.7.3-src.tar.gz /home/hadoop/java
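The archives then need to be unpacked; a sketch assuming everything stays under /home/hadoop/java as above:

```shell
# Unpack all four tarballs in place; each creates its own directory
# (jdk1.8.0_131, protobuf-2.5.0, zookeeper-3.4.6, hadoop-2.7.3-src).
cd /home/hadoop/java
for f in jdk-8u131-linux-x64.tar.gz protobuf-2.5.0.tar.gz \
         zookeeper-3.4.6.tar.gz hadoop-2.7.3-src.tar.gz; do
  tar -xzf "$f"
done
```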

Configure the JDK environment variables:

export JAVA_HOME=/home/hadoop/java/jdk1.8.0_131
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$JAVA_HOME/bin:$PATH
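The article does not say which file these exports go into; on CentOS 7 a common choice is the user's ~/.bash_profile (or /etc/profile for all users), re-sourced afterwards. A sketch under that assumption:

```shell
# Append the JDK variables to ~/.bash_profile and reload the shell config.
cat >> ~/.bash_profile <<'EOF'
export JAVA_HOME=/home/hadoop/java/jdk1.8.0_131
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$JAVA_HOME/bin:$PATH
EOF
source ~/.bash_profile
```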


[hadoop@localhost java]$ java -version
java version "1.8.0_131"
Java(TM) SE Runtime Environment (build 1.8.0_131-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.131-b11, mixed mode)


Download ZooKeeper, protobuf, and Maven


Configure the environment variables (the same file, now with Maven added):

export JAVA_HOME=/home/hadoop/java/jdk1.8.0_131
export MAVEN_HOME=/home/hadoop/java/apache-maven-3.3.9
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$JAVA_HOME/bin:$MAVEN_HOME/bin:$PATH

 

[root@localhost apache-maven-3.3.9]# mvn -v
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-10T08:41:47-08:00)
Maven home: /home/hadoop/java/apache-maven-3.3.9
Java version: 1.8.0_131, vendor: Oracle Corporation
Java home: /home/hadoop/java/jdk1.8.0_131/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-514.el7.x86_64", arch: "amd64", family: "unix"


Install the required build tools:

[root@localhost java]# yum install gcc-c++ autoconf automake libtool cmake zlib-devel pkgconfig openssl-devel


Build and install protobuf

Troubleshooting: if ./configure fails here, it is because the C++ compiler packages are missing; run the following in a terminal:

[root@localhost protobuf-2.5.0]# yum install glibc-headers gcc-c++


1. ./configure
2. make
3. make check
4. sudo make install
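The four steps as a single script. Passing --prefix=/usr/soft/protobuf-2.5.0 is an assumption on my part, chosen to match the PROTOBUF_HOME used in the environment block that follows; omit it to install under the default /usr/local.

```shell
# Build and install protobuf 2.5.0 from its source directory.
cd /home/hadoop/java/protobuf-2.5.0
./configure --prefix=/usr/soft/protobuf-2.5.0  # assumed prefix, see note above
make
make check           # optional self-tests; safe to skip if time is short
sudo make install
```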


export JAVA_HOME=/home/hadoop/java/jdk1.8.0_131
export MAVEN_HOME=/home/hadoop/java/apache-maven-3.3.9
export ZOOKEEPER_HOME=/home/hadoop/java/zookeeper-3.4.6
export PROTOBUF_HOME=/usr/soft/protobuf-2.5.0
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$JAVA_HOME/bin:$MAVEN_HOME/bin:$ZOOKEEPER_HOME/bin:$PROTOBUF_HOME/bin:$PATH


[root@localhost bin]# protoc --version
libprotoc 2.5.0

 

protoc verification complete.


Install SSH

 

[root@localhost soft]# yum install openssh-server


[root@localhost soft]# yum install iptables-services


Disable the firewall: systemctl disable firewalld.service
Check the firewall status: firewall-cmd --state
Reboot: reboot


[root@localhost hadoop]# firewall-cmd --state
not running


[hadoop@localhost ~]$ ssh-keygen -t dsa
Generating public/private dsa key pair.
Enter file in which to save the key (/home/hadoop/.ssh/id_dsa): 
Created directory '/home/hadoop/.ssh'.
Enter passphrase (empty for no passphrase): 
Enter same passphrase again: 
Your identification has been saved in /home/hadoop/.ssh/id_dsa.
Your public key has been saved in /home/hadoop/.ssh/id_dsa.pub.
The key fingerprint is:
8b:9d:66:73:fe:27:05:e1:c4:e2:58:c3:9c:65:5f:c2 hadoop@localhost.localdomain
The key's randomart image is:
+--[ DSA 1024]----+
|         o +o.. .|
|          B.+.Eo |
|         + = ..  |
|        . . o    |
|        S    .   |
|       o o    .  |
|      . B .  .   |
|       o +  . .  |
|          ...o   |
+-----------------+
[hadoop@localhost ~]$ cd .ssh
[hadoop@localhost .ssh]$ cat id_dsa.pub >>  authorized_keys
[hadoop@localhost .ssh]$ chmod 644 authorized_keys
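Before moving on it is worth tightening permissions and confirming that the key actually gets used; a sketch (BatchMode=yes makes ssh error out rather than fall back to a password prompt):

```shell
# sshd silently ignores authorized_keys if ~/.ssh is group- or world-writable.
chmod 700 ~/.ssh
# With BatchMode=yes, a password prompt becomes a hard failure, so success
# here proves the key-based login works.
ssh -o BatchMode=yes localhost 'echo ssh-ok'
```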

 

Edit /etc/ssh/sshd_config, set the following three options, then run service sshd restart:


RSAAuthentication yes
PubkeyAuthentication yes

# The default is to check both .ssh/authorized_keys and .ssh/authorized_keys2
# but this is overridden so installations will only check .ssh/authorized_keys
AuthorizedKeysFile      .ssh/authorized_keys


Change the hostname

 

[root@localhost .ssh]# cat /etc/hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.233.128 Master
 
[hadoop@localhost ~]$ ssh Master
The authenticity of host 'master (192.168.233.128)' can't be established.
ECDSA key fingerprint is db:71:4f:9d:06:ca:9e:da:ca:bd:71:9d:29:48:aa:42.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'master,192.168.233.128' (ECDSA) to the list of known hosts.
Last login: Thu May 18 00:40:40 2017
[hadoop@localhost ~]$ ssh Master
Last login: Thu May 18 00:45:29 2017 from master
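Note that /etc/hosts only maps the name Master to an IP; the prompt still says localhost because the machine's own hostname is unchanged. If you want the host itself to report Master, CentOS 7 provides hostnamectl (run as root); a sketch:

```shell
# Set the static hostname persistently (survives reboots on CentOS 7).
hostnamectl set-hostname Master
hostname    # should now print Master
```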

 

Edit conf/settings.xml under the Maven install directory (here /home/hadoop/java/apache-maven-3.3.9/conf/settings.xml):

 

<mirrors>  
    <!-- mirror  
     | Specifies a repository mirror site to use instead of a given repository. The repository that  
     | this mirror serves has an ID that matches the mirrorOf element of this mirror. IDs are used  
     | for inheritance and direct lookup purposes, and must be unique across the set of mirrors.  
     |-->  
    <!-- Aliyun mirror -->  
        <mirror>  
            <id>alimaven</id>  
            <mirrorOf>central</mirrorOf>  
            <name>aliyun maven</name>  
            <url>http://maven.aliyun.com/nexus/content/repositories/central/</url>  
        </mirror>  
</mirrors> 


Even with the mirror in place, the Hadoop build could still be seen downloading jars from http://repository.jboss.org, so I also edited the pom.xml in the Hadoop source tree:
/home/hadoop/java/hadoop-2.7.3-src/pom.xml

 

<repositories>
    <repository>  <!-- added -->
      <id>alimaven</id>
      <name>aliyun maven</name>
      <url>http://maven.aliyun.com/nexus/content/repositories/central/</url>
    </repository>
    <repository>  <!-- existing -->
      <id>${distMgmtSnapshotsId}</id>
      <name>${distMgmtSnapshotsName}</name>
      <url>${distMgmtSnapshotsUrl}</url>
    </repository>
    <repository>  <!-- existing -->
      <id>repository.jboss.org</id>
      <url>http://repository.jboss.org/nexus/content/groups/public/</url>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
  </repositories>


Build and install Hadoop 2.7.3


[hadoop@localhost java]$ ls -l
total 226968
drwxrwxr-x.  6 hadoop hadoop        99 May 17 23:17 apache-maven-3.3.9
-rwxrwxr-x.  1 hadoop hadoop   8491533 May 17 23:16 apache-maven-3.3.9-bin.tar.gz
drwxr-xr-x. 16 hadoop hadoop      4096 Aug 17  2016 hadoop-2.7.3-src
-rwxrwxr-x.  1 hadoop hadoop  18258529 May 17 21:02 hadoop-2.7.3-src.tar.gz
drwxr-xr-x.  8 hadoop hadoop       255 Mar 15 01:35 jdk1.8.0_131
-rwxrwxr-x.  1 hadoop hadoop 185540433 May 17 21:01 jdk-8u131-linux-x64.tar.gz
drwxr-xr-x. 10 109965   5000      4096 Feb 26  2013 protobuf-2.5.0
-rwxrwxr-x.  1 hadoop hadoop   2401901 May 17 21:01 protobuf-2.5.0.tar.gz
drwxr-xr-x. 10 hadoop hadoop      4096 Feb 20  2014 zookeeper-3.4.6
-rwxrwxr-x.  1 hadoop hadoop  17699306 May 17 21:01 zookeeper-3.4.6.tar.gz
[hadoop@localhost java]$ cd hadoop-2.7.3-src
[hadoop@localhost hadoop-2.7.3-src]$ 
[hadoop@localhost hadoop-2.7.3-src]$ 
[hadoop@localhost hadoop-2.7.3-src]$ mvn clean package -Pdist,native -DskipTests -Dtar


If the native build fails with missing-library errors, install the following and rerun:

yum -y install zlib-devel
yum -y install zlib

yum install lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel


[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /home/hadoop/java/hadoop-2.7.3-src/hadoop-dist/target/hadoop-dist-2.7.3-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [ 10.386 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  9.396 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  3.513 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  6.965 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.314 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  7.110 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  6.889 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  8.779 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 31.643 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  6.235 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [06:36 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 18.200 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 23.698 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.128 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [11:24 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 36.725 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [  9.613 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  5.858 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.036 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.109 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [ 41.020 s]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 38.934 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.095 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 11.688 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 23.517 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  5.830 s]
[INFO] hadoop-yarn-server-applicationhis... (remaining build output truncated)
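When the reactor finishes, the binary distribution lands under hadoop-dist/target (the tarball comes from the -Dtar flag). A sketch of locating and unpacking it, assuming the source directory used above:

```shell
# Run from /home/hadoop/java/hadoop-2.7.3-src after a successful build.
ls -lh hadoop-dist/target/hadoop-2.7.3.tar.gz
tar -xzf hadoop-dist/target/hadoop-2.7.3.tar.gz -C /home/hadoop/java
/home/hadoop/java/hadoop-2.7.3/bin/hadoop version   # quick sanity check
```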