Installing and configuring Hadoop on 32-bit Ubuntu 14.04

It took me a full day to get Hadoop installed and configured. Below are the steps, shared so others can follow along:

1. Download Hadoop from http://hadoop.apache.org/releases.html; I chose hadoop-2.7.6.tar.gz.

In the ~/software/hadoop/ directory, run wget http://ftp.cuhk.edu.hk/pub/packages/apache.org/hadoop/common/hadoop-2.7.6/hadoop-2.7.6.tar.gz, then run tar zxvf hadoop-2.7.6.tar.gz to extract it into the current directory. Next, run sudo cp -r hadoop-2.7.6 /usr/lib/hadoop to copy the Hadoop files into the target directory. With that done, configuration can begin.

2. Open the profile file in /etc to configure the environment variables, adding the following at the end of the file:

# set hadoop environment
export HADOOP_HOME=/usr/lib/hadoop/hadoop-2.7.6
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

Then run source profile to make the changes take effect.
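After sourcing the file, it is worth confirming that both Hadoop directories actually landed on PATH. A minimal check, restating the two export lines from above so the snippet is self-contained:

```shell
# Re-create the two lines added to /etc/profile, then verify that both
# hadoop directories appear on PATH.
export HADOOP_HOME=/usr/lib/hadoop/hadoop-2.7.6
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

for d in "$HADOOP_HOME/bin" "$HADOOP_HOME/sbin"; do
  case ":$PATH:" in
    *":$d:"*) echo "on PATH: $d" ;;
    *)        echo "MISSING: $d" ;;
  esac
done
```

If either line prints MISSING, the profile edit was not picked up by the current shell.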

3. Go to /usr/lib/hadoop/hadoop-2.7.6/etc/hadoop and open hadoop-env.sh, adding the following (the exact JAVA_HOME path depends on your own installation):

export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_71
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/"

4. Go to /usr/lib/hadoop/hadoop-2.7.6/sbin and open start-dfs.sh, adding the following:

HDFS_DATANODE_USER=root
HADOOP_SECURE_DN_USER=hdfs
HDFS_NAMENODE_USER=root
HDFS_SECONDARYNAMENODE_USER=root

5. Now go to /usr/lib/hadoop/hadoop-2.7.6/sbin and run start-dfs.sh to start the daemons. It failed with the following errors:

WARNING: HADOOP_SECURE_DN_USER has been replaced by HDFS_DATANODE_SECURE_USER. Using value of HADOOP_SECURE_DN_USER.
Starting namenodes on [localhost]
Starting datanodes
localhost: ERROR: Cannot set priority of datanode process 14988
Starting secondary namenodes [wchw1011-Lenovo-Y430P]
2018-07-10 19:14:15,098 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable


After some research I learned that the official Hadoop binaries are built for 64-bit systems, while my machine runs a 32-bit OS. So I needed to compile the Hadoop source code myself.
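Before rebuilding, the architecture mismatch can be confirmed directly with the `file` utility (the library path below assumes the install location from step 1):

```shell
# Inspect the prebuilt native library shipped in the binary tarball;
# on the stock Apache 2.7.6 release this typically reports a 64-bit
# x86-64 ELF shared object, which a 32-bit JVM cannot load.
file /usr/lib/hadoop/hadoop-2.7.6/lib/native/libhadoop.so.1.0.0

# Compare with one of the system's own binaries, which on this
# machine would report a 32-bit Intel 80386 ELF executable.
file /bin/ls
```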


6. Go to the ~/software/hadoop directory and download the source code with wget http://ftp.cuhk.edu.hk/pub/packages/apache.org/hadoop/common/hadoop-2.7.6/hadoop-2.7.6-src.tar.gz, then extract it into the current directory.


7. Before compiling the source, roughly the following software needs to be installed:

(1) JDK 1.7+

(2) Maven 3.0 or later

(3) findbugs 1.3.9

(4) protocol buffer 2.5.0 (it must be exactly this version)

(5) cmake 2.6

(6) zlib-devel

(7) openssl-devel

(8) libssl-dev
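On Ubuntu most of these prerequisites can come from apt (the package names below are my assumption for 14.04; protobuf usually has to be built from source, since the repo version may not be exactly 2.5.0). The post does not show the build invocation itself; the standard native build command from Hadoop's own BUILDING.txt is sketched here:

```shell
# Prerequisites via apt (assumed Ubuntu 14.04 package names)
sudo apt-get install -y openjdk-7-jdk maven cmake zlib1g-dev libssl-dev findbugs

# protobuf must be exactly 2.5.0 -- verify before building:
protoc --version   # should print "libprotoc 2.5.0"

# Build the native distribution from the source tree extracted in step 6
cd ~/software/hadoop/hadoop-2.7.6-src
mvn package -Pdist,native -DskipTests -Dtar
```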

8. After a successful build, the compiled distribution is at hadoop-2.7.6/hadoop-dist/target/hadoop-2.7.6.tar.gz. Go to that directory and use its libhadoop.so.1.0.0 to replace the file of the same name in /usr/lib/hadoop/hadoop-2.7.6/lib/native.
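The replacement described above can be sketched as follows (paths taken from the post; backing up the original file first is my addition):

```shell
# Unpack the freshly built distribution produced by the native build.
cd ~/software/hadoop/hadoop-2.7.6-src/hadoop-dist/target
tar zxvf hadoop-2.7.6.tar.gz

# Back up the 64-bit library, then drop in the 32-bit one just built.
NATIVE=/usr/lib/hadoop/hadoop-2.7.6/lib/native
sudo cp "$NATIVE/libhadoop.so.1.0.0" "$NATIVE/libhadoop.so.1.0.0.bak"
sudo cp hadoop-2.7.6/lib/native/libhadoop.so.1.0.0 "$NATIVE/"
```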

9. Go to /usr/lib/hadoop/hadoop-2.7.6/sbin and run start-dfs.sh again. This time it failed with the following errors:

Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
Starting namenodes on []
Error: Cannot find configuration directory: /etc/hadoop
Error: Cannot find configuration directory: /etc/hadoop
Starting secondary namenodes [0.0.0.0]
Error: Cannot find configuration directory: /etc/hadoop

Fix: go to /usr/lib/hadoop/hadoop-2.7.6/etc/hadoop and add the following configuration to core-site.xml, then reboot the machine.

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://127.0.0.1:9000</value>
  </property>
</configuration>
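One step the post never shows: a brand-new HDFS normally has to be formatted once before start-dfs.sh can bring the namenode up. If the daemons still fail after the core-site.xml change, this is worth checking (a standard Hadoop command, my suggestion rather than something the post did):

```shell
# One-time initialization of the namenode metadata directory.
# WARNING: this erases any existing HDFS metadata.
hdfs namenode -format
```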

10. Go to /usr/lib/hadoop/hadoop-2.7.6/sbin and run start-dfs.sh once more. This time it produced the following output:

Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/lib/hadoop/hadoop-2.7.6/logs/hadoop-root-namenode-wchw1011-Lenovo-Y430P.out
localhost: starting datanode, logging to /usr/lib/hadoop/hadoop-2.7.6/logs/hadoop-root-datanode-wchw1011-Lenovo-Y430P.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is b5:e3:a2:8f:f7:54:00:09:bd:e1:95:3a:94:90:de:dd.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (ECDSA) to the list of known hosts.
0.0.0.0: starting secondarynamenode, logging to /usr/lib/hadoop/hadoop-2.7.6/logs/hadoop-root-secondarynamenode-wchw1011-Lenovo-Y430P.out

Attempted fix: run sudo ufw disable to turn off the firewall.
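Note that the 0.0.0.0 prompt in step 10 is not actually an error but a first-time SSH host-key confirmation: start-dfs.sh reaches the secondary namenode over SSH. The usual pseudo-distributed setup (standard OpenSSH commands, my suggestion rather than the post's fix) is passwordless SSH to localhost:

```shell
# Generate a key pair (skip if ~/.ssh/id_rsa already exists),
# then authorize it for logins back into this machine.
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# Pre-accept the host keys so start-dfs.sh never prompts.
ssh-keyscan localhost 0.0.0.0 >> ~/.ssh/known_hosts
```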

11. Problem solved.
