Hadoop Pseudo-Distributed Mode Deployment

Prerequisites and software
Hadoop version: hadoop-2.6.0-cdh5.7.0
JDK version: jdk-8u45-linux-x64
The Linux host must have the ssh service installed.
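
A quick way to confirm the ssh daemon is present before starting (a minimal check; the exact service-management command differs by distribution):

ps -ef | grep sshd      # the sshd process should be listed
ssh -V                  # confirms an ssh client is installed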

1.Create the hadoop user and obtain the Hadoop installation package
wget http://archive.cloudera.com/cdh5/cdh/5/hadoop-2.6.0-cdh5.7.0.tar.gz
rz (upload command)

[root@10-9-15-140 ~]# useradd hadoop
[root@10-9-15-140 ~]# su - hadoop
[hadoop@10-9-15-140 ~]$
[hadoop@10-9-15-140 ~]$ mkdir app
[hadoop@10-9-15-140 ~]$ cd app/
[hadoop@10-9-15-140 app]$ wget http://archive.cloudera.com/cdh5/cdh/5/hadoop-2.6.0-cdh5.7.0.tar.gz
or
[hadoop@10-9-15-140 app]$ rz

2.JDK deployment
CDH environment conventions:
mkdir /usr/java          # directory for the JDK installation
mkdir /usr/share/java    # CDH also needs the MySQL JDBC jar placed here
rz                       # upload jdk-8u45-linux-x64.gz

Extract the archive:
[root@10-9-15-140 java]# tar -xzvf jdk-8u45-linux-x64.gz
[root@10-9-15-140 java]# ll
total 169216
drwxr-xr-x 8 uucp      143      4096 Apr 11  2015 jdk1.8.0_45
-rw-r--r-- 1 hadoop hadoop 173271626 Jun 30 23:59 jdk-8u45-linux-x64.gz
Change ownership:
[root@10-9-15-140 java]# chown -R root:root jdk1.8.0_45
[root@10-9-15-140 java]# ll
total 169216
drwxr-xr-x 8 root   root        4096 Apr 11  2015 jdk1.8.0_45
-rw-r--r-- 1 hadoop hadoop 173271626 Jun 30 23:59 jdk-8u45-linux-x64.gz
[root@10-9-15-140 java]# vi /etc/profile

#env
export JAVA_HOME=/usr/share/java/jdk1.8.0_45
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH


[root@10-9-15-140 java]# source /etc/profile
[root@10-9-15-140 java]# which java
/usr/share/java/jdk1.8.0_45/bin/java
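
As an extra sanity check that the profile changes took effect, the JDK version can be queried directly (for this package it should report 1.8.0_45):

$ java -version        # should report version 1.8.0_45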

3.Extract the Hadoop package

[hadoop@10-9-15-140 app]$ tar -xzvf hadoop-2.6.0-cdh5.7.0.tar.gz
[hadoop@10-9-15-140 app]$ ll
total 304288
drwxr-xr-x 14 hadoop hadoop      4096 Mar 24  2016 hadoop-2.6.0-cdh5.7.0
-rw-r--r--  1 hadoop hadoop 311585484 Jun 30 23:59 hadoop-2.6.0-cdh5.7.0.tar.gz
[hadoop@10-9-15-140 app]$ cd hadoop-2.6.0-cdh5.7.0
[hadoop@10-9-15-140 hadoop-2.6.0-cdh5.7.0]$ ll
total 76
drwxr-xr-x  2 hadoop hadoop  4096 Mar 24  2016 bin		*executable scripts
drwxr-xr-x  2 hadoop hadoop  4096 Mar 24  2016 bin-mapreduce1
drwxr-xr-x  3 hadoop hadoop  4096 Mar 24  2016 cloudera
drwxr-xr-x  6 hadoop hadoop  4096 Mar 24  2016 etc		*configuration directory (conf)
drwxr-xr-x  5 hadoop hadoop  4096 Mar 24  2016 examples
drwxr-xr-x  3 hadoop hadoop  4096 Mar 24  2016 examples-mapreduce1
drwxr-xr-x  2 hadoop hadoop  4096 Mar 24  2016 include
drwxr-xr-x  3 hadoop hadoop  4096 Mar 24  2016 lib		*jar package directory
drwxr-xr-x  2 hadoop hadoop  4096 Mar 24  2016 libexec
-rw-r--r--  1 hadoop hadoop 17087 Mar 24  2016 LICENSE.txt
-rw-r--r--  1 hadoop hadoop   101 Mar 24  2016 NOTICE.txt
-rw-r--r--  1 hadoop hadoop  1366 Mar 24  2016 README.txt
drwxr-xr-x  3 hadoop hadoop  4096 Mar 24  2016 sbin		*start/stop scripts for Hadoop components
drwxr-xr-x  4 hadoop hadoop  4096 Mar 24  2016 share
drwxr-xr-x 17 hadoop hadoop  4096 Mar 24  2016 src

4.Configuration

Edit the following configuration files:
/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/etc/hadoop/core-site.xml:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>


/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/etc/hadoop/hdfs-site.xml:
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
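
Once both files are saved, the effective values can be double-checked from the Hadoop home directory (a quick sanity check using the getconf tool):

$ bin/hdfs getconf -confKey fs.defaultFS       # should print hdfs://localhost:9000
$ bin/hdfs getconf -confKey dfs.replication    # should print 1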

5.Configure passwordless ssh to localhost

[hadoop@10-9-15-140 ~]$ ssh-keygen
Generating public/private rsa key pair.
Enter file in which to save the key (/home/hadoop/.ssh/id_rsa): 
Created directory '/home/hadoop/.ssh'.
Enter passphrase (empty for no passphrase): 
Enter same passphrase again: 
Your identification has been saved in /home/hadoop/.ssh/id_rsa.
Your public key has been saved in /home/hadoop/.ssh/id_rsa.pub.
The key fingerprint is:
53:21:86:1c:af:cd:f3:c4:45:a4:ca:72:ea:83:58:37 hadoop@10-9-15-140
The key's randomart image is:
+--[ RSA 2048]----+
|     ..oo ..o    |
|      oo . +     |
|        . o .    |
|       = + .     |
|      o S o      |
|    . E+ =       |
|   o o..  .      |
|  . ...          |
|      ..         |
+-----------------+
[hadoop@10-9-15-140 ~]$ cd .ssh
[hadoop@10-9-15-140 .ssh]$ ll
total 8
-rw------- 1 hadoop hadoop 1675 Jul  2 01:18 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jul  2 01:18 id_rsa.pub
[hadoop@10-9-15-140 .ssh]$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
[hadoop@10-9-15-140 .ssh]$ ll
total 12
-rw-rw-r-- 1 hadoop hadoop  400 Jul  2 01:18 authorized_keys
-rw------- 1 hadoop hadoop 1675 Jul  2 01:18 id_rsa		private key
-rw-r--r-- 1 hadoop hadoop  400 Jul  2 01:18 id_rsa.pub	public key

At this point "ssh localhost date" still prompts for a password. Hadoop's start/stop scripts log in over ssh,
so the hadoop user needs a passwordless trusted connection to localhost.
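
For reference, the whole key setup (the ssh-keygen above plus the chmod below) can also be done non-interactively in three commands (a minimal sketch assuming RSA keys and the default file locations):

$ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa         # generate a key pair with an empty passphrase
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys  # authorize the key for this user
$ chmod 600 ~/.ssh/authorized_keys                 # sshd requires restrictive permissions on this file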

Change permission
[hadoop@10-9-15-140 .ssh]$ chmod 600 authorized_keys
[hadoop@10-9-15-140 .ssh]$ ssh localhost date
The authenticity of host 'localhost (127.0.0.1)' can't be established.
RSA key fingerprint is 68:d4:3b:cb:d1:f9:dc:23:65:81:a9:a0:fd:e9:ec:3b.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'localhost' (RSA) to the list of known hosts.
Tue Jul  2 01:19:38 CST 2019
# ssh test succeeded: no password prompt
[hadoop@10-9-15-140 .ssh]$ ssh localhost date
Tue Jul  2 01:19:53 CST 2019

6.Format the NameNode

[hadoop@10-9-15-140 hadoop-2.6.0-cdh5.7.0]$ bin/hdfs namenode -format
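
If the format succeeds, the NameNode metadata directory is created; it can be inspected like this (a sketch that assumes the default hadoop.tmp.dir of /tmp/hadoop-hadoop; the location changes if hadoop.tmp.dir or dfs.namenode.name.dir is overridden):

$ ls /tmp/hadoop-hadoop/dfs/name/current       # a VERSION file and an fsimage file indicate a formatted NameNode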

7.Start HDFS

[hadoop@10-9-15-140 hadoop-2.6.0-cdh5.7.0]$ sbin/start-dfs.sh
19/07/02 09:48:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/hadoop-hadoop-namenode-10-9-15-140.out
localhost: starting datanode, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/hadoop-hadoop-datanode-10-9-15-140.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/hadoop-hadoop-secondarynamenode-10-9-15-140.out
19/07/02 09:48:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[hadoop@10-9-15-140 hadoop-2.6.0-cdh5.7.0]$ jps
29539 Jps
29222 DataNode
29405 SecondaryNameNode
29134 NameNode
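
With all three daemons up, a short smoke test confirms that HDFS accepts reads and writes (a minimal sketch run from the Hadoop home directory; /user/hadoop is just an example path):

$ bin/hdfs dfs -mkdir -p /user/hadoop                        # create a home directory in HDFS
$ bin/hdfs dfs -put etc/hadoop/core-site.xml /user/hadoop    # upload a small test file
$ bin/hdfs dfs -ls /user/hadoop                              # the uploaded file should be listed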

Two issues:

Issue One:
start-dfs.sh reported: Error: Cannot find configuration directory: /etc/hadoop
Solution:
In hadoop-env.sh, change

export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/etc/hadoop"}

to the absolute path of the configuration directory:

export HADOOP_CONF_DIR=/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/etc/hadoop

Issue Two:
After installing Hadoop, start-dfs.sh reported: Error: JAVA_HOME is not set and could not be found.

Solution:
Edit etc/hadoop/hadoop-env.sh (under the Hadoop home directory) and set JAVA_HOME explicitly,
using an absolute path:

export JAVA_HOME=$JAVA_HOME                           # incorrect: the variable is not visible to the ssh sessions that start the daemons

export JAVA_HOME=/usr/share/java/jdk1.8.0_45          # correct: the JDK installation directory

8.Environment Variable

[hadoop@10-9-15-140 hadoop-2.6.0-cdh5.7.0]$ cat ~/.bash_profile
# .bash_profile

# Get the aliases and functions
if [ -f ~/.bashrc ]; then
	. ~/.bashrc
fi

# User specific environment and startup programs

PATH=$PATH:$HOME/bin

export PATH
export HADOOP_PREFIX=/home/hadoop/app/hadoop-2.6.0-cdh5.7.0
export PATH=$HADOOP_PREFIX/bin:$PATH
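
After editing, reload the profile and confirm the Hadoop commands resolve through the new PATH (a quick check; the resolved path depends on the install location):

$ source ~/.bash_profile
$ which hdfs          # should resolve under /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/bin
$ hdfs version        # prints the Hadoop/CDH build information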

9.Browse the NameNode web UI

Open the NameNode status page in a web browser:
http://<NameNode machine IP address>:50070/

Note: on a cloud machine, port 50070 must be opened in the firewall / security group first.
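
For example, on a host running firewalld, opening the port could look like this (a sketch only; on public cloud instances the provider's security-group console is usually the right place instead):

# run as root; assumes firewalld is the active firewall
firewall-cmd --permanent --add-port=50070/tcp
firewall-cmd --reload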

10.Stop NameNode and DataNode
$ sbin/stop-dfs.sh

[hadoop@10-9-15-140 hadoop-2.6.0-cdh5.7.0]$ sbin/stop-dfs.sh
19/07/03 00:10:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Stopping namenodes on [localhost]
localhost: stopping namenode
localhost: stopping datanode
Stopping secondary namenodes [0.0.0.0]
0.0.0.0: stopping secondarynamenode
19/07/03 00:10:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[hadoop@10-9-15-140 hadoop-2.6.0-cdh5.7.0]$ jps
23642 Jps

Reference:
Apache Hadoop: http://hadoop.apache.org
CDH Hadoop: http://archive.cloudera.com/cdh5/cdh/5/
