Installing Hadoop 3.3.0 and Hive 3.1.2 on Ubuntu 20.04.1 LTS

The installation drew on various online tutorials; this post consolidates the steps into one place.

First, a look at everything running on this machine.

  • Check the system environment
:~$ uname -srmo
Linux 5.4.0-48-generic x86_64 GNU/Linux
  • Check the Java environment
:~$ java -version
openjdk version "1.8.0_265"
OpenJDK Runtime Environment (build 1.8.0_265-8u265-b01-0ubuntu2~20.04-b01)
OpenJDK 64-Bit Server VM (build 25.265-b01, mixed mode)
  • Start Hadoop (a jps sanity check is sketched after this list)
:~$ start-all.sh 
WARNING: Attempting to start all Apache Hadoop daemons as *** in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
Starting datanodes
Starting secondary namenodes [******]
Starting resourcemanager
Starting nodemanagers
  • Start MySQL
:~$ service mysql start
  • Start Hive
:~$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = cdacc36f-3cf7-40de-833b-503f04b0db09

Logging initialized using configuration in file:/opt/apache-hive-bin/conf/hive-log4j2.properties Async: true
Hive Session ID = b56e7d35-a955-456a-8d21-6712e92891ad
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>
  • Run a Hive query
hive> use hive;
OK
Time taken: 0.997 seconds
hive> select * from env;
OK
1	hadoop	hadoop-3.3.0.tar.gz	3.3.0	/opt/hadoop
2	hive	apache-hive-3.1.2-bin.tar.gz	3.1.2	/opt/apache-hive-bin
3	mysql	mysql-server	8.0.21	/usr/bin/mysql
Time taken: 1.983 seconds, Fetched: 3 row(s)
hive> 
  • Inspect the Hive table files under the HDFS warehouse path (a cat example follows this list)
:~$ hdfs dfs -ls -R /user/hive/warehouse
drwxr-xr-x   - *** supergroup          0 2020-10-08 11:37 /user/hive/warehouse/hive.db
drwxr-xr-x   - *** supergroup          0 2020-10-08 11:51 /user/hive/warehouse/hive.db/env
-rw-r--r--   1 *** supergroup        897 2020-10-08 11:51 /user/hive/warehouse/hive.db/env/000000_0
  • Check the install paths of Hadoop, Hive, and MySQL
:~$ which hadoop
/opt/hadoop/bin/hadoop
:~$ which hive
/opt/apache-hive-bin/bin/hive
:~$ which mysql
/usr/bin/mysql
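
A couple of follow-up checks on the list above, all standard Hadoop/JDK tooling rather than output captured from this machine: jps (shipped with the JDK) should list NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager once start-all.sh finishes, and hdfs dfs -cat dumps the table file that the -ls listing revealed.

:~$ jps
:~$ hdfs dfs -cat /user/hive/warehouse/hive.db/env/000000_0

Incidentally, the duplicate SLF4J binding warning at Hive startup is harmless. A common workaround, assumed rather than taken from the original transcript, is to move Hive's bundled binding aside so only Hadoop's remains:

:~$ mv /opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar /opt/apache-hive-bin/lib/log4j-slf4j-impl-2.10.0.jar.bak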

Preparation

  • Machine

Machine (VM): Ubuntu 20.04.1 LTS, with OpenJDK 1.8 already installed

  • Install packages & executables

Hadoop package: hadoop-3.3.0.tar.gz (download link)
Hive package: apache-hive-3.1.2-bin.tar.gz (download link)
mysql-connector-java: mysql-connector-java-8.0.21.jar (download link)
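
If the packages still need to be fetched, the Apache archive and Maven Central are the usual sources; the URLs below follow those sites' standard layouts and are assumptions to verify before downloading:

:~$ wget https://archive.apache.org/dist/hadoop/common/hadoop-3.3.0/hadoop-3.3.0.tar.gz
:~$ wget https://archive.apache.org/dist/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz
:~$ wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/8.0.21/mysql-connector-java-8.0.21.jar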

Software setup

  • Install openssh-server
:~$ sudo apt install openssh-server
  • Check that ssh was installed successfully
:~$ ssh localhost

If the install succeeded, ssh prompts for a password; after you enter it, the login banner confirms the connection:

***@localhost's password: 
Welcome to Ubuntu 20.04.1 LTS (GNU/Linux 5.4.0-48-generic x86_64)

 * Documentation:  https://help.ubuntu.com
 * Management:     https://landscape.canonical.com
 * Support:        https://ubuntu.com/advantage

0 updates can be installed immediately.
0 of these updates are security updates.

Your Hardware Enablement Stack (HWE) is supported until April 2025.
Last login: Tue Oct  6 14:37:47 2020 from 127.0.0.1
  • Exit the ssh session
:~$ logout
  • Configure passwordless ssh login (creates the authorized_keys file)
:~$ cd ./.ssh
:~/.ssh$ ls
id_rsa  id_rsa.pub  known_hosts
:~/.ssh$ cat ./id_rsa.pub >> ./authorized_keys
:~/.ssh$ ls
authorized_keys  id_rsa  id_rsa.pub  known_hosts

Note: the ~/.ssh directory and the known_hosts file are created the first time ssh is used (the ssh localhost test above); the id_rsa key pair shown in the listing had already been generated on this machine, as installing openssh-server does not create one automatically.
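
If no key pair exists yet, generating one first is a standard OpenSSH step (the commands below are not part of the original transcript; the chmod covers setups where sshd insists on strict permissions):

:~$ ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
:~$ chmod 600 ~/.ssh/authorized_keys

After the authorized_keys step above, ssh localhost should log in without prompting for a password.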

Installing Hadoop

Extract the Hadoop archive

:~$ sudo tar -zxvf ./hadoop-3.3.0.tar.gz -C /opt
:~$ cd /opt
:/opt$ sudo mv ./hadoop-3.3.0 ./hadoop
:/opt$ sudo chgrp -R root ./hadoop
:/opt$ sudo chown -R root ./hadoop
:/opt$ sudo chmod -R 755 ./hadoop
:/opt$ ls -al | grep 'hadoop'
drwxr-xr-x  9 root root  4096 9月  11  2019 hadoop

Configure the Hadoop environment variables

  • Add the Hadoop environment variables
:/opt$ cd
:~$ vim ./.bashrc

In vim, add the HADOOP_HOME, HADOOP_INSTALL, HADOOP_MAPRED_HOME, HADOOP_COMMON_HOME, HADOOP_HDFS_HOME, YARN_HOME, PATH, and HADOOP_CONF_DIR settings, for example:

export HADOOP_HOME=/opt/hadoop
export HADOOP_INSTALL=${HADOOP_HOME}
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
export YARN_HOME=${HADOOP_HOME}
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export PATH=${PATH}:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin
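
After saving, reload the configuration and confirm that the hadoop binary resolves:

:~$ source ./.bashrc
:~$ hadoop version

Note that bin/hadoop also needs JAVA_HOME, which the list above does not set. If hadoop version complains that JAVA_HOME is not set, add the line below to ./.bashrc as well; the path is the usual location of the Ubuntu OpenJDK 8 package and is an assumption, not taken from the original post:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64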