Connecting Hive to MySQL on a CentOS 7 Virtual Machine

Installing Hive

1. Download apache-hive-2.3.4-bin.tar.gz
2. Unpack the archive:  tar -xvf file.tar       # unpack a .tar
                        tar -zxvf file.tar.gz   # unpack a .tar.gz
                        tar -jxvf file.tar.bz2  # unpack a .tar.bz2
                        tar -Zxvf file.tar.Z    # unpack a .tar.Z
3. Edit /etc/profile and add:
   export HIVE_HOME=/usr/apache-hive-2.3.4-bin
   export PATH=$HIVE_HOME/bin:$PATH
4. Apply the configuration: source /etc/profile
   Run echo $PATH to check that it took effect.
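
A quick way to confirm the variables took effect (run in the same shell):

source /etc/profile
echo $HIVE_HOME    # should print /usr/apache-hive-2.3.4-bin
which hive         # should resolve to /usr/apache-hive-2.3.4-bin/bin/hive
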
5. Go into the conf directory under the Hive home, copy hive-default.xml.template,
   and rename the copy to hive-site.xml:
   cp -b hive-default.xml.template hive-site.xml

6. Run the following commands:

hadoop fs -mkdir /tmp/hive
hadoop fs -mkdir /hive/warehouse
hadoop fs -chmod g+w /tmp/hive
hadoop fs -chmod g+w /hive/warehouse
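
To confirm the directories exist with the group write bit set (if /hive does not exist yet, hadoop fs -mkdir -p creates the missing parents), a quick check:

hadoop fs -ls -d /tmp/hive /hive/warehouse
# both entries should show permissions like drwxrwxr-x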


Edit hive-site.xml and change the following properties:

<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/bigdata/tmp/hive/local</value><!-- local scratch directory -->
  <description>Local scratch space for Hive jobs</description>
</property>
<property>
  <name>hive.querylog.location</name><!-- directory for Hive's structured runtime logs -->
  <value>/bigdata/tmp/hive</value>
  <description>Location of Hive run time structured log file</description>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/bigdata/tmp/hive</value><!-- local temporary directory for resources added from a remote file system -->
  <description>Temporary local directory for added resources in the remote file system.</description>
</property>
<property>
  <name>hive.exec.mode.local.auto</name>
  <value>true</value><!-- let Hive decide on its own whether to run a job in local mode -->
  <description>Let Hive determine whether to run in local mode automatically</description>
</property>
<property><!-- top-level directory for operation logs when operation logging is enabled -->
  <name>hive.server2.logging.operation.log.location</name>
  <value>/bigdata/tmp/hive</value>
  <description>Top level directory where operation logs are stored if logging functionality is enabled</description>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name><!-- HDFS directory backing the default database -->
  <value>/hive/warehouse</value>
  <description>location of default database for the warehouse</description>
</property>
<property>
  <name>hive.exec.scratchdir</name><!-- HDFS scratch path for Hive jobs -->
  <value>/tmp/hive</value>
  <description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/&lt;username&gt; is created, with ${hive.scratch.dir.permission}.</description>
</property>
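
The /bigdata/tmp/hive paths above are ordinary local Linux directories (not HDFS paths); they may need to exist and be writable by the user running Hive, so it is simplest to prepare them up front:

mkdir -p /bigdata/tmp/hive/local
chmod -R 777 /bigdata/tmp/hive   # wide-open permissions are fine for a single-user test VM; tighten if needed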

I had fiddled with an earlier MySQL installation and no longer remember what state it was in, so I removed it completely before starting over.

Removing MySQL

Reference: https://www.cnblogs.com/Lenbrother/articles/6203620.html

1. Running only yum remove mysql still left many MySQL packages behind according to rpm -qa | grep mysql.

2. So I then ran yum remove mysql-community-common-5.6.44-2.el7.x86_64 (i.e. the packages listed by the command above).

3. After that, rpm -qa | grep mysql returned nothing.

4. Then find / -name mysql to locate the remaining MySQL files and directories.

5. Delete the directories:

rm -rf /usr/share/mysql
rm -rf /var/log/mysqld.log
rm -rf /var/lib/mysql

Installing MySQL and Connecting It to Hive

Reference: https://blog.csdn.net/a774630093/article/details/79270080

1. rpm -qa | grep mysql now returns nothing.

2. Install:

# run as root
yum install mysql
yum install wget
# download the MySQL yum repository package
wget http://repo.mysql.com/mysql-community-release-el7-5.noarch.rpm
# install the repo package
rpm -ivh mysql-community-release-el7-5.noarch.rpm
cd /etc/yum.repos.d/
ls   # mysql-community.repo and mysql-community-source.repo should now be present

yum install mysql-server
After installation,
rpm -qa | grep mysql
shows:
mysql-community-release-el7-5.noarch
mysql-community-libs-5.6.44-2.el7.x86_64
mysql-community-server-5.6.44-2.el7.x86_64
mysql-community-common-5.6.44-2.el7.x86_64
mysql-community-client-5.6.44-2.el7.x86_64

3.

mysql -u root

fails with: ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2)

chown -R openscanner:openscanner /var/lib/mysql
which reports the error
chown: invalid user: ‘openscanner:openscanner’
cd /var/lib
chown root /var/lib/mysql/
ll

4. Restart the service:

service mysqld restart

5.

mysql -u root -p

Just press Enter at the password prompt (no password has been set yet).

6.

[root@master lib]# service mysqld restart
Redirecting to /bin/systemctl restart mysqld.service
[root@master lib]# mysql -u root -p
Enter password: (no password needed, just press Enter)
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 2
Server version: 5.6.44 MySQL Community Server (GPL)

Copyright (c) 2000, 2019, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> use mysql;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> update user set password=password('123456') where user='root';
Query OK, 4 rows affected (0.00 sec)
Rows matched: 4  Changed: 4  Warnings: 0

mysql> exit
Bye
[root@master lib]# service mysqld restart(systemctl start firewalld.service)
Redirecting to /bin/systemctl restart mysqld.service

7. Delete the anonymous users


[root@master lib]# mysql -u root -p
Enter password: (password: 123456)
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 3
Server version: 5.6.44 MySQL Community Server (GPL)

Copyright (c) 2000, 2019, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> select user,host from mysql.user
    -> \c      (mistyped the statement; \c cancels it)
mysql> select user,host from mysql.user;    (correct command)
+------+-----------+
| user | host      |
+------+-----------+
| root | 127.0.0.1 |
| root | ::1       |
|      | localhost |
| root | localhost |
|      | master    |
| root | master    |
+------+-----------+
6 rows in set (0.00 sec)

mysql> delete from mysql.user where user='';    -- delete the anonymous users
Query OK, 2 rows affected (0.00 sec)

mysql> select user,host from mysql.user;    -- check the remaining users
+------+-----------+
| user | host      |
+------+-----------+
| root | 127.0.0.1 |
| root | ::1       |
| root | localhost |
| root | master    |
+------+-----------+
4 rows in set (0.00 sec)

mysql> Ctrl-C -- exit!
Aborted

8. Enable remote connections and set utf8

[root@master /]# mysql -u root -p
Enter password: 123456
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 6
Server version: 5.6.44 MySQL Community Server (GPL)

Copyright (c) 2000, 2019, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> GRANT ALL PRIVILEGES ON *.* TO root@"%" IDENTIFIED BY "root";
Query OK, 0 rows affected (0.00 sec)

mysql> GRANT ALL PRIVILEGES ON *.* TO root@"%" IDENTIFIED BY "root";
Query OK, 0 rows affected (0.00 sec)

mysql> show variables like "%char%";
+--------------------------+----------------------------+
| Variable_name            | Value                      |
+--------------------------+----------------------------+
| character_set_client     | utf8                       |
| character_set_connection | utf8                       |
| character_set_database   | latin1                     |
| character_set_filesystem | binary                     |
| character_set_results    | utf8                       |
| character_set_server     | latin1                     |
| character_set_system     | utf8                       |
| character_sets_dir       | /usr/share/mysql/charsets/ |
+--------------------------+----------------------------+
8 rows in set (0.01 sec)

mysql> set names utf8;
Query OK, 0 rows affected (0.00 sec)

mysql> use test;
ERROR 1049 (42000): Unknown database 'test'
mysql> show variables like "%char%";
+--------------------------+----------------------------+
| Variable_name            | Value                      |
+--------------------------+----------------------------+
| character_set_client     | utf8                       |
| character_set_connection | utf8                       |
| character_set_database   | latin1                     |
| character_set_filesystem | binary                     |
| character_set_results    | utf8                       |
| character_set_server     | latin1                     |
| character_set_system     | utf8                       |
| character_sets_dir       | /usr/share/mysql/charsets/ |
+--------------------------+----------------------------+
8 rows in set (0.00 sec)

mysql> exit
Bye
[root@master /]# firewall-cmd --zone=public --add-port=3306/tcp --permanent
FirewallD is not running
[root@master /]# firewall-cmd --reload
FirewallD is not running

9. Download mysql-connector-java-5.1.47-bin.jar (the archive is downloaded into whatever directory you run wget from; run the following commands in that same directory):

wget http://mirrors.ustc.edu.cn/mysql-ftp/Downloads/Connector-J/mysql-connector-java-5.1.47.tar.gz
tar zxvf mysql-connector-java-5.1.47.tar.gz
cp mysql-connector-java-5.1.47/mysql-connector-java-5.1.47-bin.jar /usr/apache-hive-2.3.4-bin/lib
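
To verify the driver jar actually landed in Hive's lib directory:

ls /usr/apache-hive-2.3.4-bin/lib | grep mysql-connector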

10. But running schematool -dbType mysql -initSchema then fails with:

[root@master bin]# schematool -initSchema -dbType mysql
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/apache-hive-2.3.4-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL:        jdbc:mysql://master/hive?createDatabaseIfNotExist=true
Metastore Connection Driver :    com.mysql.jdbc.Driver
Metastore connection User:       hive
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
Underlying cause: java.sql.SQLException : Access denied for user 'hive'@'master' (using password: YES)
SQL Error code: 1045
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
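
The 1045 error means MySQL itself rejected the login, independently of Hive. A quick way to confirm is to try the same credentials from hive-site.xml (hive/hive) with the plain mysql client:

mysql -u hive -phive -h master
# if this also reports ERROR 1045, the hive user (or its grants) is missing on the MySQL side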

11. Reference: https://www.cnblogs.com/SysoCjs/p/10835954.html

  The initial suspicion was that mysql-connector-java-5.1.47 did not match the installed MySQL version, so MySQL was uninstalled and version 5.7 installed instead.

(As before, the package is downloaded into whatever directory you run the commands from.)

wget http://dev.mysql.com/get/mysql57-community-release-el7-8.noarch.rpm
# this downloads mysql57-community-release-el7-8.noarch.rpm
yum localinstall mysql57-community-release-el7-8.noarch.rpm
[root@master bigdata]# yum repolist enabled | grep "mysql.*-community.*"
mysql-connectors-community/x86_64       MySQL Connectors Community           108
mysql-tools-community/x86_64            MySQL Tools Community                 90
mysql57-community/x86_64                MySQL 5.7 Community Server           347
# install MySQL
yum install mysql-community-server

12. Check the MySQL service status

[root@master bigdata]# systemctl start mysqld
[root@master bigdata]# systemctl status mysqld
● mysqld.service - MySQL Server
   Loaded: loaded (/usr/lib/systemd/system/mysqld.service; enabled; vendor preset: disabled)
   Active: active (running) since Sun 2019-05-26 04:32:22 CST; 12s ago

13. Enable MySQL to start on boot

[root@master bigdata]# systemctl enable mysqld
[root@master bigdata]# systemctl daemon-reload

14. Retrieve the temporary root password generated by MySQL

[root@master bigdata]# grep 'temporary password' /var/log/mysqld.log
2019-05-25T20:32:18.663515Z 1 [Note] A temporary password is generated for root@localhost: A.%MFzb4T8wX

15. Change the password; for now set one that satisfies the default password policy

[root@master bigdata]# mysql -u root -p
Enter password: 
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 4
Server version: 5.7.26

Copyright (c) 2000, 2019, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> set password for 'root'@'localhost'=password('MyNewPass4!'); 
Query OK, 0 rows affected, 1 warning (0.00 sec)

mysql>

16. Relax the password validation policy; after editing, restart the MySQL service with systemctl restart mysqld

[root@master bigdata]# find / -name my.cnf
/etc/my.cnf
[root@master bigdata]# vi /etc/my.cnf
# Disabling symbolic-links is recommended to prevent assorted security risks
symbolic-links=0

log-error=/var/log/mysqld.log
pid-file=/var/run/mysqld/mysqld.pid
validate_password_policy=0
validate_password=off

17. Delete anonymous users

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| mysql              |
| performance_schema |
| sys                |
+--------------------+
4 rows in set (0.00 sec)

mysql> use mysql;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> delete from mysql.user where user='';
Query OK, 0 rows affected (0.00 sec)

18. Allow the root user to log in remotely; run the following inside mysql:

show databases;
use mysql;
update user set host='%' where user='root' and host='localhost';
delete from user where user<>'root';
flush privileges;

19. Set the default character set to utf8

Edit the file with vi /etc/my.cnf:

# Disabling symbolic-links is recommended to prevent assorted security risks
symbolic-links=0

log-error=/var/log/mysqld.log
pid-file=/var/run/mysqld/mysqld.pid
validate_password_policy=0
validate_password=off
character_set_server=utf8
init_connect='SET NAMES utf8'
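
After restarting MySQL, the character-set change can be confirmed the same way as before:

systemctl restart mysqld
mysql -u root -p -e 'show variables like "%char%";'
# character_set_server should now report utf8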

20. Change the password to one of your own choosing


[root@master bigdata]# mysql -u root -p
Enter password: (MyNewPass4!)
ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: YES)
[root@master wuxiaoli]# mysql -u root -p
Enter password: 
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 5
Server version: 5.7.26 MySQL Community Server (GPL)

Copyright (c) 2000, 2019, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> update mysql.user set authentication_string=password('haveaniceday') where user='root';
Query OK, 1 row affected, 1 warning (0.00 sec)
Rows matched: 1  Changed: 1  Warnings: 1

mysql> flush privileges
    -> \c
mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)

mysql> exit
Bye
[root@master bigdata]# mysql -u root -p
Enter password: (haveaniceday)

21. Edit hive-site.xml

 <property><!-- JDBC URL of the MySQL server that stores the metastore -->
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://master/hive?createDatabaseIfNotExist=true</value>
    <!-- master: hostname (an IP address also works); hive: database name; jdbc:mysql: access the database over JDBC -->
    <description>
      JDBC connect string for a JDBC metastore.
      To use SSL to encrypt/authenticate the connection, provide database-specific SSL flag in the connection URL.
      For example, jdbc:postgresql://myhost/db?ssl=true for postgres database.
    </description>
  </property>

 <property><!-- JDBC driver class for MySQL -->
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>

 <property><!-- username used to connect to the MySQL server -->
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
    <description>Username to use against metastore database</description>
  </property>

 <property><!-- password used to connect to the MySQL server -->
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
    <description>password to use against metastore database</description>
  </property>

 <property><!-- let Hive decide on its own whether to run a job in local mode -->
    <name>hive.exec.mode.local.auto</name>
    <value>true</value>
    <description>Let Hive determine whether to run in local mode automatically</description>
  </property>

22. Install the MySQL connector (mysql-connector-java is downloaded into whatever directory you run the first command from):


wget http://mirrors.ustc.edu.cn/mysql-ftp/Downloads/Connector-J/mysql-connector-java-5.1.47.tar.gz
tar zxvf mysql-connector-java-5.1.47.tar.gz
cp mysql-connector-java-5.1.47/mysql-connector-java-5.1.47-bin.jar /usr/apache-hive-2.3.4-bin/lib

23. Start Hive


[root@master bigdata]# start-all.sh
[root@master bigdata]# hive
hive> create database db1;
hive>
[root@master bigdata]# schematool -dbType mysql -initSchema
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/apache-hive-2.3.4-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL:        jdbc:mysql://master/hive?createDatabaseIfNotExist=true
Metastore Connection Driver :    com.mysql.jdbc.Driver
Metastore connection User:       hive
Sun May 26 17:46:26 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
Underlying cause: java.sql.SQLException : Access denied for user 'hive'@'master' (using password: YES)
SQL Error code: 1045
Use --verbose for detailed stacktrace.
*** schemaTool failed ***


24. Initialize (sync) the metastore schema in MySQL; it should finish with schemaTool completed

[root@master apache-hive-2.3.4-bin]# mysql -u root -p
Enter password: (haveaniceday)
# create a database named 'hive', add a database user 'hive', and grant the hive user access from any host
mysql> create database hive;
Query OK, 1 row affected (0.00 sec)

mysql> create user hive IDENTIFIED by 'hive';
Query OK, 0 rows affected (0.34 sec)

mysql> grant all privileges on hive.* to hive@'%' identified by 'hive';
Query OK, 0 rows affected, 1 warning (0.00 sec)

mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)

mysql> exit
Bye
[root@master apache-hive-2.3.4-bin]# schematool -dbType mysql -initSchema
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/apache-hive-2.3.4-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL:        jdbc:mysql://master/hive?createDatabaseIfNotExist=true
Metastore Connection Driver :    com.mysql.jdbc.Driver
Metastore connection User:       hive
Sun May 26 17:56:13 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
Starting metastore schema initialization to 2.3.0
Initialization script hive-schema-2.3.0.mysql.sql
Sun May 26 17:56:15 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
Initialization script completed
Sun May 26 17:56:19 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
schemaTool completed
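
At this point the metastore tables should exist in the hive database; a quick way to confirm, using the hive/hive credentials created above:

mysql -u hive -phive -h master -e 'use hive; show tables;'
# should list metastore tables such as DBS, TBLS, VERSION, ...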

If the schema initialization command is run again at this point, the following error appears:

ovide truststore for server certificate verification.
Error: Duplicate key name 'PCS_STATS_IDX' (state=42000,code=1061)
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
Underlying cause: java.io.IOException : Schema script failed, errorcode 2
Use --verbose for detailed stacktrace.
*** schemaTool failed ***

25. Start the metastore service. The [root@master ]# prompt does not come back right away, but you can keep typing commands at it.

[root@master ]# hive --service metastore &
[1] 8491
[root@master ]# 2019-05-26 19:07:12: Starting Hive Metastore Server
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/apache-hive-2.3.4-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Sun May 26 19:07:37 CST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
hive    (now start the Hive CLI)
which: no hbase in (/usr/apache-hive-2.3.4-bin/bin:/usr/hadoop/bin:/usr/hadoop/sbin:/usr/java/jdk/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/local/zookeeper-3.4.10/bin:/usr/local/zookeeper-3.4.10/conf:/home/wuxiaoli/.local/bin:/home/wuxiaoli/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/apache-hive-2.3.4-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/usr/apache-hive-2.3.4-bin/lib/hive-common-2.3.4.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive> 
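
To confirm the metastore service is actually listening (it uses port 9083 by default; this assumes net-tools/netstat is installed):

netstat -nltp | grep 9083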

Hive JDBC

1. Configure hive-site.xml

<property>
  <name>hive.server2.thrift.bind.host</name>
  <value>master</value>
  <description>Bind host on which to run the HiveServer2 Thrift service.</description>
</property>
<property>
  <name>hive.zookeeper.quorum</name>
  <value>192.168.79.11:2181,192.168.79.22:2181,192.168.79.33:2181,192.168.79.44:2181,192.168.79.55:2181</value>
  <description>
    List of ZooKeeper servers to talk to. This is needed for:
    1. Read/write locks - when hive.lock.manager is set to
    org.apache.hadoop.hive.ql.lockmgr.zookeeper.ZooKeeperHiveLockManager,
    2. When HiveServer2 supports service discovery via Zookeeper.
    3. For delegation token storage if zookeeper store is used, if
    hive.cluster.delegation.token.store.zookeeper.connectString is not set
    4. LLAP daemon registry service
  </description>
</property>
<property>
  <name>hive.support.concurrency</name>
  <value>true</value>
  <description>
    Whether Hive supports concurrency control or not.
    A ZooKeeper instance must be up and running when using zookeeper Hive lock manager
  </description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
  <description>Username to use against metastore database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
  <description>password to use against metastore database</description>
</property>
<property>
  <name>hive.server2.enable.doAs</name>
  <value>false</value>
  <description>
    Setting this property to true will have HiveServer2 execute
    Hive operations as the user making the calls to it.
  </description>
</property>

2. Start HiveServer2 (usually in the background)

The message appending output to ‘nohup.out’ shows it started; the program's output is written to the nohup.out file in the current directory.

The logs it produces go to the user's folder by default.

[root@master conf]# hive --service hiveserver2    (foreground)

[root@master conf]# nohup hive --service hiveserver2 &
[2] 12320
[root@master conf]# nohup: ignoring input and appending output to ‘nohup.out’
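
Before touching any JDBC code, connectivity to HiveServer2 can be smoke-tested from the server itself with beeline (the hive/hive values mirror hive-site.xml and are largely ignored under the default NONE authentication):

beeline -u jdbc:hive2://master:10000/db1 -n hive -p hive -e 'show tables;'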

3. Check whether hiveserver2 started successfully

[root@master apache-hive-2.3.4-bin]# ps -aux| grep hiveserver2 
root      11602  4.1 14.9 2025560 278108 pts/0  Sl   20:39   0:44 /usr/java/jdk/bin/java -Xmx256m -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hadoop -Dhadoop.id.str=wuxiaoli -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dproc_hiveserver2 -Dlog4j.configurationFile=hive-log4j2.properties -Djava.util.logging.config.file=/usr/apache-hive-2.3.4-bin/conf/parquet-logging.properties -Djline.terminal=jline.UnsupportedTerminal -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/apache-hive-2.3.4-bin/lib/hive-service-2.3.4.jar org.apache.hive.service.server.HiveServer2
root      12759  0.0  0.0 112708   972 pts/0    R+   20:57   0:00 grep --color=auto hiveserver2

4. Create a new Maven project


  • GroupId: edu.qfnu.hadoop
  • ArtifactId: HiveTest

5. Look up the Maven dependency at https://mvnrepository.com/artifact/com.opdar.gulosity

Search for hive-jdbc, which leads to https://mvnrepository.com/artifact/org.apache.hive/hive-jdbc

Choose version 2.3.4 to match the installed Hive version, and wrap the dependency in <dependencies></dependencies>:

<dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.hive/hive-jdbc -->
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-jdbc</artifactId>
            <version>2.3.4</version>
        </dependency>
</dependencies>

6. The code; make sure the db1 database exists first.

package edu.qfnu.hadoop;

import java.sql.*;

public class HiveTest {
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";
    public static void main(String[] args) throws SQLException {
        Connection con = null;
        try{
            // load the Hive JDBC driver
            Class.forName(driverName);
            String url = "jdbc:hive2://192.168.79.11:10000/db1";
            String user = "hive"; // normally an anonymous connection is fine; these values mirror hive-site.xml (the whole Hive setup here was done as root)
            String password = "hive";

            // create the JDBC connection and a Statement object
            System.out.println("--connecting to "+url+"...");
            DriverManager.setLoginTimeout(3000); // login timeout
            con = DriverManager.getConnection(url); // connect; user/password above are not passed, exceptions propagate to the catch block
            Statement stmt = con.createStatement();

            // build the SQL strings that will be used below

            String tableName = "jdbc_test_table";
            String ddl_createTable = "CREATE TABLE db1."+tableName+"("
                    +"ip string,"
                    +"time string,"
                    +"url string,"
                    +"size string) "
                    +"row format delimited " // note the trailing space
                    +"FIELDS terminated by '\t' "
                    +"STORED AS textfile "; // trailing space
            String dataPath = "/root/hiveTestData2.txt";
            // path and name of the data file: /root/hiveTestData2.txt
            System.out.println("dataPath:"+dataPath);
            String ddl_useDB = "use db1";
            String sql_descTable = "DESC "+tableName;  // space after DESC
            String ddl_loadData = "LOAD DATA local inpath '"+dataPath+"' INTO TABLE " + tableName; // note the quotes around the path
            String sql_selectData = "SELECT * FROM "+tableName;  // space after FROM
            String sql_count = "SELECT count(*) FROM "+tableName;
            String sql_insert = "INSERT INTO TABLE " + tableName + " VALUES ('321','123','231','132')"; // note the spaces

            // create the table
            System.out.println("--creating table db1."+tableName+"...");
            stmt.executeUpdate(ddl_createTable);


            // switch to the target database
            System.out.println("--use database:db1");
            stmt.executeUpdate(ddl_useDB);

            // describe the table
            ResultSet rs_descTable = stmt.executeQuery(sql_descTable);
            System.out.println("--table info" + tableName);
            while(rs_descTable.next()){
                System.out.println(rs_descTable.getString(1)+"\t"+rs_descTable.getString(2));
            }

            // load the data (the data file must be on the machine where HiveServer2 runs)
            System.out.println("--load data...");
            stmt.executeUpdate(ddl_loadData);

            // query all rows
            ResultSet rs_select = stmt.executeQuery(sql_selectData);
            System.out.println("--table data:");
            while(rs_select.next()){
                System.out.println(rs_select.getString(1)+";"+rs_select.getString(2)+";"
                        + rs_select.getString(3)+";"+rs_select.getString(4));
            }

            // count the rows (this is slow, since it runs as a job)
            ResultSet rs_count = stmt.executeQuery(sql_count);
            System.out.println("--table data count:");
            while(rs_count.next()){
                System.out.println(rs_count.getString(1));

            }

            // insert a row
            stmt.executeUpdate(sql_insert);
            System.out.println("--insert table...");

            // query all rows again
            rs_select = stmt.executeQuery(sql_selectData);
            System.out.println("--table data");
            while(rs_select.next()){
                System.out.println(rs_select.getString(1)+";"+rs_select.getString(2)+";"
                        + rs_select.getString(3)+";"+rs_select.getString(4));

            }

        } catch (Exception e){
            e.printStackTrace();
            System.exit(1);
        }finally{
            try{
                // close the connection
                con.close();
            }catch (SQLException e){
                e.printStackTrace();
            }
        }

    }
}
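
The program expects /root/hiveTestData2.txt to exist on the HiveServer2 machine, with four tab-separated columns matching (ip, time, url, size). A throwaway sample file can be created like this (the values are just placeholders, not real log data):

printf '192.168.79.1\t2019-06-18\t/index.html\t1024\n'  >  /root/hiveTestData2.txt
printf '192.168.79.2\t2019-06-18\t/login.html\t2048\n'  >> /root/hiveTestData2.txt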

Running it locally from IDEA fails, probably because ZooKeeper is not running:

--connecting to jdbc:hive2://192.168.79.11:1000/db1...
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/lenovo/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/lenovo/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.79.11:1000/db1: java.net.ConnectException: Connection refused: connect
	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:224)
	at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
	at java.sql.DriverManager.getConnection(DriverManager.java:664)
	at java.sql.DriverManager.getConnection(DriverManager.java:270)
	at edu.qfnu.hadoop.HiveTest.main(HiveTest.java:16)
Caused by: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused: connect
	at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:266)
	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
	at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:311)
	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:196)
	... 4 more
Caused by: java.net.ConnectException: Connection refused: connect
	at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
	at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:85)
	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
	at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.net.Socket.connect(Socket.java:589)
	at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
	... 8 more

Process finished with exit code 1
"D:\Program Files\Java\jdk1.8.0_151\bin\java.exe" "-javaagent:D:\Program Files\JetBrains\IntelliJ IDEA 2019.1\lib\idea_rt.jar=65532:D:\Program Files\JetBrains\IntelliJ IDEA 2019.1\bin" -Dfile.encoding=UTF-8 -classpath "D:\Program Files\Java\jdk1.8.0_151\jre\lib\charsets.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\deploy.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\access-bridge-64.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\cldrdata.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\dnsns.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\jaccess.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\jfxrt.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\localedata.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\nashorn.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\sunec.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\sunjce_provider.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\sunmscapi.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\sunpkcs11.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\zipfs.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\javaws.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\jce.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\jfr.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\jfxswt.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\jsse.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\management-agent.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\plugin.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\resources.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\rt.jar;D:\代码\Idea Projects\hivedemo\target\classes;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-jdbc\2.3.4\hive-jdbc-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-common\2.3.4\hive-common-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-storage-api\2.4.0\hive-storage-api-2.4.0.jar;C:\Users\lenovo\.m2\repository\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;C:\Users\lenovo\.m2\repository\commons-lang\commons-lang\2.6\commons-lang-2.6.jar;C:\Users\lenovo\.m2\repository\org\apache\commons\commons-lang3\3.1\commons-lang3-3.1.jar;C:\Users\lenovo\.m2\repository\org\apache\orc\orc-core\1.3.3\orc-core-1.3.3.jar;C:\Users\lenovo\.m2\repository\io\airlift\aircompressor\0.3\aircompressor-0.3.jar;C:\Users\lenovo\.m2\repository\jline\jline\2.12\jline-2.12.jar;C:\Users\lenovo\.m2\repository\org\eclipse\jetty\aggregate\jetty-all\7.6.0.v20120127\jetty-all-7.6.0.v20120127.jar;C:\Users\lenovo\.m2\repository\org\apache\geronimo\specs\geronimo-jta_1.1_spec\1.1.1\geronimo-jta_1.1_spec-1.1.1.jar;C:\Users\lenovo\.m2\repository\javax\mail\mail\1.4.1\mail-1.4.1.jar;C:\Users\lenovo\.m2\repository\javax\activation\activation\1.1\activation-1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\geronimo\specs\geronimo-jaspic_1.0_spec\1.0\geronimo-jaspic_1.0_spec-1.0.jar;C:\Users\lenovo\.m2\repository\org\apache\geronimo\specs\geronimo-annotation_1.0_spec\1.1.1\geronimo-annotation_1.0_spec-1.1.1.jar;C:\Users\lenovo\.m2\repository\asm\asm-commons\3.1\asm-commons-3.1.jar;C:\Users\lenovo\.m2\repository\asm\asm-tree\3.1\asm-tree-3.1.jar;C:\Users\lenovo\.m2\repository\asm\asm\3.1\asm-3.1.jar;C:\Users\lenovo\.m2\repository\org\eclipse\jetty\orbit\javax.servlet\3.0.0.v201112011016\javax.servlet-3.0.0.v201112011016.jar;C:\Users\lenovo\.m2\repository\joda-time\joda-time\2.8.1\joda-time-2.8.1.jar;C:\Users\lenovo\.m2\repository\org\apache\logging\log4j\log4j-1.2-api\2.6.2\log4j-1.2-api-2.6.2.jar;C:\Users\lenovo\.m2\repository\org\apache\logging\log4j\log4j-api\2.6.2\log4j-api-2.6.2.jar;C:\Users
\lenovo\.m2\repository\org\apache\logging\log4j\log4j-core\2.6.2\log4j-core-2.6.2.jar;C:\Users\lenovo\.m2\repository\org\apache\logging\log4j\log4j-web\2.6.2\log4j-web-2.6.2.jar;C:\Users\lenovo\.m2\repository\org\apache\logging\log4j\log4j-slf4j-impl\2.6.2\log4j-slf4j-impl-2.6.2.jar;C:\Users\lenovo\.m2\repository\org\apache\commons\commons-compress\1.9\commons-compress-1.9.jar;C:\Users\lenovo\.m2\repository\org\apache\ant\ant\1.9.1\ant-1.9.1.jar;C:\Users\lenovo\.m2\repository\org\apache\ant\ant-launcher\1.9.1\ant-launcher-1.9.1.jar;C:\Users\lenovo\.m2\repository\com\tdunning\json\1.8\json-1.8.jar;C:\Users\lenovo\.m2\repository\io\dropwizard\metrics\metrics-core\3.1.0\metrics-core-3.1.0.jar;C:\Users\lenovo\.m2\repository\io\dropwizard\metrics\metrics-jvm\3.1.0\metrics-jvm-3.1.0.jar;C:\Users\lenovo\.m2\repository\io\dropwizard\metrics\metrics-json\3.1.0\metrics-json-3.1.0.jar;C:\Users\lenovo\.m2\repository\com\fasterxml\jackson\core\jackson-databind\2.6.5\jackson-databind-2.6.5.jar;C:\Users\lenovo\.m2\repository\com\fasterxml\jackson\core\jackson-annotations\2.6.0\jackson-annotations-2.6.0.jar;C:\Users\lenovo\.m2\repository\com\fasterxml\jackson\core\jackson-core\2.6.5\jackson-core-2.6.5.jar;C:\Users\lenovo\.m2\repository\com\github\joshelser\dropwizard-metrics-hadoop-metrics2-reporter\0.1.2\dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-common\2.6.0\hadoop-common-2.6.0.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-annotations\2.6.0\hadoop-annotations-2.6.0.jar;C:\Users\lenovo\.m2\repository\org\apache\commons\commons-math3\3.1.1\commons-math3-3.1.1.jar;C:\Users\lenovo\.m2\repository\xmlenc\xmlenc\0.52\xmlenc-0.52.jar;C:\Users\lenovo\.m2\repository\commons-httpclient\commons-httpclient\3.1\commons-httpclient-3.1.jar;C:\Users\lenovo\.m2\repository\commons-net\commons-net\3.1\commons-net-3.1.jar;C:\Users\lenovo\.m2\repository\commons-collections\commons-collections\3.2.1\commons-collections-3.2.1.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\jetty\6.1.26\jetty-6.1.26.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\jetty-util\6.1.26\jetty-util-6.1.26.jar;C:\Users\lenovo\.m2\repository\com\sun\jersey\jersey-core\1.9\jersey-core-1.9.jar;C:\Users\lenovo\.m2\repository\com\sun\jersey\jersey-json\1.9\jersey-json-1.9.jar;C:\Users\lenovo\.m2\repository\com\sun\xml\bind\jaxb-impl\2.2.3-1\jaxb-impl-2.2.3-1.jar;C:\Users\lenovo\.m2\repository\com\sun\jersey\jersey-server\1.9\jersey-server-1.9.jar;C:\Users\lenovo\.m2\repository\javax\servlet\jsp\jsp-api\2.1\jsp-api-2.1.jar;C:\Users\lenovo\.m2\repository\net\java\dev\jets3t\jets3t\0.9.0\jets3t-0.9.0.jar;C:\Users\lenovo\.m2\repository\com\jamesmurty\utils\java-xmlbuilder\0.4\java-xmlbuilder-0.4.jar;C:\Users\lenovo\.m2\repository\commons-configuration\commons-configuration\1.6\commons-configuration-1.6.jar;C:\Users\lenovo\.m2\repository\commons-digester\commons-digester\1.8\commons-digester-1.8.jar;C:\Users\lenovo\.m2\repository\commons-beanutils\commons-beanutils\1.7.0\commons-beanutils-1.7.0.jar;C:\Users\lenovo\.m2\repository\commons-beanutils\commons-beanutils-core\1.8.0\commons-beanutils-core-1.8.0.jar;C:\Users\lenovo\.m2\repository\com\jcraft\jsch\0.1.42\jsch-0.1.42.jar;C:\Users\lenovo\.m2\repository\org\htrace\htrace-core\3.0.4\htrace-core-3.0.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-service\2.3.4\hive-service-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-llap-server\2.3.4\hive-llap-server-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\
apache\hive\hive-llap-common\2.3.4\hive-llap-common-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-llap-client\2.3.4\hive-llap-client-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-llap-tez\2.3.4\hive-llap-tez-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\slider\slider-core\0.90.2-incubating\slider-core-0.90.2-incubating.jar;C:\Users\lenovo\.m2\repository\com\beust\jcommander\1.30\jcommander-1.30.jar;C:\Users\lenovo\.m2\repository\org\codehaus\jackson\jackson-jaxrs\1.9.13\jackson-jaxrs-1.9.13.jar;C:\Users\lenovo\.m2\repository\org\codehaus\jackson\jackson-xc\1.9.13\jackson-xc-1.9.13.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-client\2.7.1\hadoop-client-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-app\2.7.1\hadoop-mapreduce-client-app-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-common\2.7.1\hadoop-mapreduce-client-common-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-shuffle\2.7.1\hadoop-mapreduce-client-shuffle-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-jobclient\2.7.1\hadoop-mapreduce-client-jobclient-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-hdfs\2.7.1\hadoop-hdfs-2.7.1.jar;C:\Users\lenovo\.m2\repository\commons-daemon\commons-daemon\1.0.13\commons-daemon-1.0.13.jar;C:\Users\lenovo\.m2\repository\xerces\xercesImpl\2.9.1\xercesImpl-2.9.1.jar;C:\Users\lenovo\.m2\repository\xml-apis\xml-apis\1.3.04\xml-apis-1.3.04.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-registry\2.7.1\hadoop-yarn-registry-2.7.1.jar;C:\Users\lenovo\.m2\repository\com\sun\jersey\jersey-client\1.9\jersey-client-1.9.jar;C:\Users\lenovo\.m2\repository\com\google\inject\extensions\guice-servlet\3.0\guice-servlet-3.0.jar;C:\Users\lenovo\.m2\repository\org\codehaus\jettison\jettison\1.1\jettison-1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-llap-common\2.3.4\hive-llap-common-2.3.4-tests.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-hadoop2-compat\1.1.1\hbase-hadoop2-compat-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\commons\commons-math\2.2\commons-math-2.2.jar;C:\Users\lenovo\.m2\repository\com\yammer\metrics\metrics-core\2.2.0\metrics-core-2.2.0.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-server\1.1.1\hbase-server-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-procedure\1.1.1\hbase-procedure-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-common\1.1.1\hbase-common-1.1.1-tests.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-prefix-tree\1.1.1\hbase-prefix-tree-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\jetty-sslengine\6.1.26\jetty-sslengine-6.1.26.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\jsp-2.1\6.1.14\jsp-2.1-6.1.14.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\jsp-api-2.1\6.1.14\jsp-api-2.1-6.1.14.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\servlet-api-2.5\6.1.14\servlet-api-2.5-6.1.14.jar;C:\Users\lenovo\.m2\repository\com\lmax\disruptor\3.3.0\disruptor-3.3.0.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-common\1.1.1\hbase-common-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-hadoop-compat\1.1.1\hbase-hadoop-compat-1.1.1.jar;C:\Users\lenovo\.m2\repository\commons-codec\commons-codec\1.4\commons-codec-1.4.jar;C:\Users\lenovo\.m2\repository\net\sf\jpam\jpam\1.1\jpam-1.1.jar;C:\Users\lenovo\.m2\repository\tomcat\jasper-com
piler\5.5.23\jasper-compiler-5.5.23.jar;C:\Users\lenovo\.m2\repository\javax\servlet\jsp-api\2.0\jsp-api-2.0.jar;C:\Users\lenovo\.m2\repository\ant\ant\1.6.5\ant-1.6.5.jar;C:\Users\lenovo\.m2\repository\tomcat\jasper-runtime\5.5.23\jasper-runtime-5.5.23.jar;C:\Users\lenovo\.m2\repository\javax\servlet\servlet-api\2.4\servlet-api-2.4.jar;C:\Users\lenovo\.m2\repository\commons-el\commons-el\1.0\commons-el-1.0.jar;C:\Users\lenovo\.m2\repository\org\apache\thrift\libfb303\0.9.3\libfb303-0.9.3.jar;C:\Users\lenovo\.m2\repository\org\apache\curator\curator-recipes\2.7.1\curator-recipes-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\jamon\jamon-runtime\2.3.1\jamon-runtime-2.3.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-serde\2.3.4\hive-serde-2.3.4.jar;C:\Users\lenovo\.m2\repository\com\google\code\findbugs\jsr305\3.0.0\jsr305-3.0.0.jar;C:\Users\lenovo\.m2\repository\org\apache\avro\avro\1.7.7\avro-1.7.7.jar;C:\Users\lenovo\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;C:\Users\lenovo\.m2\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;C:\Users\lenovo\.m2\repository\com\thoughtworks\paranamer\paranamer\2.3\paranamer-2.3.jar;C:\Users\lenovo\.m2\repository\org\xerial\snappy\snappy-java\1.0.5\snappy-java-1.0.5.jar;C:\Users\lenovo\.m2\repository\net\sf\opencsv\opencsv\2.3\opencsv-2.3.jar;C:\Users\lenovo\.m2\repository\org\apache\parquet\parquet-hadoop-bundle\1.8.1\parquet-hadoop-bundle-1.8.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-metastore\2.3.4\hive-metastore-2.3.4.jar;C:\Users\lenovo\.m2\repository\javolution\javolution\5.5.1\javolution-5.5.1.jar;C:\Users\lenovo\.m2\repository\com\google\guava\guava\14.0.1\guava-14.0.1.jar;C:\Users\lenovo\.m2\repository\com\google\protobuf\protobuf-java\2.5.0\protobuf-java-2.5.0.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-client\1.1.1\hbase-client-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-annotations\1.1.1\hbase-annotations-1.1.1.jar;D:\Program 
Files\Java\jdk1.8.0_151\lib\tools.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-protocol\1.1.1\hbase-protocol-1.1.1.jar;C:\Users\lenovo\.m2\repository\commons-io\commons-io\2.4\commons-io-2.4.jar;C:\Users\lenovo\.m2\repository\io\netty\netty-all\4.0.23.Final\netty-all-4.0.23.Final.jar;C:\Users\lenovo\.m2\repository\org\apache\htrace\htrace-core\3.1.0-incubating\htrace-core-3.1.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\jruby\jcodings\jcodings\1.0.8\jcodings-1.0.8.jar;C:\Users\lenovo\.m2\repository\org\jruby\joni\joni\2.1.2\joni-2.1.2.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-auth\2.5.1\hadoop-auth-2.5.1.jar;C:\Users\lenovo\.m2\repository\org\apache\directory\server\apacheds-kerberos-codec\2.0.0-M15\apacheds-kerberos-codec-2.0.0-M15.jar;C:\Users\lenovo\.m2\repository\org\apache\directory\server\apacheds-i18n\2.0.0-M15\apacheds-i18n-2.0.0-M15.jar;C:\Users\lenovo\.m2\repository\org\apache\directory\api\api-asn1-api\1.0.0-M20\api-asn1-api-1.0.0-M20.jar;C:\Users\lenovo\.m2\repository\org\apache\directory\api\api-util\1.0.0-M20\api-util-1.0.0-M20.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-core\2.5.1\hadoop-mapreduce-client-core-2.5.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-common\2.5.1\hadoop-yarn-common-2.5.1.jar;C:\Users\lenovo\.m2\repository\com\github\stephenc\findbugs\findbugs-annotations\1.3.9-1\findbugs-annotations-1.3.9-1.jar;C:\Users\lenovo\.m2\repository\junit\junit\4.11\junit-4.11.jar;C:\Users\lenovo\.m2\repository\org\hamcrest\hamcrest-core\1.3\hamcrest-core-1.3.jar;C:\Users\lenovo\.m2\repository\com\jolbox\bonecp\0.8.0.RELEASE\bonecp-0.8.0.RELEASE.jar;C:\Users\lenovo\.m2\repository\com\zaxxer\HikariCP\2.5.1\HikariCP-2.5.1.jar;C:\Users\lenovo\.m2\repository\org\apache\derby\derby\10.10.2.0\derby-10.10.2.0.jar;C:\Users\lenovo\.m2\repository\org\datanucleus\datanucleus-api-jdo\4.2.4\datanucleus-api-jdo-4.2.4.jar;C:\Users\lenovo\.m2\repository\org\datanucleus\datanucleus-core\4.1.17\datanucleus-core-4.1.17.jar;C:\Users\lenovo\.m2\repository\org\datanucleus\datanucleus-rdbms\4.1.19\datanucleus-rdbms-4.1.19.jar;C:\Users\lenovo\.m2\repository\commons-pool\commons-pool\1.5.4\commons-pool-1.5.4.jar;C:\Users\lenovo\.m2\repository\commons-dbcp\commons-dbcp\1.4\commons-dbcp-1.4.jar;C:\Users\lenovo\.m2\repository\javax\jdo\jdo-api\3.0.1\jdo-api-3.0.1.jar;C:\Users\lenovo\.m2\repository\javax\transaction\jta\1.1\jta-1.1.jar;C:\Users\lenovo\.m2\repository\org\datanucleus\javax.jdo\3.2.0-m3\javax.jdo-3.2.0-m3.jar;C:\Users\lenovo\.m2\repository\javax\transaction\transaction-api\1.1\transaction-api-1.1.jar;C:\Users\lenovo\.m2\repository\org\antlr\antlr-runtime\3.5.2\antlr-runtime-3.5.2.jar;C:\Users\lenovo\.m2\repository\co\cask\tephra\tephra-api\0.6.0\tephra-api-0.6.0.jar;C:\Users\lenovo\.m2\repository\co\cask\tephra\tephra-core\0.6.0\tephra-core-0.6.0.jar;C:\Users\lenovo\.m2\repository\com\google\code\gson\gson\2.2.4\gson-2.2.4.jar;C:\Users\lenovo\.m2\repository\com\google\inject\guice\3.0\guice-3.0.jar;C:\Users\lenovo\.m2\repository\javax\inject\javax.inject\1\javax.inject-1.jar;C:\Users\lenovo\.m2\repository\aopalliance\aopalliance\1.0\aopalliance-1.0.jar;C:\Users\lenovo\.m2\repository\com\google\inject\extensions\guice-assistedinject\3.0\guice-assistedinject-3.0.jar;C:\Users\lenovo\.m2\repository\it\unimi\dsi\fastutil\6.5.6\fastutil-6.5.6.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\twill-common\0.6.0-incubating\twill-common-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\tw
ill-core\0.6.0-incubating\twill-core-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\twill-api\0.6.0-incubating\twill-api-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\twill-discovery-api\0.6.0-incubating\twill-discovery-api-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\twill-discovery-core\0.6.0-incubating\twill-discovery-core-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\twill-zookeeper\0.6.0-incubating\twill-zookeeper-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\co\cask\tephra\tephra-hbase-compat-1.0\0.6.0\tephra-hbase-compat-1.0-0.6.0.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-shims\2.3.4\hive-shims-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\shims\hive-shims-common\2.3.4\hive-shims-common-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\shims\hive-shims-0.23\2.3.4\hive-shims-0.23-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-server-resourcemanager\2.7.2\hadoop-yarn-server-resourcemanager-2.7.2.jar;C:\Users\lenovo\.m2\repository\com\sun\jersey\contribs\jersey-guice\1.9\jersey-guice-1.9.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-api\2.7.2\hadoop-yarn-api-2.7.2.jar;C:\Users\lenovo\.m2\repository\javax\xml\bind\jaxb-api\2.2.2\jaxb-api-2.2.2.jar;C:\Users\lenovo\.m2\repository\javax\xml\stream\stax-api\1.0-2\stax-api-1.0-2.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-server-common\2.7.2\hadoop-yarn-server-common-2.7.2.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-server-applicationhistoryservice\2.7.2\hadoop-yarn-server-applicationhistoryservice-2.7.2.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-server-web-proxy\2.7.2\hadoop-yarn-server-web-proxy-2.7.2.jar;C:\Users\lenovo\.m2\repository\org\fusesource\leveldbjni\leveldbjni-all\1.8\leveldbjni-all-1.8.jar;C:\Users\lenovo\.m2\repository\org\apache\zookeeper\zookeeper\3.4.6\zookeeper-3.4.6-tests.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\shims\hive-shims-scheduler\2.3.4\hive-shims-scheduler-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-service-rpc\2.3.4\hive-service-rpc-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\httpcomponents\httpclient\4.4\httpclient-4.4.jar;C:\Users\lenovo\.m2\repository\commons-logging\commons-logging\1.2\commons-logging-1.2.jar;C:\Users\lenovo\.m2\repository\org\apache\httpcomponents\httpcore\4.4\httpcore-4.4.jar;C:\Users\lenovo\.m2\repository\org\apache\thrift\libthrift\0.9.3\libthrift-0.9.3.jar;C:\Users\lenovo\.m2\repository\org\apache\zookeeper\zookeeper\3.4.6\zookeeper-3.4.6.jar;C:\Users\lenovo\.m2\repository\org\slf4j\slf4j-log4j12\1.6.1\slf4j-log4j12-1.6.1.jar;C:\Users\lenovo\.m2\repository\log4j\log4j\1.2.16\log4j-1.2.16.jar;C:\Users\lenovo\.m2\repository\io\netty\netty\3.7.0.Final\netty-3.7.0.Final.jar;C:\Users\lenovo\.m2\repository\org\apache\curator\curator-framework\2.7.1\curator-framework-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\curator\curator-client\2.7.1\curator-client-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\slf4j\slf4j-api\1.7.10\slf4j-api-1.7.10.jar" edu.qfnu.hadoop.HiveTest
--connecting to jdbc:hive2://192.168.79.11:10000/db1...
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/lenovo/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/lenovo/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
dataPath:/root/hiveTestData2.txt
--creating table db1.jdbc_test_table...


Execution stops at this point and goes no further; opening hive inside the VM and typing commands also produces no output. Check hive.log:

[root@master root]# find / -name hive.log
/tmp/root/hive.log
# open the file and look at the recent entries
2019-06-18T02:16:52,060  INFO [HiveServer2-Background-Pool: Thread-44-SendThread(192.168.79.44:2181)] zookeeper.ClientCnxn: Opening socket connection to server 192.168.79.11/192.168.79.11:2181. Will not attempt to authenticate using SASL (unknown error)
2019-06-18T02:16:52,060  WARN [HiveServer2-Background-Pool: Thread-44-SendThread(192.168.79.44:2181)] zookeeper.ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:1.8.0_201]
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[?:1.8.0_201]
        at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361) ~[zookeeper-3.4.6.jar:3.4.6-1569965]
        at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1081) ~[zookeeper-3.4.6.jar:3.4.6-1569965]

It turns out ZooKeeper is not starting properly:

[root@master root]# zkServer.sh start
ZooKeeper JMX enabled by default
Using config: /usr/local/zookeeper-3.4.10/bin/../conf/zoo.cfg
Starting zookeeper ... STARTED
[root@master root]# zkServer.sh status
ZooKeeper JMX enabled by default
Using config: /usr/local/zookeeper-3.4.10/bin/../conf/zoo.cfg
Error contacting service. It is probably not running.
[root@master root]#

ZooKeeper log:

2019-06-18 03:00:35,789 [myid:1] - INFO  [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:NIOServerCnxnFactory@192] - Accepted socket connection from /192.168.79.11:53422
2019-06-18 03:00:35,789 [myid:1] - WARN  [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:NIOServerCnxn@373] - Exception causing close of session 0x0 due to java.io.IOException: ZooKeeperServer not running
2019-06-18 03:00:35,789 [myid:1] - INFO  [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:NIOServerCnxn@1044] - Closed socket connection for client /192.168.79.11:53422 (no session established for client)
2019-06-18 03:00:39,814 [myid:1] - INFO  [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:NIOServerCnxnFactory@192] - Accepted socket connection from /192.168.79.11:53432
2019-06-18 03:00:39,814 [myid:1] - WARN  [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:NIOServerCnxn@373] - Exception causing close of session 0x0 due to java.io.IOException: ZooKeeperServer not running
2019-06-18 03:03:05,008 [myid:1] - WARN  [QuorumPeer[myid=1]/0:0:0:0:0:0:0:0:2181:QuorumCnxManager@588] - Cannot open channel to 5 at election address /192.168.79.55:3888
java.net.ConnectException: Connection refused (Connection refused)
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:589)
        at org.apache.zookeeper.server.quorum.QuorumCnxManager.connectOne(QuorumCnxManager.java:562)
        at org.apache.zookeeper.server.quorum.QuorumCnxManager.connectAll(QuorumCnxManager.java:614)
        at org.apache.zookeeper.server.quorum.FastLeaderElection.lookForLeader(FastLeaderElection.java:843)
        at org.apache.zookeeper.server.quorum.QuorumPeer.run(QuorumPeer.java:913)
2019-06-18 03:03:05,008 [myid:1] - INFO  [QuorumPeer[myid=1]/0:0:0:0:0:0:0:0:2181:QuorumPeer$QuorumServer@167] - Resolved hostname: 192.168.79.55 to address: /192.168.79.55
2019-06-18 03:03:06,887 [myid:1] - WARN  [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:NIOServerCnxn@373] - Exception causing close of session 0x0 due to java.io.IOException: ZooKeeperServer not running
2019-06-18 03:03:06,887 [myid:1] - INFO  [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:NIOServerCnxn@1044] - Closed socket connection for client /192.168.79.11:53820 (no session established for client)

 

Before running the client, ZooKeeper must be started, and it must be started on every node of the quorum: the log above shows that node 1 (myid=1) cannot reach node 5 at 192.168.79.55:3888, so no leader can be elected and the server keeps reporting "ZooKeeperServer not running". Once zkServer.sh start has been run on all the ZooKeeper nodes, the job goes through; an optional pre-flight check is sketched below.
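As a quick pre-flight check (my own sketch, not part of the original walkthrough), ZooKeeper's four-letter "stat" command can be sent to the client port from Java. A node that has joined a working quorum reports its version and Mode (leader/follower), while a node outside a quorum typically answers that it is not currently serving requests. The host address is taken from this walkthrough.

// Sketch: send ZooKeeper's "stat" four-letter command to port 2181 and print the reply.
import java.io.InputStream;
import java.net.Socket;

public class ZkStatCheck {
    public static void main(String[] args) throws Exception {
        String host = args.length > 0 ? args[0] : "192.168.79.11"; // a ZooKeeper node
        try (Socket socket = new Socket(host, 2181)) {
            socket.getOutputStream().write("stat".getBytes());
            socket.getOutputStream().flush();
            socket.shutdownOutput();
            InputStream in = socket.getInputStream();
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {          // server closes the socket after replying
                System.out.print(new String(buf, 0, n));
            }
        }
    }
}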

Run result:

"D:\Program Files\Java\jdk1.8.0_151\bin\java.exe" "-javaagent:D:\Program Files\JetBrains\IntelliJ IDEA 2019.1\lib\idea_rt.jar=55574:D:\Program Files\JetBrains\IntelliJ IDEA 2019.1\bin" -Dfile.encoding=UTF-8 -classpath "D:\Program Files\Java\jdk1.8.0_151\jre\lib\charsets.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\deploy.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\access-bridge-64.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\cldrdata.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\dnsns.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\jaccess.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\jfxrt.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\localedata.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\nashorn.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\sunec.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\sunjce_provider.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\sunmscapi.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\sunpkcs11.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\ext\zipfs.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\javaws.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\jce.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\jfr.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\jfxswt.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\jsse.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\management-agent.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\plugin.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\resources.jar;D:\Program Files\Java\jdk1.8.0_151\jre\lib\rt.jar;D:\代码\Idea Projects\hivedemo\target\classes;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-jdbc\2.3.4\hive-jdbc-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-common\2.3.4\hive-common-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-storage-api\2.4.0\hive-storage-api-2.4.0.jar;C:\Users\lenovo\.m2\repository\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;C:\Users\lenovo\.m2\repository\commons-lang\commons-lang\2.6\commons-lang-2.6.jar;C:\Users\lenovo\.m2\repository\org\apache\commons\commons-lang3\3.1\commons-lang3-3.1.jar;C:\Users\lenovo\.m2\repository\org\apache\orc\orc-core\1.3.3\orc-core-1.3.3.jar;C:\Users\lenovo\.m2\repository\io\airlift\aircompressor\0.3\aircompressor-0.3.jar;C:\Users\lenovo\.m2\repository\jline\jline\2.12\jline-2.12.jar;C:\Users\lenovo\.m2\repository\org\eclipse\jetty\aggregate\jetty-all\7.6.0.v20120127\jetty-all-7.6.0.v20120127.jar;C:\Users\lenovo\.m2\repository\org\apache\geronimo\specs\geronimo-jta_1.1_spec\1.1.1\geronimo-jta_1.1_spec-1.1.1.jar;C:\Users\lenovo\.m2\repository\javax\mail\mail\1.4.1\mail-1.4.1.jar;C:\Users\lenovo\.m2\repository\javax\activation\activation\1.1\activation-1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\geronimo\specs\geronimo-jaspic_1.0_spec\1.0\geronimo-jaspic_1.0_spec-1.0.jar;C:\Users\lenovo\.m2\repository\org\apache\geronimo\specs\geronimo-annotation_1.0_spec\1.1.1\geronimo-annotation_1.0_spec-1.1.1.jar;C:\Users\lenovo\.m2\repository\asm\asm-commons\3.1\asm-commons-3.1.jar;C:\Users\lenovo\.m2\repository\asm\asm-tree\3.1\asm-tree-3.1.jar;C:\Users\lenovo\.m2\repository\asm\asm\3.1\asm-3.1.jar;C:\Users\lenovo\.m2\repository\org\eclipse\jetty\orbit\javax.servlet\3.0.0.v201112011016\javax.servlet-3.0.0.v201112011016.jar;C:\Users\lenovo\.m2\repository\joda-time\joda-time\2.8.1\joda-time-2.8.1.jar;C:\Users\lenovo\.m2\repository\org\apache\logging\log4j\log4j-1.2-api\2.6.2\log4j-1.2-api-2.6.2.jar;C:\Users\lenovo\.m2\repository\org\apache\logging\log4j\log4j-api\2.6.2\log4j-api-2.6.2.jar;C:\Users
\lenovo\.m2\repository\org\apache\logging\log4j\log4j-core\2.6.2\log4j-core-2.6.2.jar;C:\Users\lenovo\.m2\repository\org\apache\logging\log4j\log4j-web\2.6.2\log4j-web-2.6.2.jar;C:\Users\lenovo\.m2\repository\org\apache\logging\log4j\log4j-slf4j-impl\2.6.2\log4j-slf4j-impl-2.6.2.jar;C:\Users\lenovo\.m2\repository\org\apache\commons\commons-compress\1.9\commons-compress-1.9.jar;C:\Users\lenovo\.m2\repository\org\apache\ant\ant\1.9.1\ant-1.9.1.jar;C:\Users\lenovo\.m2\repository\org\apache\ant\ant-launcher\1.9.1\ant-launcher-1.9.1.jar;C:\Users\lenovo\.m2\repository\com\tdunning\json\1.8\json-1.8.jar;C:\Users\lenovo\.m2\repository\io\dropwizard\metrics\metrics-core\3.1.0\metrics-core-3.1.0.jar;C:\Users\lenovo\.m2\repository\io\dropwizard\metrics\metrics-jvm\3.1.0\metrics-jvm-3.1.0.jar;C:\Users\lenovo\.m2\repository\io\dropwizard\metrics\metrics-json\3.1.0\metrics-json-3.1.0.jar;C:\Users\lenovo\.m2\repository\com\fasterxml\jackson\core\jackson-databind\2.6.5\jackson-databind-2.6.5.jar;C:\Users\lenovo\.m2\repository\com\fasterxml\jackson\core\jackson-annotations\2.6.0\jackson-annotations-2.6.0.jar;C:\Users\lenovo\.m2\repository\com\fasterxml\jackson\core\jackson-core\2.6.5\jackson-core-2.6.5.jar;C:\Users\lenovo\.m2\repository\com\github\joshelser\dropwizard-metrics-hadoop-metrics2-reporter\0.1.2\dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-common\2.6.0\hadoop-common-2.6.0.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-annotations\2.6.0\hadoop-annotations-2.6.0.jar;C:\Users\lenovo\.m2\repository\org\apache\commons\commons-math3\3.1.1\commons-math3-3.1.1.jar;C:\Users\lenovo\.m2\repository\xmlenc\xmlenc\0.52\xmlenc-0.52.jar;C:\Users\lenovo\.m2\repository\commons-httpclient\commons-httpclient\3.1\commons-httpclient-3.1.jar;C:\Users\lenovo\.m2\repository\commons-net\commons-net\3.1\commons-net-3.1.jar;C:\Users\lenovo\.m2\repository\commons-collections\commons-collections\3.2.1\commons-collections-3.2.1.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\jetty\6.1.26\jetty-6.1.26.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\jetty-util\6.1.26\jetty-util-6.1.26.jar;C:\Users\lenovo\.m2\repository\com\sun\jersey\jersey-core\1.9\jersey-core-1.9.jar;C:\Users\lenovo\.m2\repository\com\sun\jersey\jersey-json\1.9\jersey-json-1.9.jar;C:\Users\lenovo\.m2\repository\com\sun\xml\bind\jaxb-impl\2.2.3-1\jaxb-impl-2.2.3-1.jar;C:\Users\lenovo\.m2\repository\com\sun\jersey\jersey-server\1.9\jersey-server-1.9.jar;C:\Users\lenovo\.m2\repository\javax\servlet\jsp\jsp-api\2.1\jsp-api-2.1.jar;C:\Users\lenovo\.m2\repository\net\java\dev\jets3t\jets3t\0.9.0\jets3t-0.9.0.jar;C:\Users\lenovo\.m2\repository\com\jamesmurty\utils\java-xmlbuilder\0.4\java-xmlbuilder-0.4.jar;C:\Users\lenovo\.m2\repository\commons-configuration\commons-configuration\1.6\commons-configuration-1.6.jar;C:\Users\lenovo\.m2\repository\commons-digester\commons-digester\1.8\commons-digester-1.8.jar;C:\Users\lenovo\.m2\repository\commons-beanutils\commons-beanutils\1.7.0\commons-beanutils-1.7.0.jar;C:\Users\lenovo\.m2\repository\commons-beanutils\commons-beanutils-core\1.8.0\commons-beanutils-core-1.8.0.jar;C:\Users\lenovo\.m2\repository\com\jcraft\jsch\0.1.42\jsch-0.1.42.jar;C:\Users\lenovo\.m2\repository\org\htrace\htrace-core\3.0.4\htrace-core-3.0.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-service\2.3.4\hive-service-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-llap-server\2.3.4\hive-llap-server-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\
apache\hive\hive-llap-common\2.3.4\hive-llap-common-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-llap-client\2.3.4\hive-llap-client-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-llap-tez\2.3.4\hive-llap-tez-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\slider\slider-core\0.90.2-incubating\slider-core-0.90.2-incubating.jar;C:\Users\lenovo\.m2\repository\com\beust\jcommander\1.30\jcommander-1.30.jar;C:\Users\lenovo\.m2\repository\org\codehaus\jackson\jackson-jaxrs\1.9.13\jackson-jaxrs-1.9.13.jar;C:\Users\lenovo\.m2\repository\org\codehaus\jackson\jackson-xc\1.9.13\jackson-xc-1.9.13.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-client\2.7.1\hadoop-client-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-app\2.7.1\hadoop-mapreduce-client-app-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-common\2.7.1\hadoop-mapreduce-client-common-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-shuffle\2.7.1\hadoop-mapreduce-client-shuffle-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-jobclient\2.7.1\hadoop-mapreduce-client-jobclient-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-hdfs\2.7.1\hadoop-hdfs-2.7.1.jar;C:\Users\lenovo\.m2\repository\commons-daemon\commons-daemon\1.0.13\commons-daemon-1.0.13.jar;C:\Users\lenovo\.m2\repository\xerces\xercesImpl\2.9.1\xercesImpl-2.9.1.jar;C:\Users\lenovo\.m2\repository\xml-apis\xml-apis\1.3.04\xml-apis-1.3.04.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-registry\2.7.1\hadoop-yarn-registry-2.7.1.jar;C:\Users\lenovo\.m2\repository\com\sun\jersey\jersey-client\1.9\jersey-client-1.9.jar;C:\Users\lenovo\.m2\repository\com\google\inject\extensions\guice-servlet\3.0\guice-servlet-3.0.jar;C:\Users\lenovo\.m2\repository\org\codehaus\jettison\jettison\1.1\jettison-1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-llap-common\2.3.4\hive-llap-common-2.3.4-tests.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-hadoop2-compat\1.1.1\hbase-hadoop2-compat-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\commons\commons-math\2.2\commons-math-2.2.jar;C:\Users\lenovo\.m2\repository\com\yammer\metrics\metrics-core\2.2.0\metrics-core-2.2.0.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-server\1.1.1\hbase-server-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-procedure\1.1.1\hbase-procedure-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-common\1.1.1\hbase-common-1.1.1-tests.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-prefix-tree\1.1.1\hbase-prefix-tree-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\jetty-sslengine\6.1.26\jetty-sslengine-6.1.26.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\jsp-2.1\6.1.14\jsp-2.1-6.1.14.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\jsp-api-2.1\6.1.14\jsp-api-2.1-6.1.14.jar;C:\Users\lenovo\.m2\repository\org\mortbay\jetty\servlet-api-2.5\6.1.14\servlet-api-2.5-6.1.14.jar;C:\Users\lenovo\.m2\repository\com\lmax\disruptor\3.3.0\disruptor-3.3.0.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-common\1.1.1\hbase-common-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-hadoop-compat\1.1.1\hbase-hadoop-compat-1.1.1.jar;C:\Users\lenovo\.m2\repository\commons-codec\commons-codec\1.4\commons-codec-1.4.jar;C:\Users\lenovo\.m2\repository\net\sf\jpam\jpam\1.1\jpam-1.1.jar;C:\Users\lenovo\.m2\repository\tomcat\jasper-com
piler\5.5.23\jasper-compiler-5.5.23.jar;C:\Users\lenovo\.m2\repository\javax\servlet\jsp-api\2.0\jsp-api-2.0.jar;C:\Users\lenovo\.m2\repository\ant\ant\1.6.5\ant-1.6.5.jar;C:\Users\lenovo\.m2\repository\tomcat\jasper-runtime\5.5.23\jasper-runtime-5.5.23.jar;C:\Users\lenovo\.m2\repository\javax\servlet\servlet-api\2.4\servlet-api-2.4.jar;C:\Users\lenovo\.m2\repository\commons-el\commons-el\1.0\commons-el-1.0.jar;C:\Users\lenovo\.m2\repository\org\apache\thrift\libfb303\0.9.3\libfb303-0.9.3.jar;C:\Users\lenovo\.m2\repository\org\apache\curator\curator-recipes\2.7.1\curator-recipes-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\jamon\jamon-runtime\2.3.1\jamon-runtime-2.3.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-serde\2.3.4\hive-serde-2.3.4.jar;C:\Users\lenovo\.m2\repository\com\google\code\findbugs\jsr305\3.0.0\jsr305-3.0.0.jar;C:\Users\lenovo\.m2\repository\org\apache\avro\avro\1.7.7\avro-1.7.7.jar;C:\Users\lenovo\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;C:\Users\lenovo\.m2\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;C:\Users\lenovo\.m2\repository\com\thoughtworks\paranamer\paranamer\2.3\paranamer-2.3.jar;C:\Users\lenovo\.m2\repository\org\xerial\snappy\snappy-java\1.0.5\snappy-java-1.0.5.jar;C:\Users\lenovo\.m2\repository\net\sf\opencsv\opencsv\2.3\opencsv-2.3.jar;C:\Users\lenovo\.m2\repository\org\apache\parquet\parquet-hadoop-bundle\1.8.1\parquet-hadoop-bundle-1.8.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-metastore\2.3.4\hive-metastore-2.3.4.jar;C:\Users\lenovo\.m2\repository\javolution\javolution\5.5.1\javolution-5.5.1.jar;C:\Users\lenovo\.m2\repository\com\google\guava\guava\14.0.1\guava-14.0.1.jar;C:\Users\lenovo\.m2\repository\com\google\protobuf\protobuf-java\2.5.0\protobuf-java-2.5.0.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-client\1.1.1\hbase-client-1.1.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-annotations\1.1.1\hbase-annotations-1.1.1.jar;D:\Program 
Files\Java\jdk1.8.0_151\lib\tools.jar;C:\Users\lenovo\.m2\repository\org\apache\hbase\hbase-protocol\1.1.1\hbase-protocol-1.1.1.jar;C:\Users\lenovo\.m2\repository\commons-io\commons-io\2.4\commons-io-2.4.jar;C:\Users\lenovo\.m2\repository\io\netty\netty-all\4.0.23.Final\netty-all-4.0.23.Final.jar;C:\Users\lenovo\.m2\repository\org\apache\htrace\htrace-core\3.1.0-incubating\htrace-core-3.1.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\jruby\jcodings\jcodings\1.0.8\jcodings-1.0.8.jar;C:\Users\lenovo\.m2\repository\org\jruby\joni\joni\2.1.2\joni-2.1.2.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-auth\2.5.1\hadoop-auth-2.5.1.jar;C:\Users\lenovo\.m2\repository\org\apache\directory\server\apacheds-kerberos-codec\2.0.0-M15\apacheds-kerberos-codec-2.0.0-M15.jar;C:\Users\lenovo\.m2\repository\org\apache\directory\server\apacheds-i18n\2.0.0-M15\apacheds-i18n-2.0.0-M15.jar;C:\Users\lenovo\.m2\repository\org\apache\directory\api\api-asn1-api\1.0.0-M20\api-asn1-api-1.0.0-M20.jar;C:\Users\lenovo\.m2\repository\org\apache\directory\api\api-util\1.0.0-M20\api-util-1.0.0-M20.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-core\2.5.1\hadoop-mapreduce-client-core-2.5.1.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-common\2.5.1\hadoop-yarn-common-2.5.1.jar;C:\Users\lenovo\.m2\repository\com\github\stephenc\findbugs\findbugs-annotations\1.3.9-1\findbugs-annotations-1.3.9-1.jar;C:\Users\lenovo\.m2\repository\junit\junit\4.11\junit-4.11.jar;C:\Users\lenovo\.m2\repository\org\hamcrest\hamcrest-core\1.3\hamcrest-core-1.3.jar;C:\Users\lenovo\.m2\repository\com\jolbox\bonecp\0.8.0.RELEASE\bonecp-0.8.0.RELEASE.jar;C:\Users\lenovo\.m2\repository\com\zaxxer\HikariCP\2.5.1\HikariCP-2.5.1.jar;C:\Users\lenovo\.m2\repository\org\apache\derby\derby\10.10.2.0\derby-10.10.2.0.jar;C:\Users\lenovo\.m2\repository\org\datanucleus\datanucleus-api-jdo\4.2.4\datanucleus-api-jdo-4.2.4.jar;C:\Users\lenovo\.m2\repository\org\datanucleus\datanucleus-core\4.1.17\datanucleus-core-4.1.17.jar;C:\Users\lenovo\.m2\repository\org\datanucleus\datanucleus-rdbms\4.1.19\datanucleus-rdbms-4.1.19.jar;C:\Users\lenovo\.m2\repository\commons-pool\commons-pool\1.5.4\commons-pool-1.5.4.jar;C:\Users\lenovo\.m2\repository\commons-dbcp\commons-dbcp\1.4\commons-dbcp-1.4.jar;C:\Users\lenovo\.m2\repository\javax\jdo\jdo-api\3.0.1\jdo-api-3.0.1.jar;C:\Users\lenovo\.m2\repository\javax\transaction\jta\1.1\jta-1.1.jar;C:\Users\lenovo\.m2\repository\org\datanucleus\javax.jdo\3.2.0-m3\javax.jdo-3.2.0-m3.jar;C:\Users\lenovo\.m2\repository\javax\transaction\transaction-api\1.1\transaction-api-1.1.jar;C:\Users\lenovo\.m2\repository\org\antlr\antlr-runtime\3.5.2\antlr-runtime-3.5.2.jar;C:\Users\lenovo\.m2\repository\co\cask\tephra\tephra-api\0.6.0\tephra-api-0.6.0.jar;C:\Users\lenovo\.m2\repository\co\cask\tephra\tephra-core\0.6.0\tephra-core-0.6.0.jar;C:\Users\lenovo\.m2\repository\com\google\code\gson\gson\2.2.4\gson-2.2.4.jar;C:\Users\lenovo\.m2\repository\com\google\inject\guice\3.0\guice-3.0.jar;C:\Users\lenovo\.m2\repository\javax\inject\javax.inject\1\javax.inject-1.jar;C:\Users\lenovo\.m2\repository\aopalliance\aopalliance\1.0\aopalliance-1.0.jar;C:\Users\lenovo\.m2\repository\com\google\inject\extensions\guice-assistedinject\3.0\guice-assistedinject-3.0.jar;C:\Users\lenovo\.m2\repository\it\unimi\dsi\fastutil\6.5.6\fastutil-6.5.6.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\twill-common\0.6.0-incubating\twill-common-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\tw
ill-core\0.6.0-incubating\twill-core-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\twill-api\0.6.0-incubating\twill-api-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\twill-discovery-api\0.6.0-incubating\twill-discovery-api-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\twill-discovery-core\0.6.0-incubating\twill-discovery-core-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\org\apache\twill\twill-zookeeper\0.6.0-incubating\twill-zookeeper-0.6.0-incubating.jar;C:\Users\lenovo\.m2\repository\co\cask\tephra\tephra-hbase-compat-1.0\0.6.0\tephra-hbase-compat-1.0-0.6.0.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-shims\2.3.4\hive-shims-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\shims\hive-shims-common\2.3.4\hive-shims-common-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\shims\hive-shims-0.23\2.3.4\hive-shims-0.23-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-server-resourcemanager\2.7.2\hadoop-yarn-server-resourcemanager-2.7.2.jar;C:\Users\lenovo\.m2\repository\com\sun\jersey\contribs\jersey-guice\1.9\jersey-guice-1.9.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-api\2.7.2\hadoop-yarn-api-2.7.2.jar;C:\Users\lenovo\.m2\repository\javax\xml\bind\jaxb-api\2.2.2\jaxb-api-2.2.2.jar;C:\Users\lenovo\.m2\repository\javax\xml\stream\stax-api\1.0-2\stax-api-1.0-2.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-server-common\2.7.2\hadoop-yarn-server-common-2.7.2.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-server-applicationhistoryservice\2.7.2\hadoop-yarn-server-applicationhistoryservice-2.7.2.jar;C:\Users\lenovo\.m2\repository\org\apache\hadoop\hadoop-yarn-server-web-proxy\2.7.2\hadoop-yarn-server-web-proxy-2.7.2.jar;C:\Users\lenovo\.m2\repository\org\fusesource\leveldbjni\leveldbjni-all\1.8\leveldbjni-all-1.8.jar;C:\Users\lenovo\.m2\repository\org\apache\zookeeper\zookeeper\3.4.6\zookeeper-3.4.6-tests.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\shims\hive-shims-scheduler\2.3.4\hive-shims-scheduler-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\hive\hive-service-rpc\2.3.4\hive-service-rpc-2.3.4.jar;C:\Users\lenovo\.m2\repository\org\apache\httpcomponents\httpclient\4.4\httpclient-4.4.jar;C:\Users\lenovo\.m2\repository\commons-logging\commons-logging\1.2\commons-logging-1.2.jar;C:\Users\lenovo\.m2\repository\org\apache\httpcomponents\httpcore\4.4\httpcore-4.4.jar;C:\Users\lenovo\.m2\repository\org\apache\thrift\libthrift\0.9.3\libthrift-0.9.3.jar;C:\Users\lenovo\.m2\repository\org\apache\zookeeper\zookeeper\3.4.6\zookeeper-3.4.6.jar;C:\Users\lenovo\.m2\repository\org\slf4j\slf4j-log4j12\1.6.1\slf4j-log4j12-1.6.1.jar;C:\Users\lenovo\.m2\repository\log4j\log4j\1.2.16\log4j-1.2.16.jar;C:\Users\lenovo\.m2\repository\io\netty\netty\3.7.0.Final\netty-3.7.0.Final.jar;C:\Users\lenovo\.m2\repository\org\apache\curator\curator-framework\2.7.1\curator-framework-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\apache\curator\curator-client\2.7.1\curator-client-2.7.1.jar;C:\Users\lenovo\.m2\repository\org\slf4j\slf4j-api\1.7.10\slf4j-api-1.7.10.jar" edu.qfnu.hadoop.HiveTest
--connecting to jdbc:hive2://192.168.79.11:10000/db1...
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/lenovo/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/lenovo/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
dataPath:/root/hiveTestData2.txt
--creating table db1.jdbc_test_table...
--use database:db1
--table infojdbc_test_table
ip	string
time	string
url	string
size	string
--load data...
--table data:
20111230111529;90dad1d0612387afb6998415bdc10349;qwuopalsdf;http://wenwen.soso.com/z/q343414009.htm
20111230111529;9b360c73c35af14c275764d420bc393c;afsd ga ;http://www.iqiyi.com/dongman/20110818/1395a8c4db6b4f29.html?src=alddm
20111230111529;df6faa8ab16f59b93719fa93f72205e2;asdfghjk;http://bj.58.com/cheliangfuwu/6389092208516x.shtml
20111230111529;3113b5c148c7085c5fbee22ffd94e302;xdfgyu;http://www.qire123.com/taiwan/xajh/
20111230111529;04419fadd01fb44f80343548e583f6a2;qwertyjo;http://wenwen.soso.com/z/q284237573.htm
20111230111529;6a2c3e8beae22ef6339124f38c0e99d2;sdfghjmo;http://world.newssc.org/system/2011/12/30/013410135.shtml
20111230111529;8162f344bd46082f0b6bd8d2057269c7;xglt;http://www.net114.com/764230118/
20111230111529;b93a73101eb18c1ec9ec92fdf0e402a2;tyd;http://www.nxhrss.gov.cn/
20111230111530;9d3237a005f38cefe0793f07ac51c72d;dhdkf;http://www.pclady.com.cn/fitness/sx/chest
20111230111530;765220445312840da6fccac59ab4406e;gskfhs;http://db.auto.sohu.com/model-2110.shtml
--table data count:
10
--insert table...
--table data
321;123;231;132
20111230111529;90dad1d0612387afb6998415bdc10349;qwuopalsdf;http://wenwen.soso.com/z/q343414009.htm
20111230111529;9b360c73c35af14c275764d420bc393c;afsd ga ;http://www.iqiyi.com/dongman/20110818/1395a8c4db6b4f29.html?src=alddm
20111230111529;df6faa8ab16f59b93719fa93f72205e2;asdfghjk;http://bj.58.com/cheliangfuwu/6389092208516x.shtml
20111230111529;3113b5c148c7085c5fbee22ffd94e302;xdfgyu;http://www.qire123.com/taiwan/xajh/
20111230111529;04419fadd01fb44f80343548e583f6a2;qwertyjo;http://wenwen.soso.com/z/q284237573.htm
20111230111529;6a2c3e8beae22ef6339124f38c0e99d2;sdfghjmo;http://world.newssc.org/system/2011/12/30/013410135.shtml
20111230111529;8162f344bd46082f0b6bd8d2057269c7;xglt;http://www.net114.com/764230118/
20111230111529;b93a73101eb18c1ec9ec92fdf0e402a2;tyd;http://www.nxhrss.gov.cn/
20111230111530;9d3237a005f38cefe0793f07ac51c72d;dhdkf;http://www.pclady.com.cn/fitness/sx/chest
20111230111530;765220445312840da6fccac59ab4406e;gskfhs;http://db.auto.sohu.com/model-2110.shtml

Process finished with exit code 0
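The run above comes from a small Java JDBC client (edu.qfnu.hadoop.HiveTest). For reference, here is a minimal sketch of such a client doing roughly the same steps (create table, load the tab-separated file, query it back, count the rows). This is my own illustration, not the author's actual HiveTest source; the JDBC URL, column names and file path are taken from the output above, and the root/empty-password credentials are an assumption.

// Sketch of a HiveServer2 JDBC client mirroring the output above.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSketch {
    private static final String URL = "jdbc:hive2://192.168.79.11:10000/db1";

    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");        // HiveServer2 JDBC driver
        try (Connection conn = DriverManager.getConnection(URL, "root", "");
             Statement stmt = conn.createStatement()) {

            // Table layout matches the "--table info" section of the output.
            stmt.execute("CREATE TABLE IF NOT EXISTS jdbc_test_table ("
                    + "`ip` STRING, `time` STRING, `url` STRING, `size` STRING) "
                    + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'");

            // The data file lives on the HiveServer2 host (dataPath in the output).
            stmt.execute("LOAD DATA LOCAL INPATH '/root/hiveTestData2.txt' "
                    + "OVERWRITE INTO TABLE jdbc_test_table");

            // Print the rows, joined with ';' like the "--table data" output.
            try (ResultSet rs = stmt.executeQuery("SELECT * FROM jdbc_test_table")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + ";" + rs.getString(2) + ";"
                            + rs.getString(3) + ";" + rs.getString(4));
                }
            }

            // Row count (executed as a job on the server side).
            try (ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM jdbc_test_table")) {
                if (rs.next()) {
                    System.out.println("count: " + rs.getLong(1));
                }
            }
        }
    }
}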

 

The test data is placed in the root user's home directory on the VM (/root/hiveTestData2.txt):

20111230111529	90dad1d0612387afb6998415bdc10349	qwuopalsdf	http://wenwen.soso.com/z/q343414009.htm
20111230111529	9b360c73c35af14c275764d420bc393c	afsd ga 	http://www.iqiyi.com/dongman/20110818/1395a8c4db6b4f29.html?src=alddm
20111230111529	df6faa8ab16f59b93719fa93f72205e2	asdfghjk	http://bj.58.com/cheliangfuwu/6389092208516x.shtml
20111230111529	3113b5c148c7085c5fbee22ffd94e302	xdfgyu	http://www.qire123.com/taiwan/xajh/
20111230111529	04419fadd01fb44f80343548e583f6a2	qwertyjo	http://wenwen.soso.com/z/q284237573.htm
20111230111529	6a2c3e8beae22ef6339124f38c0e99d2	sdfghjmo	http://world.newssc.org/system/2011/12/30/013410135.shtml
20111230111529	8162f344bd46082f0b6bd8d2057269c7	xglt	http://www.net114.com/764230118/
20111230111529	b93a73101eb18c1ec9ec92fdf0e402a2	tyd	http://www.nxhrss.gov.cn/
20111230111530	9d3237a005f38cefe0793f07ac51c72d	dhdkf	http://www.pclady.com.cn/fitness/sx/chest
20111230111530	765220445312840da6fccac59ab4406e	gskfhs	http://db.auto.sohu.com/model-2110.shtml

 

Data

Link: https://pan.baidu.com/s/1AHJ50DpeETezjnquOjnkfw
Extraction code: f7wz

 

6. If the following appears, HiveServer2 should be up and listening on port 10000:

[root@master conf]# netstat -nl | grep 10000
tcp        0      0 0.0.0.0:10000           0.0.0.0:*               LISTEN 

 

[root@master usr]# jps
7618 SecondaryNameNode
15508 Jps
8551 HMaster
8488 HQuorumPeer
7419 NameNode
9148 RunJar
7822 ResourceManager

 

 

Open a browser and visit 192.168.8.1:10002 (the IP address of the host where Hive is installed) to view the HiveServer2 web UI.

 

[root@master ~]# beeline
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/apache-hive-2.3.4-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Beeline version 2.3.4 by Apache Hive
beeline> !connect jdbc:hive2://192.168.79.11:10000
Connecting to jdbc:hive2://192.168.79.11:10000
Enter username for jdbc:hive2://192.168.79.11:10000: wuxiaoli
Enter password for jdbc:hive2://192.168.79.11:10000: ************(haveaniceday)
19/06/03 07:32:24 [main]: WARN jdbc.HiveConnection: Failed to connect to 192.168.79.11:10000
Error: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.79.11:10000: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate wuxiaoli (state=08S01,code=0)
beeline> 

9. Check the log files. (Note that nohup.out records an earlier start attempt where the service name was misspelled as "hiverserver2"; the correct name, listed under "Available Services", is hiveserver2.)

[root@master apache-hive-2.3.4-bin]# cd conf
[root@master conf]# ls
beeline-log4j2.properties.template  hive-exec-log4j2.properties.template  ivysettings.xml                         nohup.out
hive-default.xml.template           hive-log4j2.properties.template       llap-cli-log4j2.properties.template     parquet-logging.properties
hive-env.sh.template                hive-site.xml                         llap-daemon-log4j2.properties.template
[root@master conf]# 
[root@master conf]# cat nohup.out
which: no hbase in (/usr/apache-hive-2.3.4-bin/bin:/usr/hadoop/bin:/usr/hadoop/sbin:/usr/java/jdk/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/local/zookeeper-3.4.10/bin:/usr/local/zookeeper-3.4.10/conf:/home/bigdata/.local/bin:/home/bigdata/bin)
Service hiverserver2 not found
Available Services: beeline cleardanglingscratchdir cli hbaseimport hbaseschematool help hiveburninclient hiveserver2 hplsql jar lineage llapdump llap llapstatus metastore metatool orcfiledump rcfilecat schemaTool version 
[root@master conf]# 

Locate the hive.log log file:

[root@master conf]# find / -name hive.log
/tmp/root/hive.log
[root@master conf]# cd /tmp/root
[root@master conf]# cat hive.log

10. Modify Hadoop's core-site.xml. The impersonation error above means HDFS does not allow root (the user HiveServer2 runs as) to act on behalf of wuxiaoli, so add proxy-user entries for root, then restart Hadoop and HiveServer2 so the change takes effect:

<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>file:/usr/hadoop/tmp</value>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://master:9000</value>
    </property>

<property>
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
</property>
</configuration>
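After Hadoop and HiveServer2 have been restarted with these settings, the connection that beeline rejected above should go through. A hedged Java sketch of the same check from a JDBC client (user name and password taken from the beeline attempt above; the password only matters if authentication is actually configured):

// Sketch: re-trying the connection as wuxiaoli once hadoop.proxyuser.root.* is in place.
import java.sql.Connection;
import java.sql.DriverManager;

public class ConnectAsWuxiaoli {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://192.168.79.11:10000", "wuxiaoli", "haveaniceday")) {
            System.out.println("connected as wuxiaoli: " + !conn.isClosed());
        }
    }
}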

 
