Hive Remote Connection (hiveserver2 / beeline)

Preface

In production, we cannot log in to the machine where Hive is installed every time we need to run something. In this chapter we cover how to work with Hive over a remote connection.


Connecting to the Hive Server Remotely

The Hive server exposes its API over the Thrift protocol. We start the server side with bin/hiveserver2 and the client side with bin/beeline.

PS: older versions apparently started version 1 via hive hiveserver; my guess is that Hadoop 2.x adopting YARN for resource scheduling prompted Hive to upgrade accordingly.

Server side
localhost:bin Sean$ ./hiveserver2
2019-04-07 17:20:11: Starting HiveServer2
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/Sean/Software/Hive/apache-hive-2.3.4-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/Sean/Software/hadoop/hadoop-2.7.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Client side
localhost:bin Sean$ ./beeline
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/Sean/Software/Hive/apache-hive-2.3.4-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/Sean/Software/hadoop/hadoop-2.7.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Beeline version 2.3.4 by Apache Hive
beeline> !connect jdbc:hive2://localhost:10000
Connecting to jdbc:hive2://localhost:10000
Enter username for jdbc:hive2://localhost:10000: Sean
Enter password for jdbc:hive2://localhost:10000: *********
Connected to: Apache Hive (version 2.3.4)
Driver: Hive JDBC (version 2.3.4)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://localhost:10000> show databases;
+----------------+
| database_name  |
+----------------+
| default        |
| flow           |
+----------------+
2 rows selected (1.556 seconds)
0: jdbc:hive2://localhost:10000> use flow;
No rows affected (0.384 seconds)
0: jdbc:hive2://localhost:10000> show tables;
+---------------+
|   tab_name    |
+---------------+
| flowcount     |
| t_phone_type  |
+---------------+
2 rows selected (0.155 seconds)
  • Connect: !connect jdbc:hive2://localhost:10000
  • Enter the username and password of the current user (the user that launched the hiveserver2 process; other configured credentials also work).
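Beeline can also take the connection details on the command line, which is handy for scripting. A minimal sketch, assuming hiveserver2 is already listening on the default port 10000 and with the password as a placeholder:

```shell
# Connect non-interactively: -u is the JDBC URL, -n the username, -p the password.
# -e runs a single statement and exits; drop it to get an interactive prompt.
./beeline -u jdbc:hive2://localhost:10000 -n Sean -p 'your-password' -e 'show databases;'
```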

Others

I also came across configuration entries for Hive's working directories and so on. Recording them here as well:

  <!--
    <property>
      <name>hadoop.proxyuser.Sean.hosts</name>
      <value>*</value>
    </property>
    <property>
      <name>hadoop.proxyuser.Sean.groups</name>
      <value>*</value>
    </property>
  -->

  <!--
    <property>
      <name>hive.exec.scratchdir</name>
      <value>/tmp/hive</value>
    </property>
    <property>
      <name>hive.exec.local.scratchdir</name>
      <value>/Users/Sean/Software/Hive/tmp</value>
    </property>
    <property>
      <name>hive.downloaded.resources.dir</name>
      <value>/Users/Sean/Software/Hive/tmp/${hive.session.id}_resources</value>
    </property>
    <property>
      <name>hive.metastore.warehouse.dir</name>
      <value>/Users/Sean/Software/Hive/warehouse</value>
    </property>
  -->

  <!--thrift Server-->
 <!-- <property>
    <name>hive.server2.authentication</name>
    <value>NONE</value>
    <description>
      Expects one of [nosasl, none, ldap, kerberos, pam, custom].
      Client authentication types.
        NONE: no authentication check
        LDAP: LDAP/AD based authentication
        KERBEROS: Kerberos/GSSAPI authentication
        CUSTOM: Custom authentication provider
                (Use with property hive.server2.custom.authentication.class)
        PAM: Pluggable authentication module
        NOSASL:  Raw transport
    </description>
  </property> -->


Q & A

Of course, the experiment was not entirely smooth sailing; I ran into a few problems along the way.

Q1: Remote login authentication error

Enter username for jdbc:hive2://localhost:10000:
Enter password for jdbc:hive2://localhost:10000:
19/04/07 16:17:36 [main]: WARN jdbc.HiveConnection: Failed to connect to localhost:10000
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: Sean is not allowed to impersonate anonymous (state=08S01,code=0)

Solution: I had assumed that the default login would need no configuration at all. That turned out to be wishful thinking; a username is required. Retry by entering the user that launched the hiveserver2 process.

Q2: Remote login authentication error

beeline> !connect jdbc:hive2://localhost:10000
Connecting to jdbc:hive2://localhost:10000
Enter username for jdbc:hive2://localhost:10000: Sean
Enter password for jdbc:hive2://localhost:10000: *********
19/04/07 16:19:57 [main]: WARN jdbc.HiveConnection: Failed to connect to localhost:10000
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: Sean is not allowed to impersonate Sean (state=08S01,code=0)

Solution: even after entering a username and password, it still refused to let me log in, which was frustrating. After some searching, there turned out to be two causes:

  1. The Hadoop proxy-user (superuser) configuration for Hive was missing.
  2. Hive's default port was not open. (This should not normally be necessary; it was specific to my machine.)

Steps:
Fix 1: edit hadoop/etc/hadoop/core-site.xml (replace the Sean in hadoop.proxyuser.Sean.hosts with your own username):

<property>
  <name>hadoop.proxyuser.Sean.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.Sean.groups</name>
  <value>*</value>
</property>
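After editing core-site.xml, the proxy-user settings only take effect once Hadoop rereads them. Either restart the Hadoop daemons, or, as I understand it, ask them to reload the configuration in place:

```shell
# Ask the NameNode and ResourceManager to reload the proxy-user configuration
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration
```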

Fix 2: hive-site.xml

<property>
  <name>hive.server2.thrift.port</name>
  <value>10000</value>
  <description>Port number of HiveServer2 Thrift interface when hive.server2.transport.mode is 'binary'.</description>
</property>
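To confirm whether the Thrift port is actually open, a quick check from the shell (port 10000 assumed, per the configuration above):

```shell
# Exit code 0 means something is listening on the port
nc -z localhost 10000 && echo "port 10000 is open"
# Alternatively, show which process holds the port
lsof -i :10000
```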

Q3: DROP TABLE fails.
Server-side error:

localhost:bin Sean$ ./hiveserver2
2019-04-07 17:13:30: Starting HiveServer2
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/Sean/Software/Hive/apache-hive-2.3.4-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/Sean/Software/hadoop/hadoop-2.7.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'OPTION SQL_SELECT_LIMIT=DEFAULT' at line 1)
FAILED: SemanticException Unable to fetch table t_phone_type. You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'OPTION SQL_SELECT_LIMIT=DEFAULT' at line 1

Client-side error:

Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'OPTION SQL_SELECT_LIMIT=DEFAULT' at line 1
	at sun.reflect.GeneratedConstructorAccessor60.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
	at com.mysql.jdbc.Util.getInstance(Util.java:381)
	at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1030)
	at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3536)
	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3468)
	at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1957)
	at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2107)
	at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2642)
	at com.mysql.jdbc.StatementImpl.executeSimpleNonQuery(StatementImpl.java:1531)
	at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1433)
	at com.mysql.jdbc.ConnectionImpl.getTransactionIsolation(ConnectionImpl.java:3216)
	at com.jolbox.bonecp.ConnectionHandle.getTransactionIsolation(ConnectionHandle.java:825)
	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:427)
	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getXAResource(ConnectionFactoryImpl.java:361)
	at org.datanucleus.store.connection.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:316)
	at org.datanucleus.store.connection.AbstractConnectionFactory.getConnection(AbstractConnectionFactory.java:84)
	at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:347)
	at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:310)
	at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:591)
	at org.datanucleus.store.query.Query.executeQuery(Query.java:1855)
	at org.datanucleus.store.query.Query.executeWithArray(Query.java:1744)
	at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:368)
	... 53 more (state=08S01,code=1)
0: jdbc:hive2://localhost:10000> DROP TABLE t_phone_type;

Solution: reading the exception, it felt like a MySQL keyword problem. After searching online, it turned out to be the MySQL connector jar: in hive/lib, replacing mysql-connector-java-5.1.08.jar with mysql-connector-java-5.1.44.jar and restarting hiveserver2 resolved it.
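The jar swap itself is just a file replacement under hive/lib. A sketch, with the Hive path taken from my installation above; the backup and download locations are placeholders, and the newer connector jar is assumed to have been downloaded separately:

```shell
HIVE_LIB=/Users/Sean/Software/Hive/apache-hive-2.3.4-bin/lib
# Move the old connector out of the way, then drop in the newer one
mv "$HIVE_LIB"/mysql-connector-java-5.1.08.jar ~/backup/
cp ~/Downloads/mysql-connector-java-5.1.44.jar "$HIVE_LIB"/
# Restart hiveserver2 afterwards for the change to take effect
```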


