Hive: the Beeline Client, Common Commands, and Small Tips

Original: https://blog.csdn.net/qq_41028958/article/details/80861531

Hive Beeline client: common commands (supports multiple concurrent connections)

Start the hiveserver2 service so that it can accept multiple client connection requests,
letting clients operate on the Hive data warehouse over JDBC.
-------------------------------------------
[hadoop@master data]$ hive --service hiveserver2 &

First, start the server:    hive --service hiveserver2 &

View the help information:

[hadoop@master bin]$ hive --service beeline --help
Usage: java org.apache.hive.cli.beeline.BeeLine 
   -u <database url>               the JDBC URL to connect to
   -r                              reconnect to last saved connect url (in conjunction with !save)
   -n <username>                   the username to connect as
   -p <password>                   the password to connect as
   -d <driver class>               the driver class to use
Method 1: pass the JDBC URL on the command line

[hadoop@master bin]$ hive --service beeline -u jdbc:hive2://master:10000/myhive            // connect via JDBC
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/soft/hive-2.1.1/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/soft/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://master:10000/myhive
Connected to: Apache Hive (version 2.1.1)
Driver: Hive JDBC (version 2.1.1)
18/06/29 20:46:49 [main]: WARN jdbc.HiveConnection: Request to set autoCommit to false; Hive does not support autoCommit=false.
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 2.1.1 by Apache Hive
0: jdbc:hive2://master:10000/myhive> 
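Besides !connect and !sh, Beeline accepts a number of !-prefixed meta commands at its prompt. The short list below is an illustrative sketch of commonly used ones (run !help inside Beeline for the authoritative set):

```sql
-- Beeline meta commands (typed at the beeline> or connected prompt)
!help                 -- list all available meta commands
!tables               -- list tables visible through the current connection
!outputformat csv2    -- change result formatting (table, csv2, tsv2, ...)
!sh date              -- run a shell command on the client machine
!quit                 -- close the connection and exit Beeline
```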
Method 2: start beeline first, then connect with !connect

[hadoop@master bin]$ beeline
Beeline version 2.1.1 by Apache Hive
beeline> !connect jdbc:hive2://192.168.178.100:10000 hadoop hadoop
Connecting to jdbc:hive2://192.168.178.100:10000
Connected to: Apache Hive (version 2.1.1)
Driver: Hive JDBC (version 2.1.1)
18/06/29 20:49:11 [main]: WARN jdbc.HiveConnection: Request to set autoCommit to false; Hive does not support autoCommit=false.
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://192.168.178.100:10000> show databases;
+----------------+--+
| database_name  |
+----------------+--+
| default        |
| myhive         |
+----------------+--+
2 rows selected (2.091 seconds)
0: jdbc:hive2://192.168.178.100:10000> 
Local login (connect via localhost)

[hadoop@master bin]$  hive --service beeline -u jdbc:hive2://localhost:10000/myhive
Connecting to jdbc:hive2://localhost:10000/myhive
Connected to: Apache Hive (version 2.1.1)
Driver: Hive JDBC (version 2.1.1)
18/06/29 20:51:07 [main]: WARN jdbc.HiveConnection: Request to set autoCommit to false; Hive does not support autoCommit=false.
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 2.1.1 by Apache Hive
0: jdbc:hive2://localhost:10000/myhive> show databases;
+----------------+--+
| database_name  |
+----------------+--+
| default        |
| myhive         |
+----------------+--+
2 rows selected (0.152 seconds)
0: jdbc:hive2://localhost:10000/myhive> 
Executing shell commands from within Beeline:

0: jdbc:hive2://localhost:10000/myhive> !sh hdfs dfs -lsr /user/hive/warehouse
lsr: DEPRECATED: Please use 'ls -R' instead.
drwxrwxrwx   - hadoop supergroup          0 2018-06-29 13:55 /user/hive/warehouse/myhive.db
drwxrwxrwx   - hadoop supergroup          0 2018-06-29 10:51 /user/hive/warehouse/myhive.db/employee
-rwxrwxrwx   3 hadoop supergroup         16 2018-06-29 10:51 /user/hive/warehouse/myhive.db/employee/000000_0
-rw-r--r--   3 hadoop supergroup         62 2018-06-29 10:42 /user/hive/warehouse/myhive.db/employee/data.txt
-rwxrwxrwx   3 hadoop supergroup        164 2018-06-29 10:30 /user/hive/warehouse/myhive.db/employee/employees.txt
drwxrwxrwx   - hadoop supergroup          0 2018-06-29 10:58 /user/hive/warehouse/myhive.db/employee1
-rwxrwxrwx   3 hadoop supergroup         62 2018-06-29 10:56 /user/hive/warehouse/myhive.db/employee1/data.txt
drwxrwxrwx   - hadoop supergroup          0 2018-06-27 23:42 /user/hive/warehouse/weblogs
Hive data types
--------------------------------------------
Primitive types:
TINYINT (byte)        SMALLINT (short)        INT (int)        BIGINT (long)        BOOLEAN (boolean)        FLOAT (float)        DOUBLE (double)
STRING            character string, quoted with '' or ""
TIMESTAMP         timestamp
BINARY            byte array
Collection types:
STRUCT    struct('John','Doe')
MAP       map('first','John','last','Doe')
ARRAY     array('John','Doe')
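As a sketch of how the collection types above are declared and queried (the table and column names here are made up for illustration, not from the original post):

```sql
-- Hypothetical table using Hive's collection types
CREATE TABLE contacts (
  id      INT,
  name    STRUCT<first:STRING, last:STRING>,
  phones  MAP<STRING, STRING>,
  emails  ARRAY<STRING>
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY ','
  COLLECTION ITEMS TERMINATED BY '|'
  MAP KEYS TERMINATED BY ':';

-- Element access: dot notation for STRUCT fields,
-- [] for MAP keys and (zero-based) ARRAY indexes
SELECT name.first, phones['home'], emails[0] FROM contacts;
```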
Running a one-shot command in Hive

[hadoop@master ~]$ hive -e "select * from employee"                // -e: execute a single statement and exit
 
Logging initialized using configuration in file:/usr/local/soft/hive-2.1.1/conf/hive-log4j2.properties Async: true
FAILED: SemanticException [Error 10001]: Line 1:14 Table not found 'employee'
[hadoop@master ~]$ hive -e "select * from myhive.employee"
 
Logging initialized using configuration in file:/usr/local/soft/hive-2.1.1/conf/hive-log4j2.properties Async: true
OK
employee.eud    employee.name    employee.salary    employee.destination
1208    jack    NULL    NULL
1206    tom    5000    Proof reader
1207    liming    40000    Technical writer
1201    Gopal    45000    Technical manager
1202    Manisha    45000    Proof reader
1203    Masthanvali    40000    Technical writer
1204    Krian    40000    Hr Admin
1205    Kranthi    30000    Op Admin
Hive CLI commands (simple operations)
-----------------------

hive (myhive)> dfs -lsr /user/hive/warehouse/myhive.db/;        // run an HDFS dfs command
lsr: DEPRECATED: Please use 'ls -R' instead.
drwxrwxrwx   - hadoop supergroup          0 2018-06-29 10:51 /user/hive/warehouse/myhive.db/employee
-rwxrwxrwx   3 hadoop supergroup         16 2018-06-29 10:51 /user/hive/warehouse/myhive.db/employee/000000_0
-rw-r--r--   3 hadoop supergroup         62 2018-06-29 10:42 /user/hive/warehouse/myhive.db/employee/data.txt
-rwxrwxrwx   3 hadoop supergroup        164 2018-06-29 10:30 /user/hive/warehouse/myhive.db/employee/employees.txt
drwxrwxrwx   - hadoop supergroup          0 2018-06-29 10:58 /user/hive/warehouse/myhive.db/employee1
-rwxrwxrwx   3 hadoop supergroup         62 2018-06-29 10:56 /user/hive/warehouse/myhive.db/employee1/data.txt


hive (myhive)> !ls /home/hadoop/;
公共
模板
视频
图片
文档
下载
音乐
桌面


hive (myhive)> -- this is a comment                    // HiveQL comments start with --
hive (myhive)> create database hive3 with dbproperties('author'='jhy','createtime'='today')        // create a database
             > ;
OK
Time taken: 0.177 seconds
hive (myhive)> alter database hive3 set dbproperties('author'='you');        // alter database properties
OK
Time taken: 0.076 seconds
 
hive (myhive)> create table hive1.teswt1(id int) tblproperties('author'='jhy');                // create a table
OK
Time taken: 0.137 seconds
hive (myhive)> create table hive1.testwe1(id int) LOCATION '/input/';
OK
Time taken: 0.081 seconds
hive (myhive)> desc extended hive1.testwe1;            // show extended table information
OK
col_name    data_type    comment
id                      int                                         
          
Detailed Table Information    Table(tableName:testwe1, dbName:hive1, owner:hadoop, createTime:1530327566, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:id, type:int, comment:null)], location:hdfs://master:9000/input, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{}), storedAsSubDirectories:false), partitionKeys:[], parameters:{transient_lastDdlTime=1530327566, totalSize=36, numFiles=1}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE)    
Time taken: 0.104 seconds, Fetched: 3 row(s)
hive (myhive)> desc formatted hive1.testwe1;                // show formatted table information
OK
col_name    data_type    comment
# col_name                data_type               comment             
          
id                      int                                         
          
# Detailed Table Information          
Database:               hive1                    
Owner:                  hadoop                   
CreateTime:             Sat Jun 30 10:59:26 CST 2018     
LastAccessTime:         UNKNOWN                  
Retention:              0                        
Location:               hdfs://master:9000/input     
Table Type:             MANAGED_TABLE            
Table Parameters:          
    numFiles                1                   
    totalSize               36                  
    transient_lastDdlTime    1530327566          
          
# Storage Information          
SerDe Library:          org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe     
InputFormat:            org.apache.hadoop.mapred.TextInputFormat     
OutputFormat:           org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat     
Compressed:             No                       
Num Buckets:            -1                       
Bucket Columns:         []                       
Sort Columns:           []                       
Storage Desc Params:          
    serialization.format    1                   
Time taken: 0.095 seconds, Fetched: 27 row(s)
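Note that testwe1 above is still a MANAGED_TABLE (as both desc outputs show) even though its LOCATION points at /input/, so dropping it would delete the files there. When the data should outlive the table, an EXTERNAL table is the usual choice. A sketch, with a hypothetical table name:

```sql
-- EXTERNAL: dropping the table removes only the metadata;
-- the files under LOCATION are left in place.
-- (A managed table's data directory is deleted along with it.)
CREATE EXTERNAL TABLE hive1.testwe1_ext (id INT)
LOCATION '/input/';

DROP TABLE hive1.testwe1_ext;   -- files under /input/ remain
```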
hive (myhive)> show tables in hive1;            // list tables in the given database (defaults to the current one)
OK
tab_name
test1
test3
testwe1
teswt1
Time taken: 0.031 seconds, Fetched: 4 row(s)
set hive.cli.print.header=true;        // print column names (header) above query results
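A few related CLI session settings that are often toggled together (these take effect for the current session only; they can also be set permanently in hive-site.xml):

```sql
SET hive.cli.print.header=true;       -- print column names above results
SET hive.cli.print.current.db=true;   -- show the current database in the prompt
USE myhive;                           -- switch the working database
```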

 
