Hive: JDBC Example

1) The file test4.txt in the local directory /home/hadoop/test (fields on each line are separated by a tab) contains:

[hadoop@master test]$ sudo vim test4.txt

1    dajiangtai
2    hadoop
3    hive
4    hbase
5    spark
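The table created later declares tab as its field delimiter, so Hive will split each line of test4.txt into a (key, value) pair exactly as a plain tab split would. A minimal illustrative sketch (the sample rows are hard-coded here instead of being read from the file):

```java
public class TabSplitDemo {
    public static void main(String[] args) {
        // Sample rows from test4.txt; "\t" is the tab separator used in the real file
        String[] rows = { "1\tdajiangtai", "2\thadoop", "3\thive", "4\thbase", "5\tspark" };
        for (String row : rows) {
            String[] fields = row.split("\t", -1); // split on tab, keeping empty trailing fields
            System.out.println("key=" + fields[0] + " value=" + fields[1]);
        }
    }
}
```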

2) Start hiveserver2

[hadoop@master test]$ cd ${HIVE_HOME}/bin
[hadoop@master bin]$ ll
total 32
-rwxr-xr-x 1 hadoop hadoop 881 Jan 30 2015 beeline
drwxr-xr-x 3 hadoop hadoop 4096 May 14 23:28 ext
-rwxr-xr-x 1 hadoop hadoop 7311 Jan 30 2015 hive
-rwxr-xr-x 1 hadoop hadoop 1900 Jan 30 2015 hive-config.sh
-rwxr-xr-x 1 hadoop hadoop 885 Jan 30 2015 hiveserver2
-rwxr-xr-x 1 hadoop hadoop 832 Jan 30 2015 metatool
-rwxr-xr-x 1 hadoop hadoop 884 Jan 30 2015 schematool

[hadoop@master bin]$ ./hiveserver2 
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/modules/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/modules/hive1.0.0/lib/hive-jdbc-1.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
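Before running the JDBC program it can help to confirm that the HiveServer2 Thrift port (10000 by default) is actually accepting connections. A minimal sketch using only a plain TCP probe; the hostname `master` and the `probe` helper are just illustrative, so adjust them for your cluster:

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class CheckHiveServer2 {
    // Returns a human-readable reachability message for host:port
    static String probe(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return "port " + port + " on " + host + " is reachable";
        } catch (Exception e) {
            return "port " + port + " on " + host + " is NOT reachable ("
                    + e.getClass().getSimpleName() + ")";
        }
    }

    public static void main(String[] args) {
        // "master" and 10000 match the walkthrough; change them for your environment
        System.out.println("HiveServer2 " + probe("master", 10000, 3000));
    }
}
```

If the port is not reachable, check that hiveserver2 finished starting and that no firewall blocks the port, before debugging the JDBC code itself.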

3) Program code

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class Hive {
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";// Hive JDBC driver class
    private static String url = "jdbc:hive2://master:10000/default";// HiveServer2 connection URL; HiveServer2 is the new service introduced in Hive 0.11.0
    private static String user = "hadoop";// a user with the required permissions on HDFS
    private static String password = "";// in non-secure mode the password is ignored; only the user is used to run queries
    private static String sql = "";
    private static ResultSet res;

    public static void main(String[] args) {
        try {
            Class.forName(driverName);// load the HiveServer2 driver
            Connection conn = DriverManager.getConnection(url, user, password);// connect to the database given by the URL
            Statement stmt = conn.createStatement();

            // name of the table to create
            String tableName = "testHiveDriverTable";

            /** Step 1: drop the table if it already exists **/
            sql = "drop table if exists " + tableName;// "if exists" avoids an error on the very first run
            stmt.execute(sql);

            /** Step 2: create the table **/
            sql = "create table " + tableName + " (key int, value string)  row format delimited fields terminated by '\t' STORED AS TEXTFILE";
            stmt.execute(sql);

            // run "show tables"
            sql = "show tables '" + tableName + "'";
            res = stmt.executeQuery(sql);
            if (res.next()) {
                System.out.println(res.getString(1));
            }

            // run "describe table"
            sql = "describe " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1) + "\t" + res.getString(2));
            }

            // run "load data into table"
            String filepath = "/home/hadoop/test/test4.txt";// local path on the node where the Hive service runs
            sql = "load data local inpath '" + filepath + "' into table " + tableName;
            stmt.execute(sql);

            // run "select * query"
            sql = "select * from " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getInt(1) + "\t" + res.getString(2));
            }

            // run a "regular hive query"; this one is compiled into a MapReduce job
            sql = "select count(*) from " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1));
            }
            res.close();
            stmt.close();
            conn.close();
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        } catch (SQLException e) {
            e.printStackTrace();
            System.exit(1);
        }
    }
}
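The HiveQL strings the program sends can be assembled and inspected on their own, without a live cluster. A small sketch (the helper names `dropSql`/`createSql`/`loadSql` are hypothetical, introduced only for illustration) that also notes a subtlety of the tab delimiter: in Java source, `'\t'` inside a string literal embeds a real tab character into the SQL, while `"\\t"` sends the two-character sequence `\t`, which is the conventional HiveQL spelling; Hive accepts either as the field delimiter:

```java
public class HiveQlStatements {
    static final String TABLE = "testHiveDriverTable";

    static String dropSql() {
        // "if exists" keeps the first run from failing when the table is absent
        return "drop table if exists " + TABLE;
    }

    static String createSql() {
        // "\\t" in Java source becomes the literal characters \t in the SQL text
        return "create table " + TABLE
                + " (key int, value string)"
                + " row format delimited fields terminated by '\\t'"
                + " stored as textfile";
    }

    static String loadSql(String filepath) {
        // filepath is local to the node running the Hive service
        return "load data local inpath '" + filepath + "' into table " + TABLE;
    }

    public static void main(String[] args) {
        System.out.println(dropSql());
        System.out.println(createSql());
        System.out.println(loadSql("/home/hadoop/test/test4.txt"));
    }
}
```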

4) Run results (right-click -> Run as -> Run on Hadoop)

Running it directly at this point throws an error; the fix is covered in the next post: HiveSQLException: Error while compiling statement: No privilege 'Create' found for outputs{database}

The run log is as follows:

2018-05-24 09:25:52,416 INFO [org.apache.hive.jdbc.Utils] - Supplied authorities: master:10000
2018-05-24 09:25:52,418 INFO [org.apache.hive.jdbc.Utils] - Resolved authority: master:10000
2018-05-24 09:25:52,508 INFO [org.apache.hive.jdbc.HiveConnection] - Will try to open client transport with JDBC Uri: jdbc:hive2://master:10000/default
2018-05-24 09:25:52,509 DEBUG [org.apache.thrift.transport.TSaslTransport] - opening transport org.apache.thrift.transport.TSaslClientTransport@3834d63f
2018-05-24 09:25:52,529 DEBUG [org.apache.thrift.transport.TSaslClientTransport] - Sending mechanism name PLAIN and initial response of length 17
2018-05-24 09:25:52,533 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Writing message with status START and payload length 5
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Writing message with status COMPLETE and payload length 17
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Start message handled
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Main negotiation loop complete
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: SASL Client receiving last message
2018-05-24 09:25:52,536 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Received message with status COMPLETE and payload length 0
2018-05-24 09:25:52,552 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 71
2018-05-24 09:25:52,749 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:52,768 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 137
2018-05-24 09:25:52,795 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:52,805 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:52,830 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:52,837 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:52,840 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:52,841 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 229
2018-05-24 09:25:52,914 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:52,914 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:53,270 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:53,270 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:53,271 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:53,272 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 140
2018-05-24 09:25:53,328 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:53,328 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:53,369 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:53,376 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 102
2018-05-24 09:25:53,429 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 131
2018-05-24 09:25:53,451 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:25:53,536 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 119
2018-05-24 09:25:53,551 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
testhivedrivertable
2018-05-24 09:25:53,551 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:53,554 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:53,554 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 135
2018-05-24 09:25:53,674 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:53,674 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:53,740 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:53,740 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 102
2018-05-24 09:25:53,741 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 273
2018-05-24 09:25:53,741 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:25:53,745 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 179
2018-05-24 09:25:53,746 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
key    int
2018-05-24 09:25:53,746 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
value    string
2018-05-24 09:25:53,746 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:25:53,747 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 138
2018-05-24 09:25:53,747 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:53,751 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:53,751 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 190
2018-05-24 09:25:53,833 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:53,833 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:54,721 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:54,721 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:54,722 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:54,723 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 140
2018-05-24 09:25:55,219 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:55,219 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:25:55,221 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:25:55,222 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 102
2018-05-24 09:25:55,223 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 186
2018-05-24 09:25:55,224 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:25:55,324 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 187
2018-05-24 09:25:55,329 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
1    dajiangtai
2018-05-24 09:25:55,329 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
2    hadoop
2018-05-24 09:25:55,329 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
3    hive
2018-05-24 09:25:55,330 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
4    hbase
2018-05-24 09:25:55,330 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
5    spark
2018-05-24 09:25:55,330 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:25:55,330 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 117
2018-05-24 09:25:55,331 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 96
2018-05-24 09:25:55,346 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 42
2018-05-24 09:25:55,346 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 147
2018-05-24 09:25:55,603 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 109
2018-05-24 09:25:55,603 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:00,604 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:00,604 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:05,605 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:05,605 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:10,607 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:10,607 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:15,609 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:15,609 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:20,611 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:20,611 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:25,613 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:25,613 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:30,614 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:30,614 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 100
2018-05-24 09:26:33,020 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 53
2018-05-24 09:26:33,020 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 102
2018-05-24 09:26:33,021 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 102
2018-05-24 09:26:33,021 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:26:33,043 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 104
2018-05-24 09:26:33,052 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
5
2018-05-24 09:26:33,052 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 112
2018-05-24 09:26:33,053 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 96
2018-05-24 09:26:33,057 DEBUG [org.apache.thrift.transport.TSaslTransport] - writing data length: 83
2018-05-24 09:26:33,103 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 40

"show tables" output:

testhivedrivertable

"describe table" output:

key    int
value    string

"select * query" output:

1    dajiangtai
2    hadoop
3    hive
4    hbase
5    spark

Alternatively, you can check the results from the Hive CLI on the cluster:

hive> show tables;
OK
copy_student1
copy_student2
copy_student3
copy_student4
employee
group_gender_agg
group_gender_sum
group_test
index_test
index_tmp
partition_test
student1
student2
test
test_view
testhivedrivertable
user
Time taken: 0.153 seconds, Fetched: 17 row(s)
hive> desc testhivedrivertable;
OK
key                     int                                         
value                   string                                      
Time taken: 0.184 seconds, Fetched: 2 row(s)
hive> select * from testhivedrivertable;
OK
1    dajiangtai
2    hadoop
3    hive
4    hbase
5    spark
Time taken: 0.346 seconds, Fetched: 5 row(s)

That wraps up the main content of this section. It all comes from my own learning process, and I hope it offers you some guidance. If you found it useful, please give it a like; if not, please bear with me, and do point out any mistakes. Follow me to get updates as soon as they are posted. Thanks!

