Notes on Connecting to Hive via JDBC

Connecting to Hive through JDBC makes it very easy to mine data on Hadoop and lowers the barrier to entry considerably. The connection itself is actually quite simple, but the first time you try it you will usually hit some baffling errors, so here are a few notes that I hope will help beginners.

First, my environment: Hadoop 2.4.0 + Hive 0.14.0.

 

Required jars (listed here as Eclipse .classpath entries; a quick sanity check follows the list):

 

    <classpathentry kind="lib" path="lib/commons-collections-3.2.1.jar"/>
    <classpathentry kind="lib" path="lib/commons-logging-1.1.3.jar"/>
    <classpathentry kind="lib" path="lib/hadoop-common-2.4.0.jar"/>
    <classpathentry kind="lib" path="lib/libfb303-0.9.0.jar"/>
    <classpathentry kind="lib" path="lib/httpclient-4.2.5.jar"/>
    <classpathentry kind="lib" path="lib/httpcore-4.2.5.jar"/>
    <classpathentry kind="lib" path="lib/log4j-1.2.16.jar"/>
    <classpathentry kind="lib" path="lib/slf4j-api-1.6.1.jar"/>
    <classpathentry kind="lib" path="lib/slf4j-log4j12-1.6.1.jar"/>
    <classpathentry kind="lib" path="lib/hive-exec-0.14.0.jar"/>
    <classpathentry kind="lib" path="lib/hive-jdbc-0.14.0.jar"/>
    <classpathentry kind="lib" path="lib/hive-metastore-0.14.0.jar"/>
    <classpathentry kind="lib" path="lib/hive-service-0.14.0.jar"/>
    <classpathentry kind="lib" path="lib/hadoop-mapreduce-client-core-2.4.0.jar"/>
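
A quick way to confirm those jars actually ended up on the classpath is to try loading one representative class from each of the key jars. The ClasspathCheck class below is only a rough sketch (the class name is made up, and it only probes a handful of classes), but a MISSING line usually points straight at the jar you forgot:

    import java.util.Arrays;
    import java.util.List;

    // Rough sanity check: try to load one class from each of the important jars above.
    public class ClasspathCheck {
        public static void main(String[] args) {
            List<String> classes = Arrays.asList(
                    "org.apache.hadoop.hive.jdbc.HiveDriver",   // hive-jdbc-0.14.0.jar
                    "org.apache.hadoop.conf.Configuration",     // hadoop-common-2.4.0.jar
                    "org.apache.http.client.HttpClient",        // httpclient-4.2.5.jar
                    "org.apache.commons.logging.Log",           // commons-logging-1.1.3.jar
                    "org.apache.log4j.Logger",                  // log4j-1.2.16.jar
                    "org.slf4j.Logger"                          // slf4j-api-1.6.1.jar
            );
            for (String name : classes) {
                try {
                    Class.forName(name);
                    System.out.println("OK      " + name);
                } catch (ClassNotFoundException e) {
                    System.out.println("MISSING " + name);
                }
            }
        }
    }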

 

If you run the program and see the following errors:

 

java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.

java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

 

Solutions:

 

1. In your code, call System.setProperty("hadoop.home.dir", "D:/hadoop-2.4.0"); before opening the connection, pointing it at your local Hadoop directory.

2. Download winutils.exe from https://github.com/srccodes/hadoop-common-2.2.0-bin/blob/master/bin/winutils.exe and put it under the bin directory of that Hadoop path (see the sketch after this list).
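
Both fixes can also be checked from the program itself. The sketch below is just an illustration: D:/hadoop-2.4.0 is the example path used throughout this post, so substitute wherever you unpacked Hadoop and copied winutils.exe:

    import java.io.File;

    // Sketch: apply both Windows workarounds before any Hadoop/Hive classes are touched.
    public class WindowsHadoopHomeFix {
        public static void main(String[] args) {
            // Fix 1: tell the Hadoop client code where the local Hadoop directory lives.
            System.setProperty("hadoop.home.dir", "D:/hadoop-2.4.0");

            // Fix 2: winutils.exe has to sit under <hadoop.home.dir>\bin on Windows.
            File winutils = new File("D:/hadoop-2.4.0/bin/winutils.exe");
            if (winutils.exists()) {
                System.out.println("winutils.exe found, hadoop.home.dir looks correct.");
            } else {
                System.err.println("winutils.exe missing at " + winutils.getAbsolutePath()
                        + " - download it and copy it there, or the IOException above will reappear.");
            }
        }
    }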

 

Here is the test code:


    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    import org.apache.log4j.BasicConfigurator;

    public class HiveJdbcClient2 {
        // HiveServer1 driver shipped with hive-jdbc; see below for the HiveServer2 variant.
        private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

        /**
         * @param args
         * @throws SQLException
         */
        public static void main(String[] args) throws SQLException {

            // On Windows, point hadoop.home.dir at the local Hadoop directory (see the notes above).
            System.setProperty("hadoop.home.dir", "D:/hadoop-2.4.0");

            // Basic console logging so the Hive/Hadoop client output is visible.
            BasicConfigurator.configure();

            try {
                Class.forName(driverName);
            } catch (ClassNotFoundException e) {
                e.printStackTrace();
                System.exit(1);
            }
            Connection con = DriverManager.getConnection("jdbc:hive://127.0.0.1:10000/default", "", "");
            Statement stmt = con.createStatement();

            //stmt.execute("drop table test");
            // DDL produces no result set, so use execute() rather than executeQuery().
            stmt.execute("create table if not exists test(amount DOUBLE, st_name string) " +
                    "ROW FORMAT DELIMITED " +
                    "FIELDS TERMINATED BY '\t' " +
                    "STORED AS TEXTFILE");

            //stmt.execute("load data inpath '/user/hive_data/test_data.txt' into table test");

            long st = System.currentTimeMillis();
            ResultSet res = stmt.executeQuery("select st_name, sum(amount) c from test group by st_name sort by c");
            int i = 0;
            while (res.next()) {
                i++;
                System.out.println(res.getString(1) + " - " + res.getString(2));
            }
            long en = System.currentTimeMillis();

            System.out.println("Total time: " + (en - st) + " ms, rows: " + i);

            stmt.close();
            con.close();
        }
    }
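
One more note: the code above talks to the old HiveServer1 (jdbc:hive:// plus org.apache.hadoop.hive.jdbc.HiveDriver). If your Hive 0.14 installation runs HiveServer2 instead (started with the hiveserver2 command, which is the usual setup for this version), the driver class and URL scheme are different. The following is a minimal sketch assuming the default port 10000 and no authentication configured:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Sketch: the same kind of client, but against HiveServer2 using the hive2 driver
    // that also ships in hive-jdbc-0.14.0.jar.
    public class HiveServer2Client {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");         // HiveServer2 driver class
            Connection con = DriverManager.getConnection(
                    "jdbc:hive2://127.0.0.1:10000/default", "", "");  // note the hive2 scheme
            Statement stmt = con.createStatement();
            ResultSet res = stmt.executeQuery("show tables");
            while (res.next()) {
                System.out.println(res.getString(1));
            }
            res.close();
            stmt.close();
            con.close();
        }
    }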

