I'm trying to create a connection via JDBC to Impala using the Hive2 connector. But I'm getting this error:
Exception in thread "main" java.lang.NoSuchFieldError: HIVE_CLI_SERVICE_PROTOCOL_V7
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:175)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at dsnoc.dsnoc_api.dolar.getDolarFromImpala(dolar.java:145)
at dsnoc.dsnoc_api.dsnoc.main(dsnoc.java:75)
I don't know if it is a dependency compatibility issue:
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>1.1.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>1.1.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
</dependency>
I'm using CDH 5.8.0 with Hive 1.1.0 and Hadoop 2.6.0.
Or maybe it is a code issue:
public static double getDolarFromImpala(String date) {
    double dolar = 0.0;
    try {
        Class.forName(JDBC_DRIVER_HIVE);
        String sql = "SELECT valor FROM dolar WHERE fecha = '" + date + "'";
        Connection con = DriverManager.getConnection(JDBC_HIVE2_URL, USERNAME, PASSWORD);
        Statement stmt = con.createStatement();
        ResultSet rs = stmt.executeQuery(sql);
        while (rs.next()) {
            dolar = rs.getDouble("valor");
        }
        stmt.close();
        con.close();
    } catch (SQLException se) {
        // Handle errors for JDBC
        se.printStackTrace();
    } catch (Exception e) {
        // Handle errors for Class.forName
        e.printStackTrace();
    }
    return dolar;
}
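As an aside, concatenating `date` directly into the SQL string is open to SQL injection and breaks on dates containing quotes. A parameterized variant might look like the sketch below (the URL, user, and password are passed as parameters here since `JDBC_HIVE2_URL`, `USERNAME`, and `PASSWORD` are constants defined elsewhere in my class):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class DolarDao {
    // Parameterized query: the driver binds 'fecha' safely instead of
    // splicing the raw string into the SQL text.
    static final String SQL = "SELECT valor FROM dolar WHERE fecha = ?";

    public static double getDolarFromImpala(String url, String user,
                                            String password, String date) {
        double dolar = 0.0;
        // try-with-resources closes the connection and statement even
        // when an exception is thrown mid-query
        try (Connection con = DriverManager.getConnection(url, user, password);
             PreparedStatement stmt = con.prepareStatement(SQL)) {
            stmt.setString(1, date);
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    dolar = rs.getDouble("valor");
                }
            }
        } catch (SQLException se) {
            // No matching driver or query failure: fall through to 0.0
            se.printStackTrace();
        }
        return dolar;
    }
}
```

With try-with-resources, the explicit `stmt.close()` / `con.close()` calls (which the original skips on error) are no longer needed. `Class.forName` is also unnecessary with JDBC 4+ drivers, which register themselves automatically.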
But I don't think it is, because I tried it with the Impala JDBC driver and it worked.
The other thing is that I'm not using the Impala JDBC driver because it doesn't read or send the USERNAME and PASSWORD, throwing this error:
[Simba][ImpalaJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: TStatus(statusCode:ERROR_STATUS, sqlState:HY000, errorMessage:AuthorizationException: User '' does not have privileges to execute 'SELECT'
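(For what it's worth, the Cloudera/Simba Impala driver typically takes credentials through connection-URL properties rather than the `DriverManager.getConnection` user/password arguments, which would explain the empty user in the error. A sketch of building such a URL; the host, port, and `AuthMech=3` (username/password authentication) values here are assumptions for a plain LDAP-style setup:)

```java
public class ImpalaUrl {
    // Builds a connection URL for the Cloudera/Simba Impala driver.
    // AuthMech=3 selects username/password authentication; the host and
    // port 21050 (Impala's default HiveServer2-protocol port) are
    // placeholders for illustration.
    static String impalaUrl(String host, int port, String user, String password) {
        return "jdbc:impala://" + host + ":" + port
                + ";AuthMech=3"
                + ";UID=" + user
                + ";PWD=" + password;
    }
}
```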
Regards,
Solution
Try these jars:
hive-jdbc-2.1.1-standalone.jar
hadoop-common-2.7.3.jar
Dependencies:
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>2.1.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.3</version>
</dependency>
You can find these jars here: https://search.maven.org
These jars work for me in JMeter.
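(A `NoSuchFieldError` like this almost always means two incompatible versions of a class are on the classpath: hive-jdbc was compiled against a hive-service that defines `HIVE_CLI_SERVICE_PROTOCOL_V7`, but an older copy is being loaded at runtime. One way to diagnose it is to print the code source of the suspect class. A sketch, using its own class as a stand-in since it would be `org.apache.hive.jdbc.HiveConnection` in the real case:)

```java
import java.net.URL;
import java.security.CodeSource;

public class WhichJar {
    // Returns the jar or directory a class was loaded from, or null
    // when the class has no code source (typical for JDK classes).
    static String locationOf(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        if (src == null) {
            return null;
        }
        URL loc = src.getLocation();
        return loc == null ? null : loc.toString();
    }

    public static void main(String[] args) throws Exception {
        // For the real problem you would check:
        //   locationOf(Class.forName("org.apache.hive.jdbc.HiveConnection"))
        // and verify that only ONE hive-jdbc jar appears on the classpath.
        System.out.println(locationOf(WhichJar.class));
    }
}
```

If the printed location is not the jar version you expect, a transitive dependency (hive-exec bundles many Hive classes) is likely shadowing it.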