(1) Common Problems
First, a note on versions: I am using CDH 5.12.0 and Spring Boot 2.0.8.
When connecting Spring Boot to Hive over JDBC, you will typically run into all sorts of Java jar conflicts.
Here are several common errors:
1.1 org.apache.thrift.TApplicationException: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null)
The likely cause is that the project's hive-jdbc version does not match the server's; replacing it with a version that matches the server fixes it. I chose the following versions:
- hive-jdbc
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>1.0.0</version>
</dependency>
- hadoop-common
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.4.1</version>
</dependency>
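Before swapping versions, it can help to confirm which hive-jdbc client the classpath actually resolves. The sketch below is my own illustration (the `DriverVersion` class is not part of the project); it asks `DriverManager` which registered driver claims a URL, without opening a connection:

```java
// Illustrative helper, not from the project: report the version of whichever
// registered JDBC driver claims a given URL. DriverManager.getDriver matches
// via Driver.acceptsURL and never connects, so no live HiveServer2 is needed.
// Older hive-jdbc jars may require Class.forName("org.apache.hive.jdbc.HiveDriver")
// first so that the driver registers itself.
class DriverVersion {
    public static String driverVersionFor(String url) {
        try {
            java.sql.Driver d = java.sql.DriverManager.getDriver(url);
            return d.getMajorVersion() + "." + d.getMinorVersion();
        } catch (java.sql.SQLException e) {
            return "no driver registered for " + url;
        }
    }
}
```

Calling `DriverVersion.driverVersionFor("jdbc:hive2://cdh-hadoop1:10000/default")` then shows the client version to compare against the HiveServer2 build.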
1.2 jar conflict caused by servlet-api
java.util.concurrent.ExecutionException: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Tomcat].StandardHost[localhost].TomcatEmbeddedContext[]]
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:942)
at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:872)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1423)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1413)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Tomcat].StandardHost[localhost].TomcatEmbeddedContext[]]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:167)
... 6 common frames omitted
Caused by: org.apache.catalina.LifecycleException: Failed to start component [Pipeline[StandardEngine[Tomcat].StandardHost[localhost].TomcatEmbeddedContext[]]]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:167)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5188)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
... 6 common frames omitted
Caused by: org.apache.catalina.LifecycleException: Failed to start component [org.apache.catalina.authenticator.NonLoginAuthenticator[]]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:167)
at org.apache.catalina.core.StandardPipeline.startInternal(StandardPipeline.java:182)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
... 8 common frames omitted
Caused by: java.lang.NoSuchMethodError: javax.servlet.ServletContext.getVirtualServerName()Ljava/lang/String;
Exclude servlet-api from both hive-jdbc and hadoop-common:
<exclusion>
<artifactId>servlet-api</artifactId>
<groupId>javax.servlet</groupId>
</exclusion>
1.3 jar conflict caused by swagger2
2020-07-10 20:41:38.498 ==> [http-nio-8080-exec-1] ==> ERROR o.a.c.core.ContainerBase.[Tomcat].[localhost] - Exception Processing /swagger-ui.html
java.lang.SecurityException: AuthConfigFactory error: java.lang.ClassNotFoundException: org.apache.geronimo.components.jaspi.AuthConfigFactoryImpl
at javax.security.auth.message.config.AuthConfigFactory.getFactory(AuthConfigFactory.java:75)
at org.apache.catalina.authenticator.AuthenticatorBase.findJaspicProvider(AuthenticatorBase.java:1245)
at org.apache.catalina.authenticator.AuthenticatorBase.getJaspicProvider(AuthenticatorBase.java:1235)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:484)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:81)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:342)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:800)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:806)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1498)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.geronimo.components.jaspi.AuthConfigFactoryImpl
at org.springframework.boot.web.embedded.tomcat.TomcatEmbeddedWebappClassLoader.loadClass(TomcatEmbeddedWebappClassLoader.java:70)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1172)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at javax.security.auth.message.config.AuthConfigFactory$3.run(AuthConfigFactory.java:66)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.message.config.AuthConfigFactory.getFactory(AuthConfigFactory.java:62)
... 16 common frames omitted
2020-07-10 20:41:38.508 ==> [http-nio-8080-exec-1] ==> INFO o.a.c.core.ContainerBase.[Tomcat].[localhost].[/] - Initializing Spring FrameworkServlet 'dispatcherServlet'
2020-07-10 20:41:38.730 ==> [http-nio-8080-exec-3] ==> ERROR o.a.c.core.ContainerBase.[Tomcat].[localhost] - Exception Processing /favicon.ico
java.lang.SecurityException: AuthConfigFactory error: java.lang.ClassNotFoundException: org.apache.geronimo.components.jaspi.AuthConfigFactoryImpl
Exclude the jaspic spec jar from hive-jdbc:
<exclusion>
<artifactId>geronimo-jaspic_1.0_spec</artifactId>
<groupId>org.apache.geronimo.specs</groupId>
</exclusion>
1.4 After integrating swagger2, the FluentIterable.append method in the guava package cannot be found
<!-- my swagger2 configuration -->
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger2</artifactId>
<version>2.4.0</version>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger-ui</artifactId>
<version>2.4.0</version>
</dependency>
This conflict arises because hive-jdbc and swagger2 depend on different guava versions. Excluding either copy still causes errors, so instead pin a single newer version that both are compatible with:
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>20.0</version>
</dependency>
</dependencies>
</dependencyManagement>
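To confirm at runtime that the pinned guava actually won, a generic check of my own (the `WhichJar` class below is not part of the project) prints which jar a class was loaded from:

```java
// Generic sketch: report where a class was loaded from. In the application,
// locationOf(com.google.common.collect.FluentIterable.class) would reveal
// which guava jar is actually on the classpath.
class WhichJar {
    public static String locationOf(Class<?> cls) {
        java.security.CodeSource src = cls.getProtectionDomain().getCodeSource();
        // JDK platform classes report no code source
        return src == null ? "bootstrap classpath" : src.getLocation().toString();
    }
}
```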
(2) Working Code
2.1 pom configuration
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.bigdata</groupId>
<artifactId>bigdata-exchange-center</artifactId>
<version>0.0.1-SNAPSHOT</version>
</parent>
<artifactId>exchange-start</artifactId>
<dependencies>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>1.0.0</version>
<exclusions>
<exclusion>
<artifactId>slf4j-log4j12</artifactId>
<groupId>org.slf4j</groupId>
</exclusion>
<exclusion>
<artifactId>servlet-api</artifactId>
<groupId>javax.servlet</groupId>
</exclusion>
<exclusion>
<artifactId>geronimo-jaspic_1.0_spec</artifactId>
<groupId>org.apache.geronimo.specs</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.4.1</version>
<exclusions>
<exclusion>
<artifactId>slf4j-log4j12</artifactId>
<groupId>org.slf4j</groupId>
</exclusion>
<exclusion>
<artifactId>servlet-api</artifactId>
<groupId>javax.servlet</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>jdk.tools</groupId>
<artifactId>jdk.tools</artifactId>
<version>1.8</version>
<scope>system</scope>
<systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger2</artifactId>
<version>2.4.0</version>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger-ui</artifactId>
<version>2.4.0</version>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>20.0</version>
</dependency>
</dependencies>
</dependencyManagement>
<build>
<finalName>exchange_center</finalName>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<!-- application entry point -->
<mainClass>com.bigdata.StartApplication</mainClass>
<includeSystemScope>true</includeSystemScope>
</configuration>
<executions>
<execution>
<goals>
<!-- repackage so that system-scoped jars are bundled as well -->
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
<encoding>UTF-8</encoding>
</configuration>
</plugin>
</plugins>
</build>
</project>
2.2 JDBC utility class
import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
public class DbUtils {
private static String jdbcUrl;
private static String userName;
private static String pwd;
static {
try {
Properties p = new Properties();
p.load(ClassLoader.getSystemClassLoader().getResourceAsStream("config.properties"));
jdbcUrl = p.getProperty("hive.url");
userName = p.getProperty("hive.user");
pwd = p.getProperty("hive.pwd");
} catch (IOException e) {
e.printStackTrace();
}
}
/**
 * Obtains a Hive {@link Connection}.
 *
 * @return the connection, or null if it could not be opened
 */
public static Connection getHiveConnection() {
Connection conn = null;
try {
Class.forName("org.apache.hive.jdbc.HiveDriver");
//"jdbc:hive2://cdh-hadoop1:10000/default"
conn = DriverManager.getConnection(jdbcUrl, userName, pwd);
} catch (ClassNotFoundException | SQLException e) {
e.printStackTrace();
}
return conn;
}
/**
 * Fetches data from Hive with the given sql.
 *
 * @param db  the database to use
 * @param sql the query to run
 * @return the first column of every result row
 */
public static List<String> getMessageFromHive(String db, String sql) {
// This method is only used for two kinds of statements:
// show table partitions
// show create table
List<String> list = new ArrayList<>();
try (Connection con = getHiveConnection();
     Statement stmt = con.createStatement()) {
    // try-with-resources closes the connection even if the query throws
    stmt.execute("use " + db);
    try (ResultSet rs = stmt.executeQuery(sql)) {
        while (rs.next()) {
            list.add(rs.getString(1));
        }
    }
} catch (SQLException e) {
    e.printStackTrace();
}
return list;
}
/**
 * Fetches a single value from Hive with the given sql.
 *
 * @param sql the query to run
 * @return the first column of the last result row, or "blank" if there are no rows
 */
public static String getMessageFromHive(String sql) {
String result = "blank";
try (Connection con = getHiveConnection();
     Statement stmt = con.createStatement();
     ResultSet rs = stmt.executeQuery(sql)) {
    while (rs.next()) {
        result = rs.getString(1);
    }
} catch (SQLException e) {
    e.printStackTrace();
}
return result;
}
/**
 * Writes data to Hive.
 *
 * @param db  the database to use
 * @param sql the DDL/DML statement to execute
 */
public static void saveIntoHive(String db, String sql) {
// This method is only used for three kinds of statements:
// alter table drop partition
// create table
// insert into
try (Connection connection = getHiveConnection();
     Statement statement = connection.createStatement()) {
    statement.execute("use " + db);
    // enable dynamic partitioning for inserts into partitioned tables
    statement.execute("set hive.exec.dynamic.partition=true");
    statement.execute("set hive.exec.dynamic.partition.mode=nonstrict");
    statement.execute(sql);
} catch (SQLException e) {
    e.printStackTrace();
}
}
public static void main(String[] args) {
    // quick smoke test
    System.out.println(getMessageFromHive("i8ji", "show partitions i8ji.ods_t_db_to_hive_list_arc"));
}
}
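One caveat about the utility class above: the database name is concatenated directly into "use " + db. A minimal guard, a hypothetical helper of my own rather than part of the project, can reject unexpected identifiers before they reach a statement:

```java
// Hypothetical helper: the Hive database names used here are plain letters,
// digits and underscores, so reject anything else before it is concatenated
// into a statement such as "use " + db.
final class HiveIdentifiers {
    private static final java.util.regex.Pattern SAFE =
            java.util.regex.Pattern.compile("[A-Za-z0-9_]+");

    private HiveIdentifiers() {
    }

    public static String requireSafe(String identifier) {
        if (identifier == null || !SAFE.matcher(identifier).matches()) {
            throw new IllegalArgumentException("unsafe identifier: " + identifier);
        }
        return identifier;
    }
}
```

In DbUtils this would be used as `stmt.execute("use " + HiveIdentifiers.requireSafe(db));`.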
2.3 Controller
@RestController
@RequestMapping("/syncimport")
public class SyncImportController {
@GetMapping("/test")
public Result test(@RequestParam(required = false) String fieldType) {
List<String> messageFromHive = DbUtils.getMessageFromHive("i8ji", "show partitions i8ji.ods_t_db_to_hive_list_arc ");
return ResultGeneral.success(messageFromHive);
}
}