Hive Integration with Relational Databases (MySQL, PostgreSQL)


        The goal of this article is to integrate Hive with PostgreSQL on an HDP 2.6.3 cluster, so that fairly complex queries against PostgreSQL can be issued directly from Hive: conditional queries, ordered queries, grouped aggregations, fuzzy (LIKE) matching, joins, and so on. To prepare for the final setup, some other useful integration tests were done along the way, such as integrating MySQL via hive-jdbc-handler-2.3.6.jar.

        Hive integrates with external data stores through its storage handler framework, with a different handler per store: HBase, Cassandra, Azure Table, JDBC (MySQL, H2, Derby, Oracle, PostgreSQL, MS SQL, Metastore, JethroData), MongoDB, ElasticSearch, Phoenix HBase, VoltDB, and Google Spreadsheets. The JDBC handler shipped with Hive 2.3.6 supports MySQL, H2, and Derby, but unfortunately not yet PostgreSQL; the later Hive 3.1.2 handler supports many more databases: MySQL, H2, Derby, Oracle, PostgreSQL, MS SQL, Metastore, and JethroData.
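All of these handlers share the same DDL shape, per the StorageHandlers wiki page listed in the references; bracketed parts are optional and the handler class and property keys vary per backing store:

```sql
-- Generic storage-handler table DDL (sketch, not tied to one backend):
CREATE [EXTERNAL] TABLE table_name (col_name data_type, ...)
STORED BY 'fully.qualified.HandlerClassName'
[WITH SERDEPROPERTIES ("key" = "value", ...)]
TBLPROPERTIES ("key" = "value", ...);
```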

The sections below examine how three different JDBC handler versions support MySQL and PostgreSQL:

1、hive-jdbc-handler-2.3.6.jar

        hive-jdbc-handler-2.3.6.jar is the jar that ships with Hive 2.3.6. It supports MySQL, H2, and Derby; it does not support PostgreSQL.

2、qubole-hive-JDBC-0.0.7.jar

        A third-party (non-Apache) jar; its relational database integration is immature.

3、hive-jdbc-handler-3.1.2.jar

        hive-jdbc-handler-3.1.2.jar is the jar that ships with Hive 3.1.2. It supports MySQL, H2, Derby, Oracle, PostgreSQL, MS SQL, Metastore, and JethroData, covering both MySQL and PostgreSQL.

 

Environment and Versions

System      IP Address        Software
Linux       192.168.128.21    HDP 2.6.3
Windows     192.168.128.1     MySQL 5.7, PostgreSQL 9.6.15

Integrating MySQL via hive-jdbc-handler-2.3.6.jar

1、Add the hive-jdbc-handler-2.3.6.jar dependency jar

        Copy hive-jdbc-handler-2.3.6.jar, the jar Hive needs for MySQL integration, into /usr/hdp/2.6.3.0-235/hive/auxlib; if the auxlib directory does not exist, create it first.

2、Create the table in MySQL that will be mapped into Hive

The jdbchandler table is created as follows:

DROP TABLE IF EXISTS `jdbchandler`;
CREATE TABLE `jdbchandler`  (
  `id` int(11) NOT NULL,
  `name` varchar(10) CHARACTER SET utf8 COLLATE utf8_general_ci NULL DEFAULT NULL,
  `age` varchar(10) CHARACTER SET utf8 COLLATE utf8_general_ci NULL DEFAULT NULL,
  `gpa` varchar(10) CHARACTER SET utf8 COLLATE utf8_general_ci NULL DEFAULT NULL,
  PRIMARY KEY (`id`) USING BTREE
) ENGINE = InnoDB CHARACTER SET = utf8 COLLATE = utf8_general_ci ROW_FORMAT = Dynamic;

3、Create the mapped table in Hive

CREATE EXTERNAL TABLE student_jdbc
(
  name string,
  age int,
  gpa double
)
STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
TBLPROPERTIES (
    "hive.sql.database.type" = "MYSQL",
    "hive.sql.jdbc.driver" = "com.mysql.jdbc.Driver",
    "hive.sql.jdbc.url" = "jdbc:mysql://192.168.128.1/test",
    "hive.sql.dbcp.username" = "root",
    "hive.sql.dbcp.password" = "root",
    "hive.sql.table" = "jdbchandler",
    "hive.sql.query"="select name,age,gpa from jdbchandler",
    "hive.sql.dbcp.maxActive" = "10"
);

Notes on some of the table properties:

hive.sql.database.type: the database type;

hive.sql.jdbc.driver: the JDBC driver class;

hive.sql.jdbc.url: the JDBC connection URL;

hive.sql.dbcp.username: the database user name;

hive.sql.dbcp.password: the database password;

hive.sql.table: the table in the relational database being mapped;

hive.sql.query: the SQL query that supplies the mapped columns and data;

hive.sql.dbcp.maxActive: the maximum number of database connections;

 

Running this create statement may fail with the following errors:

(1) The required dependency jar was not added:

FAILED: SemanticException Cannot find class 'org.apache.hive.storage.jdbc.JdbcStorageHandler'

(2) No column/data mapping such as "hive.sql.query"="select name,age,gpa from jdbchandler" was provided:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException java.lang.IllegalArgumentException: Property hive.sql.query is required.)

(3) No permission to access MySQL:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hive.storage.jdbc.exception.HiveJdbcDatabaseAccessException: Error while trying to get column names: Cannot create PoolableConnectionFactory (null,  message from server: "Host 'hdp21' is not allowed to connect to this MySQL server")

Fix: connect to MySQL's own mysql database and run the following to grant access:

select * from user where user='root';

update user set host = '%' where user ='root'; 

flush privileges; 
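An alternative to editing mysql.user directly is to grant access for the connecting host explicitly; a sketch for this article's setup (host hdp21, database test, and password root are taken from the examples above, so adjust to your environment):

```sql
-- MySQL 5.7: allow root to connect from the Hive host.
GRANT ALL PRIVILEGES ON test.* TO 'root'@'hdp21' IDENTIFIED BY 'root';
FLUSH PRIVILEGES;
```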

 

4、DML support

(1) Insert a record in MySQL, then check in Hive

insert into  jdbchandler VALUES(1,'Jack','20','98.2');

 

hive> select * from student_jdbc;
OK
Jack    20    98.2
Time taken: 0.838 seconds, Fetched: 1 row(s)
hive> 

 

(2) Update the record in MySQL, then check in Hive

UPDATE jdbchandler set age='34' WHERE id=1;
SELECT * FROM jdbchandler WHERE id=1;

The result in Hive:

hive> select * from student_jdbc;
OK
Jack    34    98.2
Time taken: 0.869 seconds, Fetched: 1 row(s)
hive> 



 

(3) Delete the record in MySQL, then check in Hive

DELETE FROM jdbchandler WHERE id=1;
SELECT * FROM jdbchandler WHERE id=1;

The row disappears from Hive query results as well.

 

(4) Insert data from Hive, then check MySQL

insert into table student_jdbc values('Mike','23','89');

This may fail with the following errors:

a、Missing commons-dbcp-1.4.jar

Caused by: java.lang.NoClassDefFoundError: org/apache/commons/dbcp/BasicDataSourceFactory

b、Missing commons-pool-1.6.jar

Caused by: java.lang.NoClassDefFoundError: org/apache/commons/pool/KeyedObjectPoolFactory
    at org.apache.commons.dbcp.BasicDataSourceFactory.createDataSource(BasicDataSourceFactory.java:167)
    at org.apache.hive.storage.jdbc.dao.GenericJdbcDatabaseAccessor.initializeDatabaseConnection(GenericJdbcDatabaseAccessor.java:212)
    at org.apache.hive.storage.jdbc.dao.GenericJdbcDatabaseAccessor.getColumnNames(GenericJdbcDatabaseAccessor.java:61)
    at org.apache.hive.storage.jdbc.JdbcSerDe.initialize(JdbcSerDe.java:69)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.initializeOp(FileSinkOperator.java:361)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:363)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:482)
    at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:439)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:482)
    at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:439)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
    at org.apache.hadoop.hive.ql.exec.MapOperator.initializeMapOperator(MapOperator.java:489)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.init(MapRecordProcessor.java:231)
c、Missing mysql-connector-java-5.1.47.jar

Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory(BasicDataSource.java:1420)
d、Write operations not supported

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.UnsupportedOperationException: Write operations are not allowed.
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:231)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:627)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:571)
    ... 25 more
Caused by: java.lang.UnsupportedOperationException: Write operations are not allowed.
    at org.apache.hive.storage.jdbc.JdbcOutputFormat.getHiveRecordWriter(JdbcOutputFormat.java:44)
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:243)
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:228)

hive-jdbc-handler-2.3.6.jar does not support writing data from the Hive mapped table back to MySQL.

 

(5) Drop the table in MySQL, then check in Hive

hive> show tables;
OK
data_type_test
student_jdbc
test
test_1
test_2
values__tmp__table__1
Time taken: 0.819 seconds, Fetched: 6 row(s)
hive> 
 

hive> desc student_jdbc;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hive.serde2.SerDeException org.apache.hive.storage.jdbc.exception.HiveJdbcDatabaseAccessException: Error while trying to get column names: Table 'test.jdbchandler' doesn't exist
hive> 

The mapped table still exists in Hive but is unusable; it is effectively just a pointer. If the table is recreated in MySQL, it becomes usable again from Hive.

 

(6) Truncate the table data from Hive

hive> truncate table student_jdbc;
FAILED: SemanticException [Error 10146]: Cannot truncate non-managed table student_jdbc.
hive> 

Non-managed (external) tables cannot be truncated from Hive.
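Since the mapping is external and Hive reads MySQL live, the equivalent of a truncate has to be issued on the MySQL side; a minimal sketch:

```sql
-- Run in MySQL, not Hive. Afterwards the Hive mapped table
-- student_jdbc simply returns zero rows.
TRUNCATE TABLE jdbchandler;
```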

 

(7) Drop the table in Hive, then check MySQL

hive> drop table student_jdbc;
OK
Time taken: 0.655 seconds
hive> show tables;
OK
data_type_test
test
test_1
test_2
values__tmp__table__1
Time taken: 0.261 seconds, Fetched: 5 row(s)
hive> 
The table is dropped normally in Hive, with no effect on the table in MySQL.

 

(8) Support for more complex SQL queries

The following statements were tested:

* select count(*) from student_jdbc;
* select id from student_jdbc where id > 2;
* select name from student_jdbc;
* select * from student_jdbc where name like 'D%';
* SELECT * FROM student_jdbc ORDER BY name DESC;

Most of these work; count(1) and count(*) are not supported, but count(column_name) is.
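For example, against the student_jdbc mapping from earlier, counting a concrete column is the workaround with this handler version:

```sql
-- Works with hive-jdbc-handler-2.3.6:
select count(name) from student_jdbc;

-- These fail with this handler version:
-- select count(*) from student_jdbc;
-- select count(1) from student_jdbc;
```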

 

Integrating MySQL via qubole-hive-JDBC-0.0.7.jar

        First replace hive-jdbc-handler-2.3.6.jar with qubole-hive-JDBC-0.0.7.jar, then run this create statement:

CREATE EXTERNAL TABLE student_jdbc
(
  name string,
  age int,
  gpa double
)
STORED BY 'org.apache.hadoop.hive.jdbc.storagehandler.JdbcStorageHandler'
TBLPROPERTIES (
    "hive.sql.database.type" = "MYSQL",
    "hive.sql.jdbc.driver" = "com.mysql.jdbc.Driver",
    "hive.sql.jdbc.url" = "jdbc:mysql://192.168.128.1/test",
    "hive.sql.dbcp.username" = "root",
    "hive.sql.dbcp.password" = "root",
    "hive.sql.table" = "jdbchandler",
    "hive.sql.query"="select name,age,gpa from jdbchandler",
    "hive.sql.dbcp.maxActive" = "10"
);

1、Query from Hive

The query fails as follows:

hive> select * from student_jdbc_pg;
OK
Failed with exception java.io.IOException:java.lang.RuntimeException: java.lang.RuntimeException: java.lang.NullPointerException
Time taken: 0.052 seconds

 

2、Insert from Hive, then check MySQL

insert into table student_jdbc values('Mike','23','89');

It fails with:

Caused by: java.io.IOException
    at org.apache.hadoop.mapreduce.lib.db.DBOutputFormat.getRecordWriter(DBOutputFormat.java:196)
    at org.apache.hadoop.hive.jdbc.storagehandler.JdbcOutputFormat.getHiveRecordWriter(JdbcOutputFormat.java:62)
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:243)
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:228)
    ... 27 more


qubole-hive-JDBC-0.0.7.jar does not support writing data from Hive to MySQL.

 

Note: when using qubole-hive-JDBC-0.0.7.jar, the Hive mapped-table create statement must use

STORED BY 'org.apache.hadoop.hive.jdbc.storagehandler.JdbcStorageHandler'
 

These tests show that MySQL integration via qubole-hive-JDBC-0.0.7.jar is not mature.

 

Integrating PostgreSQL via hive-jdbc-handler-2.3.6.jar

Next, use hive-jdbc-handler-2.3.6.jar to integrate PostgreSQL.

1、Create the table in PostgreSQL

DROP TABLE IF EXISTS test.jdbchandler;
CREATE TABLE test.jdbchandler (
    id int4 NOT NULL,
    "name" varchar(10) NOT NULL,
    CONSTRAINT pkey PRIMARY KEY (id)
);

Insert one record into it:

insert into jdbchandler values(1,'Mick');

2、Create the PostgreSQL mapped table in Hive

CREATE EXTERNAL TABLE student_jdbc_pg
(
  name string
)
STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
TBLPROPERTIES (
    "hive.sql.database.type" = "POSTGRES",
    "hive.sql.jdbc.driver" = "org.postgresql.Driver",
    "hive.sql.jdbc.url" = "jdbc:postgresql://192.168.128.1/crmbat?currentSchema=test",
    "hive.sql.dbcp.username" = "postgres",
    "hive.sql.dbcp.password" = "root",
    "hive.sql.table" = "jdbchandler",
    "hive.sql.query"="select name from jdbchandler"
);

This fails with:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException java.lang.IllegalArgumentException: No enum constant org.apache.hive.storage.jdbc.conf.DatabaseType.POSTGRES)

Cause: hive-jdbc-handler-2.3.6.jar only supports MySQL, H2, and Derby; PostgreSQL is not supported.

 

Integrating PostgreSQL via hive-jdbc-handler-3.1.2.jar

Now switch to hive-jdbc-handler-3.1.2.jar and test again. It fails with:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hive.storage.jdbc.exception.HiveJdbcDatabaseAccessException: Error while trying to get column names: Cannot load JDBC driver class 'org.postgresql.Driver')

Cause: the PostgreSQL driver jar is missing. After adding postgresql-9.1-901-1.jdbc4.jar, a new error appears:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hive.storage.jdbc.exception.HiveJdbcDatabaseAccessException: Error while trying to get column names: Cannot create PoolableConnectionFactory (Connection refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.))

Cause: postgresql-9.1-901-1.jdbc4.jar is not the right driver. After replacing it with postgresql-42.2.8.jar:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hive.storage.jdbc.exception.HiveJdbcDatabaseAccessException: Error while trying to get column names: Cannot create PoolableConnectionFactory (The connection attempt failed.))

Fix:

(1) Open the data subdirectory of the PostgreSQL installation.
(2) Edit pg_hba.conf: add a new line in the IPv4 section: host all all 0.0.0.0/0 trust
(3) Open Control Panel --> System and Security --> Windows Firewall.
(4) Click "Advanced settings" on the left.
(5) Select "Inbound Rules" in the left tree, then click "New Rule" in the Actions pane.
(6) In the wizard, choose "Port" and click "Next".
(7) Enter "5432" as the specific local port, click "Next" through to the last page, and give the rule a name such as "postgresql rule".

In detail, the two files under PostgreSQL's data directory are configured as follows.

pg_hba.conf, to allow connections from any IP:

host    all                all                0.0.0.0/0                trust

postgresql.conf, to make the server listen on all addresses:

listen_addresses = '*'

Re-run the create statement:

CREATE EXTERNAL TABLE student_jdbc_pg
(
  name string
)
STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
TBLPROPERTIES (
    "hive.sql.database.type" = "POSTGRES",
    "hive.sql.jdbc.driver" = "org.postgresql.Driver",
    "hive.sql.jdbc.url" = "jdbc:postgresql://192.168.128.1/crmbat?currentSchema=test",
    "hive.sql.dbcp.username" = "postgres",
    "hive.sql.dbcp.password" = "root",
    "hive.sql.table" = "jdbchandler",
    "hive.sql.query"="select name from jdbchandler"
);

This time the table is created successfully.

 

1、DML support with the Hive-PostgreSQL integration

(1) Insert a record in PostgreSQL, then check in Hive

insert into jdbchandler values(2,'Tome');
select * from jdbchandler;

The new record is visible from Hive as well.

(2) Update a record in PostgreSQL, then check in Hive

update jdbchandler set name='Jack' where id=1;
select * from jdbchandler;

The change is visible from Hive as well.

(3) Delete a record in PostgreSQL, then check in Hive

delete from jdbchandler where id=1;
select * from jdbchandler;

The deletion is visible from Hive as well.

(4) Drop the table in PostgreSQL, then check in Hive

FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hive.storage.jdbc.exception.HiveJdbcDatabaseAccessException: Error while trying to get column names: ERROR: relation "jdbchandler" does not exist  Position: 18)

After the table is dropped in PostgreSQL, the mapped table still exists in Hive but can no longer be queried, failing with the error above. Recreating the table in PostgreSQL restores querying from Hive.

(5) Drop the table in Hive, then check PostgreSQL

The table can be dropped normally in Hive; the table in PostgreSQL is unaffected.

(6) Insert a record from Hive, then check PostgreSQL

insert into table student_jdbc_pg values("Lucy");

This fails with:

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.metadata.JarUtils
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 24 more
FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.tez.TezTask. org/apache/hadoop/hive/ql/metadata/JarUtils
 

2、SQL query support

hive> select * from student_jdbc_pg where name like 'L%';
OK
org.xml.sax.SAXParseException; lineNumber: 1; columnNumber: 1; Content is not allowed in prolog.
Continuing ...
Lucy
Time taken: 0.728 seconds, Fetched: 1 row(s)
hive> 
Conclusion: at this point none of the test queries are supported; they fail with errors like:

 

Status: Failed
Vertex failed, vertexName=Map 2, vertexId=vertex_1570067772780_0021_1_00, diagnostics=[Vertex vertex_1570067772780_0021_1_00 [Map 2] killed/failed due to:INIT_FAILURE, Fail to create InputInitializerManager, org.apache.tez.dag.api.TezReflectionException: Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.

 

The hive-jdbc-handler-3.1.2.jar dependency alone is not enough. After adding some more dependency jars, test Hive's SQL query support against PostgreSQL again with:

select count(*) from jdbchandler;
select count(1) from jdbchandler;
select count(id) from jdbchandler;
select id from jdbchandler where id > 2;
select name from jdbchandler;
select * from jdbchandler where name like 'D%';
SELECT * FROM jdbchandler ORDER BY name DESC;
select a.id,b.name from student as a inner join joinstudent as b on a.id=b.id;

Create the jdbchandler table in PostgreSQL and insert some test data:

CREATE TABLE test.jdbchandler (
    id int4 NOT NULL,
    "name" varchar(10) NOT NULL,
    age int4 not null,
    CONSTRAINT pkey PRIMARY KEY (id)
);

select * from jdbchandler;

1    Tom    23
2    Jack    30
3    Lucy    2
4    Mick    67
5    Jone    210
6    Cat    40
7    Catty    45

Then create the joinhandler table and insert some test data:

CREATE TABLE test.joinhandler (
    id int4 NOT NULL,
    "name" varchar(10) NOT NULL,
    age int4 not null,
    CONSTRAINT p_key PRIMARY KEY (id)
);

select * from joinhandler;

1    Tom    23
2    Jack    30
3    Lucy    2
8    Cao    45

 

Create a table in Hive as follows:

CREATE TABLE student_orc
(
id int,
name string,
age int
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
STORED AS orc;

Insert data into it:
insert overwrite table student_orc select id,name,age from jdbchandler;
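With the data staged into the native student_orc table, the join from the test list can also be run against it; a sketch, assuming a second Hive mapping named joinhandler_pg was created over test.joinhandler in the same way as student_jdbc_pg (that mapping's DDL is not shown in this article):

```sql
-- Join the staged native table with the mapped PostgreSQL table.
select a.id, b.name
from student_orc a
inner join joinhandler_pg b on a.id = b.id;
```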

 

 

Password protection

Instead of a clear-text password in TBLPROPERTIES, the password can be stored in a Hadoop credential keystore:

hadoop credential create host1.password -provider jceks://hdfs/user/foo/test.jceks -v passwd1

hadoop credential create host2.password -provider jceks://hdfs/user/foo/test.jceks -v passwd2

 

CREATE EXTERNAL TABLE student_jdbc
(
  name string,
  age int,
  gpa double
)
STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
TBLPROPERTIES (
    . . . . . .
    "hive.sql.dbcp.password.keystore" = "jceks://hdfs/user/foo/test.jceks",
    "hive.sql.dbcp.password.key" = "host1.password",
    . . . . . .
);

Data type support

The column data type for a Hive JdbcStorageHandler table can be:

  • Numeric data type: byte, short, int, long, float, double

  • Decimal with scale and precision

  • String date type: string, char, varchar

  • Date

  • Timestamp

Note: complex data types (struct, map, array) are not supported.
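As a sketch of the supported types, a mapped table can declare columns like the following on the Hive side (the table, column, and source-query names here are hypothetical; the connection properties are the same as in the earlier examples):

```sql
CREATE EXTERNAL TABLE typed_jdbc
(
  id bigint,             -- numeric
  price decimal(10,2),   -- decimal with precision and scale
  label varchar(20),     -- string type
  born date,             -- date
  updated timestamp      -- timestamp
)
STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
TBLPROPERTIES (
    "hive.sql.database.type" = "POSTGRES",
    "hive.sql.query" = "select id, price, label, born, updated from typed_src"
    -- plus the driver, url, and credential properties shown earlier
);
```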

Partition support

Omitted here.

 

References:

https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration#HBaseIntegration-StorageHandlers

https://cwiki.apache.org/confluence/display/Hive/StorageHandlers#StorageHandlers-DDL

https://cwiki.apache.org/confluence/display/Hive/JdbcStorageHandler

https://github.com/qubole/Hive-JDBC-Storage-Handler
