One-click MySQL/Oracle to Hive DDL conversion: scripts that generate Hive CREATE TABLE statements from MySQL and Oracle table metadata

**MySQL-to-Hive conversion script:**

```sql
-- Raise the group_concat limit so column lists of wide tables are not truncated
SET SESSION group_concat_max_len = 102400;
SELECT
    a.TABLE_NAME,
    b.TABLE_COMMENT,
    concat('DROP TABLE IF EXISTS your_hive_db.your_prefix_ods_', a.TABLE_NAME,
           ';CREATE EXTERNAL TABLE IF NOT EXISTS your_hive_db.your_prefix_ods_', a.TABLE_NAME, ' (',
           group_concat(concat(a.COLUMN_NAME, ' ',
               CASE
                   WHEN a.DATA_TYPE IN ('varchar','longtext','char','datetime','timestamp','varbinary','bit',
                                        'mediumtext','set','longblob','text','blob','time','date') THEN
                       'string'
                   WHEN a.DATA_TYPE = 'decimal' THEN
                       a.COLUMN_TYPE   -- keep precision/scale, e.g. decimal(10,2)
                   WHEN a.DATA_TYPE = 'float' THEN
                       'double'
                   ELSE
                       a.DATA_TYPE
               END,  -- data type mapping
               " COMMENT '", a.COLUMN_COMMENT, "'")
               ORDER BY a.TABLE_NAME, a.ORDINAL_POSITION),
           ") COMMENT '", b.TABLE_COMMENT,
           "' row format delimited fields terminated by '\\t' lines terminated by '\\n' stored as textfile;"
    ) AS hive_ddl
FROM
    (
        SELECT
            TABLE_SCHEMA,
            TABLE_NAME,
            COLUMN_NAME,
            ORDINAL_POSITION,
            DATA_TYPE,
            COLUMN_COMMENT,
            COLUMN_TYPE
        FROM
            information_schema.COLUMNS
        WHERE
            TABLE_SCHEMA = 'your_mysql_db'
    ) AS a
LEFT JOIN
    information_schema.TABLES AS b
ON  a.TABLE_NAME = b.TABLE_NAME
AND a.TABLE_SCHEMA = b.TABLE_SCHEMA
WHERE b.TABLE_TYPE = 'BASE TABLE'
  AND a.TABLE_NAME NOT LIKE 'ods_%'   -- skip tables that already follow the ODS naming convention
GROUP BY
    a.TABLE_NAME,
    b.TABLE_COMMENT;
```
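The CASE expression above is the heart of the script: it maps each MySQL column type to its Hive counterpart. The same rules can be sketched in Python for clarity; `map_mysql_type` is a hypothetical helper for illustration, not part of the SQL script:

```python
# Sketch of the MySQL -> Hive type mapping implemented by the CASE expression.
# map_mysql_type is a hypothetical helper, not part of the original script.

# MySQL types that all collapse to Hive's string type
STRING_TYPES = {
    "varchar", "longtext", "char", "datetime", "timestamp", "varbinary",
    "bit", "mediumtext", "set", "longblob", "text", "blob", "time", "date",
}

def map_mysql_type(data_type: str, column_type: str) -> str:
    """Return the Hive type for a MySQL column.

    data_type   -- information_schema.COLUMNS.DATA_TYPE, e.g. 'decimal'
    column_type -- information_schema.COLUMNS.COLUMN_TYPE, e.g. 'decimal(10,2)'
    """
    if data_type in STRING_TYPES:
        return "string"
    if data_type == "decimal":
        return column_type          # keep precision/scale, e.g. decimal(10,2)
    if data_type == "float":
        return "double"             # Hive double avoids single-precision surprises
    return data_type                # int, bigint, double, ... pass through unchanged

print(map_mysql_type("datetime", "datetime"))      # string
print(map_mysql_type("decimal", "decimal(10,2)"))  # decimal(10,2)
print(map_mysql_type("float", "float"))            # double
```

Note that date/time and binary types are deliberately mapped to `string`: ODS-layer text files load most safely that way, and downstream layers can cast as needed.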

**Oracle-to-Hive conversion script:**

```sql
-- Note: wm_concat is an undocumented function that was removed in Oracle 12c;
-- on newer versions use LISTAGG(m.col_str, ',') WITHIN GROUP (ORDER BY ...) instead.
select to_char(substr(table_prefix || col_strs || table_suffix, 1, 4000)) con
  from (select n.table_prefix,
               wm_concat(m.col_str) as col_strs,
               n.table_suffix
          from (select a.table_name,
                       'drop table if exists your_hive_db.your_prefix_ods_' || lower(a.table_name) || ';
                        create table if not exists your_hive_db.your_prefix_ods_' || lower(a.table_name) || '(' as table_prefix,
                       ')
                        PARTITIONED BY (ds STRING comment ''yyyymmdd'')
                        row format delimited fields terminated by ''\t'' lines terminated by ''\n'' stored as textfile;' as table_suffix
                  from user_tables a, user_tab_comments b
                 where a.table_name = b.table_name
                 order by a.table_name) n,
               (select c.table_name,
                       c.column_name || CASE
                         WHEN c.DATA_TYPE IN ('CHAR',
                                              'NCHAR',
                                              'VARCHAR',
                                              'VARCHAR2',
                                              'NVARCHAR2',
                                              'DATE',
                                              'TIMESTAMP',
                                              'TIMESTAMP WITH TIME ZONE',
                                              'TIMESTAMP WITH LOCAL TIME ZONE',
                                              'INTERVAL YEAR TO MONTH',
                                              'INTERVAL DAY TO SECOND',
                                              'BLOB',
                                              'CLOB',
                                              'NCLOB',
                                              'BFILE',
                                              'RAW',
                                              'LONG RAW') THEN
                          ' STRING '
                         WHEN c.DATA_TYPE = 'INTEGER' THEN
                          ' BIGINT '
                         WHEN c.DATA_TYPE = 'NUMBER' THEN
                          (CASE
                            WHEN c.DATA_SCALE IS NOT NULL AND c.DATA_SCALE <> 0 THEN
                             ' DECIMAL(' || c.DATA_PRECISION || ',' || c.DATA_SCALE || ') '
                            WHEN c.DATA_PRECISION < 3 THEN
                             ' TINYINT '
                            WHEN c.DATA_PRECISION < 5 THEN
                             ' SMALLINT '
                            WHEN c.DATA_PRECISION < 10 THEN
                             ' INT '
                            ELSE
                             ' BIGINT '
                          END)
                         WHEN c.DATA_TYPE IN ('BINARY_FLOAT', 'BINARY_DOUBLE', 'FLOAT') THEN
                          ' DOUBLE '
                         ELSE
                          ' STRING '
                       END || 'comment ''' ||
                       -- strip newlines, carriage returns, tabs, and spaces from column
                       -- comments so each generated DDL stays on a single line
                       REGEXP_REPLACE(t.comments,
                                      '[' || CHR(10) || CHR(13) || CHR(9) || CHR(32) || ']',
                                      '') || '''' as col_str
                  from user_tab_cols c, user_col_comments t
                 where c.table_name = t.table_name
                   and c.column_name = t.column_name) m
         where n.table_name = m.table_name
         group by n.table_prefix, n.table_suffix);
```
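The trickiest part of the Oracle script is the nested CASE for `NUMBER` columns, which chooses a Hive integer width from `DATA_PRECISION` and `DATA_SCALE`. A minimal Python sketch of the same decision logic (`map_oracle_number` is a hypothetical helper, not part of the SQL):

```python
# Sketch of the Oracle NUMBER -> Hive mapping from the nested CASE above.
# map_oracle_number is a hypothetical helper, not part of the original script.

def map_oracle_number(precision, scale):
    """Map an Oracle NUMBER(precision, scale) column to a Hive type.

    precision/scale come from user_tab_cols.DATA_PRECISION / DATA_SCALE
    and may both be None for an unconstrained NUMBER.
    """
    if scale is not None and scale != 0:
        return f"DECIMAL({precision},{scale})"  # fractional digits present
    if precision is not None and precision < 3:
        return "TINYINT"     # up to 2 digits fits in a signed byte
    if precision is not None and precision < 5:
        return "SMALLINT"
    if precision is not None and precision < 10:
        return "INT"
    return "BIGINT"          # wide or unconstrained NUMBER falls through, as in SQL

print(map_oracle_number(10, 2))       # DECIMAL(10,2)
print(map_oracle_number(2, 0))        # TINYINT
print(map_oracle_number(None, None))  # BIGINT
```

As in the SQL (where `NULL < 3` is not true), an unconstrained `NUMBER` with no precision falls through to `BIGINT`; if such columns can hold fractional values, you may prefer mapping them to `DECIMAL` or `STRING` instead.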
