Transferring data between MySQL, Hive, and HBase with Sqoop

1. Importing MySQL data into Hive
a. Plain (non-partitioned) table
Create the Hive table:
	CREATE TABLE hive.mysql_to_hive
	(
	id INT,
	name STRING,
	age INT
	)
	ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
	STORED AS TEXTFILE;

	sqoop import \
	--connect 'jdbc:mysql://candle:3306/sqoop' \
	--username 'hive' \
	--password 'hive' \
	--as-textfile \
	--table 'human' \
	--mapreduce-job-name 'mysql to hdfs' \
	--fields-terminated-by ',' \
	--lines-terminated-by '\n' \
	-m 1 \
	--hive-import \
	--hive-database 'hive' \
	--hive-table 'mysql_to_hive'
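
After the import completes, a quick sanity check from the hive CLI (a sketch, assuming hive is on the PATH):

	hive -e 'SELECT * FROM hive.mysql_to_hive LIMIT 10;'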

If the import instead fails with the errors shown further below, the cause is that the Hive jars could not be found on the classpath.
Fix: add export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/* to ~/.bash_profile.
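
A minimal sketch of applying the fix (assumes a bash login shell and that HIVE_HOME is already set):

	echo 'export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*' >> ~/.bash_profile
	source ~/.bash_profile   # reload so the current shell picks up the new classpath

The errors this resolves look like: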

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/shims/ShimLoader
	at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:370)
	at org.apache.hadoop.hive.conf.HiveConf.<clinit>(HiveConf.java:108)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:264)


19/01/17 23:15:34 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
19/01/17 23:15:34 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
	at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
	at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
	at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)



b. Importing MySQL data into a partitioned Hive table
Create a partitioned table:
	CREATE TABLE hive.mysql_to_hive1
	(
	id INT,
	name STRING,
	age INT
	)
	PARTITIONED BY (sex STRING)
	ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
	STORED AS TEXTFILE;



	sqoop import \
	--connect 'jdbc:mysql://candle:3306/sqoop' \
	--username 'hive' \
	--password 'hive' \
	--as-textfile \
	--table 'human' \
	--mapreduce-job-name 'mysql to hdfs' \
	--fields-terminated-by ',' \
	--lines-terminated-by '\n' \
	-m 1 \
	--hive-import \
	--hive-database 'hive' \
	--hive-table 'mysql_to_hive1' \
	--hive-partition-key 'sex' \
	--hive-partition-value 'male'
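
Likewise, confirm the partition was created and populated (a sketch, assuming the hive CLI):

	hive -e 'SHOW PARTITIONS hive.mysql_to_hive1;'
	hive -e "SELECT * FROM hive.mysql_to_hive1 WHERE sex = 'male' LIMIT 10;"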

2. Migrating MySQL data to HBase

HBase arguments:
   --column-family <family>    Sets the target column family for the import
   --hbase-bulkload            Enables HBase bulk loading
   --hbase-create-table        If specified, create missing HBase tables
   --hbase-row-key <col>       Specifies which input column to use as the row key
   --hbase-table <table>       Import to <table> in HBase


When importing from MySQL into HBase, the column used as the row key must not contain NULL values:
sqoop import \
--connect 'jdbc:mysql://candle:3306/sqoop' \
--username 'hive' \
--password 'hive' \
--table 'human' \
--mapreduce-job-name 'mysql to hbase' \
-m 1 \
--hbase-create-table  \
--hbase-table 'hadoop:mysql_to_hbase1' \
--column-family 'f1' \
--hbase-row-key 'age'
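
To confirm the rows arrived, scan a few of them from the HBase shell (a sketch):

echo "scan 'hadoop:mysql_to_hbase1', {LIMIT => 5}" | hbase shell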

If the row-key column contains a NULL (id, in the run logged below), the import fails:
Error: java.io.IOException: Could not insert row with null value for row-key column: id
	at org.apache.sqoop.hbase.ToStringPutTransformer.getPutCommand(ToStringPutTransformer.java:152)
	at org.apache.sqoop.hbase.HBasePutProcessor.accept(HBasePutProcessor.java:132)
	at org.apache.sqoop.mapreduce.DelegatingOutputFormat$DelegatingRecordWriter.write(DelegatingOutputFormat.java:128)
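
One workaround is to filter out NULL row-key values at import time with Sqoop's --where option (a sketch based on the command above; filter on whichever column you pass to --hbase-row-key):

sqoop import \
--connect 'jdbc:mysql://candle:3306/sqoop' \
--username 'hive' \
--password 'hive' \
--table 'human' \
--where 'id IS NOT NULL' \
--mapreduce-job-name 'mysql to hbase' \
-m 1 \
--hbase-create-table \
--hbase-table 'hadoop:mysql_to_hbase1' \
--column-family 'f1' \
--hbase-row-key 'id'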

3. Exporting HBase data to MySQL (via Hive)
-- Create a Hive external table mapped to the HBase table
CREATE EXTERNAL TABLE hive.hbase_hive_human
(
rowkey STRING,
id INT,
name STRING
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  "hbase.columns.mapping" = ":key,info:id,info:name",
  "hbase.table.name" = "hadoop:hbase_hive_human");
-- Copy it into a managed (internal) table so the data lands in a plain text directory Sqoop can export
CREATE TABLE hivetomysql
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
AS SELECT * FROM hbase_hive_human;
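
Note that sqoop export does not create the target table; it must already exist in MySQL. A minimal sketch of matching DDL (the column types here are assumptions based on the Hive schema):

CREATE TABLE hbase_hive_human (
rowkey VARCHAR(255),
id INT,
name VARCHAR(255)
);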
-- Export the managed table to MySQL
sqoop export \
--connect 'jdbc:mysql://wangfutai:3306/sqoop' \
--username 'hive' \
--password 'hive' \
--table 'hbase_hive_human' \
--export-dir '/user/wangfutai/hive/warehouse/hive.db/hivetomysql' \
--num-mappers 1 \
--mapreduce-job-name 'hive to mysql' \
--fields-terminated-by ',' \
--lines-terminated-by '\n' 
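
Finally, verify the export on the MySQL side (a sketch, assuming the mysql client is available):

mysql -u hive -p -e 'SELECT * FROM sqoop.hbase_hive_human LIMIT 10;'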

 
