mysql➡hbase
1. Data used in this example
Table columns: sid, student_id, course_id, score; each tuple below is (student_id, course_id, score)
(1,1,60),
(1,2,59),
(1,3,58),
(1,4,22),
(1,5,59),
(1,6,60),
(1,7,99),
(1,8,100),
(1,9,88),
(2,1,99),
(2,2,99),
(2,3,89),
(2,4,60),
(2,5,59),
(2,6,33),
…
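For reference, the source table behind these tuples could be created and seeded in MySQL roughly as follows. The column types and the AUTO_INCREMENT sid are assumptions, not stated in the original:

```sql
-- Hypothetical DDL for the source table; types are assumed
CREATE TABLE score (
  sid        INT AUTO_INCREMENT PRIMARY KEY,  -- used later as the HBase row key
  student_id INT,
  course_id  INT,
  score      INT
);

-- A few of the sample tuples from above
INSERT INTO score (student_id, course_id, score) VALUES
  (1,1,60),(1,2,59),(1,3,58),(1,4,22),(1,5,59);
```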
2. Database and table in MySQL
Database name: kb06mysqltestdb; create the HBase table mysql_stu1 with the column families info and score:
sqoop import \
--connect jdbc:mysql://single:3306/kb06mysqltestdb \
--username root \
--password kb10 \
--table score \
--hbase-table kb10:mysql_stu1 \
--column-family score \
--hbase-row-key sid \
--hbase-bulkload
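When using --hbase-bulkload, it is common to pre-create the target namespace and table in the HBase shell before running the import (names taken from the command above; whether your Sqoop version creates them automatically can vary):

```
# Run inside `hbase shell`
create_namespace 'kb10'
create 'kb10:mysql_stu1', 'info', 'score'

# After the import, spot-check a few rows
scan 'kb10:mysql_stu1', {LIMIT => 5}
```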
hive➡hdfs
Run the following in Hive to filter and clean the data and write the result out to HDFS:
insert overwrite directory '/kb10/shop1118/'
row format delimited
fields terminated by ','
stored as textfile
select shopid,shopname,contact.mobile mobile,
concat_ws('',address) address,volumn['2020'] vol2020
from shop;
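The query above implies that shop uses Hive complex types: contact.mobile suggests a struct, volumn['2020'] a map, and concat_ws('', address) an array of strings. A hypothetical schema consistent with the query (field types are assumptions) might look like:

```sql
-- Hypothetical schema for shop, inferred from the query above
CREATE TABLE shop (
  shopid   INT,
  shopname STRING,
  contact  STRUCT<mobile:STRING>,  -- accessed as contact.mobile
  address  ARRAY<STRING>,          -- flattened with concat_ws('', address)
  volumn   MAP<STRING, INT>        -- looked up as volumn['2020']
);
```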
Run the following command to inspect the data written to the target HDFS directory /kb10/shop1118/:
hdfs dfs -cat /kb10/shop1118/000000_0
hdfs➡mysql
Create a new hive_shop table in MySQL:
create table hive_shop(
id int,
name varchar(50),
mobile