There are two ways to import MySQL data into Hive:
- Import into HDFS first, then LOAD DATA into the Hive table
- Import directly into Hive

Direct import into Hive:
sqoop import \
--connect jdbc:mysql://localhost:3306/chen \
--username root \
--password 123 \
--table test \
--delete-target-dir \
--hive-import \
--hive-table test
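The first approach listed above (import into HDFS, then LOAD DATA) can be sketched as follows. The staging directory /user/hive/staging/test is an illustrative assumption, and the commands assume a running Hadoop/Hive cluster:

```shell
# Step 1: import the MySQL table into a plain HDFS directory
# (the --target-dir path is an assumption; pick any staging location)
sqoop import \
--connect jdbc:mysql://localhost:3306/chen \
--username root \
--password 123 \
--table test \
--delete-target-dir \
--target-dir /user/hive/staging/test \
--fields-terminated-by '\t'

# Step 2: move the imported files into an existing Hive table
# (LOAD DATA INPATH moves the files, so the staging dir is consumed)
hive -e "LOAD DATA INPATH '/user/hive/staging/test' OVERWRITE INTO TABLE test;"
```

This variant is useful when you want to inspect or transform the raw files before they become visible in Hive; the direct --hive-import route below does both steps in one command.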
Importing into a partitioned table:

First create the table structure:
CREATE TABLE d5_test_p(
user string,
host string
)
PARTITIONED BY (event_month string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';
sqoop import \
--connect jdbc:mysql://localhost:3306/chen \
--username root \
--password 123 \
--table test \
--delete-target-dir \
--hive-import \
--hive-table d5_test_p \
--fields-terminated-by '\t' \
--hive-overwrite \
--hive-partition-key 'event_month' \
--hive-partition-value '2018-11'
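Once the import finishes, a quick sanity check (not part of the original notes, and again assuming a running Hive installation) confirms the partition was created and populated:

```shell
# List the partitions of the target table; the import above should
# have added event_month=2018-11
hive -e "SHOW PARTITIONS d5_test_p;"

# Sample a few rows from that partition to verify the data landed
hive -e "SELECT * FROM d5_test_p WHERE event_month='2018-11' LIMIT 10;"
```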