1. Importing into HDFS with sqoop
1.1 Running the sqoop job automatically updates the last value
# sqoop incremental import job
bin/sqoop job --create sqoop_hdfs_test02 -- import \
--connect jdbc:mysql://localhost:3306/pactera_test \
--username root \
--password 123456 \
--table student \
--target-dir /user/sqoop/test002/ \
--fields-terminated-by "\t" \
--check-column last_modified \
--incremental lastmodified \
--last-value "2018-12-12 00:03:00" \
--append
Note: the --append parameter is required here; without it, the second run of the job fails with an error, because lastmodified mode needs to be told how to handle the existing target directory.
At this point, the sqoop job is fully set up.
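Once the job exists, executing it re-reads the stored last value and advances it after a successful run. The sketch below first lists the relevant sqoop commands (as comments, since they need a live sqoop/MySQL setup), then simulates what lastmodified mode does with a local TSV file; the sample rows and local paths are invented for illustration:

```shell
# Run the stored job; sqoop updates the saved last value in its metastore:
#   bin/sqoop job --exec sqoop_hdfs_test02
# Inspect the stored job definition, including the saved last value:
#   bin/sqoop job --show sqoop_hdfs_test02

# Purely illustrative simulation of the lastmodified filter, using a local
# TSV file in place of the MySQL table and the HDFS target directory.
LAST_VALUE="2018-12-12 00:03:00"
TARGET="$(mktemp -d)/part-m-00000"

# id <TAB> name <TAB> last_modified  (hypothetical sample rows)
printf '1\talice\t2018-12-11 23:00:00\n2\tbob\t2018-12-12 10:00:00\n' > /tmp/student.tsv

# Only rows whose last_modified is later than the stored last value are
# pulled and appended; plain string comparison works here because the
# "YYYY-MM-DD HH:MM:SS" timestamps are zero-padded.
awk -F '\t' -v lv="$LAST_VALUE" '$3 > lv' /tmp/student.tsv >> "$TARGET"

cat "$TARGET"   # prints only bob's row; alice's timestamp is older than the last value
```

With --append, each run adds new part files under the target directory rather than overwriting previous output, which is why the filter only has to consider rows newer than the last value.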
2. Create a Hive table and read the data imported by sqoop
create external table if not exists student_hive (
  SId int,
  Sname string,
  Sage string,
  Ssex string,
  last_modified timestamp
)
row format delimited fields terminated by '\t'
location '/user/sqoop/test002/';

The field delimiter and location must match the sqoop job's --fields-terminated-by and --target-dir settings, so Hive can read the imported files in place.
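Before querying from Hive, it can be worth confirming that the files under the target directory actually match the declared schema. A small hypothetical check (the sample line and column values are invented here):

```shell
# On the cluster, eyeball a few imported lines:
#   hdfs dfs -cat /user/sqoop/test002/part-m-* | head
# The files must be tab-delimited with columns in the order declared in the
# DDL: SId, Sname, Sage, Ssex, last_modified.

# Locally, the same shape can be checked on a sample line:
line=$(printf '1\talice\t20\tF\t2018-12-12 10:00:00\n')
cols=$(printf '%s' "$line" | awk -F '\t' '{print NF}')
echo "$cols"   # 5 columns, matching the 5-column Hive schema
```

If the column count or order differs, Hive will still read the files but will silently map fields to the wrong columns, so this quick check catches delimiter mismatches early.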