Hive: load data from HDFS into a Hive table:
CREATE TABLE a (id STRING, num INT) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';
LOAD DATA INPATH '/user/xxx/a.txt' INTO TABLE a;
Pay close attention to the field delimiter — it must match the file's actual format!
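A delimiter mismatch does not raise an error: Hive simply fails to split each line and fills the remaining columns with NULL. A minimal plain-Python sketch (a simulation of the text SerDe's splitting behavior, not Hive itself; the function name is illustrative) shows why:

```python
# Simulate Hive's simple text parsing: split each raw line on the table's
# declared field delimiter. parse_row is a hypothetical helper, not a Hive API.
def parse_row(line, delimiter="\t", num_columns=2):
    fields = line.rstrip("\n").split(delimiter)
    # Hive pads missing trailing columns with NULL instead of erroring out.
    fields += [None] * (num_columns - len(fields))
    return fields[:num_columns]

# File matches the declared '\t' delimiter: both columns are populated.
print(parse_row("user1\t42"))   # ['user1', '42']
# File is actually comma-separated: everything lands in the first column,
# and the second column silently becomes NULL.
print(parse_row("user1,42"))    # ['user1,42', None]
```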
Sqoop: export a Hive table to MySQL:
sqoop export \
  --connect "jdbc:mysql://IP:3306/database?useUnicode=true&characterEncoding=utf-8" \
  --username root --password password \
  --table extract_50user_data \
  --export-dir /user/hive/warehouse/name.db/extract_50users_join \
  --fields-terminated-by '\001' \
  --input-null-string '\\N' --input-null-non-string '\\N'
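Two details in that command come straight from how Hive stores text tables: the default field delimiter is the non-printing Ctrl-A character (octal '\001', i.e. \x01), and NULL is written as the literal two-character string '\N' — which is why the export passes --input-null-string '\\N'. A small Python sketch (plain Python, not Sqoop's actual code; to_mysql_row is a hypothetical helper) of what the export has to undo per line:

```python
# Hive's default text-table conventions (these two values are Hive facts):
HIVE_DELIMITER = "\x01"   # Ctrl-A, the '\001' in the sqoop flag
HIVE_NULL = r"\N"         # literal backslash + N, Hive's NULL marker

def to_mysql_row(line):
    """Split one line of a Hive warehouse file and map '\\N' back to None."""
    return [None if field == HIVE_NULL else field
            for field in line.rstrip("\n").split(HIVE_DELIMITER)]

print(to_mysql_row("alice\x0142"))   # ['alice', '42']
print(to_mysql_row("bob\x01\\N"))    # ['bob', None]
```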
Hive: rename a column:
ALTER TABLE table_name CHANGE [COLUMN] col_old_name col_new_name column_type [COMMENT col_comment] [FIRST|AFTER column_name];
Example:
ALTER TABLE user_action CHANGE frep freq INT COMMENT 'frequency' AFTER name;