Pitfall 1:
[xiaokang@hadoop03 ~]$ vim hivedata/5.txt
1,Allen
2,Bob
3,Tom
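Since the table below is declared with `fields terminated by ','`, each data row must contain exactly two comma-separated fields. A quick sanity check on the file format (a sketch; the `/tmp/5.txt` path is an assumption used here so the check is self-contained):

```shell
# Recreate the sample data file (path /tmp/5.txt is an assumption).
printf '1,Allen\n2,Bob\n3,Tom\n' > /tmp/5.txt
# Count rows that do NOT have exactly two comma-separated fields;
# awk prints 0 when every row matches the (id, name) schema.
awk -F',' 'NF != 2 {bad++} END {print bad+0}' /tmp/5.txt
```

If the count is non-zero, the delimiter in the file does not match the table definition and the load would produce NULL columns.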
0: jdbc:hive2://hadoop01:11240> create table t_user(id int,name string)
. . . . . . . . . . . . . . . > partitioned by (country string)
. . . . . . . . . . . . . . . > row format delimited
. . . . . . . . . . . . . . . > fields terminated by ',';
0: jdbc:hive2://hadoop01:11240> load data local inpath '/home/xiaokang/hivedata/5.txt'
. . . . . . . . . . . . . . . > into table t_user partition(country='US');
The error:
Error: Error while compiling statement: FAILED: SemanticException Line 1:23 Invalid path '/home/xiaokang/hivedata/5.txt': No files matching path file:/home/xiaokang/hivedata/5.txt (state=42000,code=40000)
Cause analysis:
I was using hadoop03 to connect to the Hive service on hadoop01. When I ran the load from the beeline prompt, I specified a LOCAL load. But which machine does "local" mean for a Hive session opened from hadoop03? Not hadoop03 itself: because the session is connected to the Hive service on hadoop01, "local" means local to hadoop01. My data file lived on hadoop03, so the specified file could not be found.
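If copying files between nodes is inconvenient, another common workaround (a sketch, not part of the original write-up) is to put the file into HDFS first and drop the LOCAL keyword: `load data inpath` reads from HDFS, which looks the same from every node, so it does not matter where beeline runs. The HDFS target directory `/tmp/hivedata` below is an assumption:

```sql
-- First, from any node, push the file into HDFS (shell command shown as a comment):
--   [xiaokang@hadoop03 ~]$ hdfs dfs -put hivedata/5.txt /tmp/hivedata/
-- Then load from HDFS; note there is no LOCAL keyword, and Hive MOVES
-- the file out of /tmp/hivedata into the table's partition directory.
load data inpath '/tmp/hivedata/5.txt'
into table t_user partition(country='US');
```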
Solution:
Copy the data file from hadoop03 over to hadoop01:
[xiaokang@hadoop03 ~]$ scp -r hivedata/ xiaokang@hadoop01:/home/xiaokang/
Once the copy finishes, the local data can be loaded into the table:
0: jdbc:hive2://hadoop01:11240> load data local inpath '/home/xiaokang/hivedata/5.txt'
. . . . . . . . . . . . . . . > into table t_user partition(country='US');
As the screenshot shows, the load succeeds.
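To double-check the load from beeline, a quick query against the partition (a sketch):

```sql
-- The partition column country is appended after the table's own columns,
-- so each row should come back as (id, name, 'US').
select * from t_user where country='US';
```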