HDFS: Number of large read operations=0
HDFS: Number of write operations=24
Map-Reduce Framework
	Map input records=52457
	Map output records=52457
	Input split bytes=453
	Spilled Records=0
	Failed Shuffles=0
	Merged Map outputs=0
	GC time elapsed (ms)=41
	Total committed heap usage (bytes)=366870528
File Input Format Counters
	Bytes Read=0
File Output Format Counters
	Bytes Written=10819859
24/02/23 03:02:27 INFO mapreduce.ImportJobBase: Transferred 25.4932 MB in 6.9433 seconds (3.6716 MB/sec)
24/02/23 03:02:27 INFO mapreduce.ImportJobBase: Retrieved 52457 records.
24/02/23 03:02:27 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table book
24/02/23 03:02:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `book` AS t LIMIT 1
24/02/23 03:02:27 WARN hive.TableDefWriter: Column PublicationDate had to be cast to a less precise type in Hive
24/02/23 03:02:27 WARN hive.TableDefWriter: Column Price had to be cast to a less precise type in Hive
24/02/23 03:02:27 INFO hive.HiveImport: Loading uploaded data into Hive
Killed
When syncing data from MySQL to Hive with the Sqoop tool, the import dies with a bare `Killed` right after `Loading uploaded data into Hive`, and I don't know how to solve it.
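For reference, a bare `Killed` line with no Java stack trace at the `Loading uploaded data into Hive` step usually means the operating system terminated the Hive client JVM, most often the Linux OOM killer when the host runs out of memory. Below is a diagnostic and workaround sketch, assuming a Linux host; the heap size, host name, database, and credentials are placeholders, not values from the log above:

```shell
# Check the kernel log for evidence of the OOM killer (may require root).
# A matching "Killed process" line confirms the JVM was killed for memory.
dmesg | grep -i 'killed process'

# Workaround sketch: give the Sqoop/Hive client JVM more heap before
# re-running the import. -Xmx2g is an example value; tune to free RAM.
export HADOOP_CLIENT_OPTS="-Xmx2g"

# Hypothetical re-run of the import (connection details are placeholders):
sqoop import \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --username user -P \
  --table book \
  --hive-import \
  --hive-table book
```

If the machine simply has too little free memory, adding swap or stopping other services on the client host is an alternative to raising the heap.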