MySQL import into Hive fails: Sqoop data load into Hive fails

[hadoop@cloud1 conf]$ sqoop import --connect "jdbc:mysql://localhost:3306/comment" --target-dir "/user/hive/tmp1" --username "comment_usr" --password "comment" --query "select s..." --split-by "sku_id" --hive-overwrite --hive-import --create-hive-table --hive-table "dw.comment_data_store_tmp5" --hive-partition-key "dt" --hive-partition-value $(date +%Y-%m-%d) --

Warning: /opt/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.

Please set $HCAT_HOME to the root of your HCatalog installation.

Warning: /opt/sqoop/../accumulo does not exist! Accumulo imports will fail.

Please set $ACCUMULO_HOME to the root of your Accumulo installation.

16/08/23 17:42:17 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.2

16/08/23 17:42:17 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.

16/08/23 17:42:17 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override

16/08/23 17:42:17 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.

16/08/23 17:42:17 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.

16/08/23 17:42:17 INFO tool.CodeGenTool: Beginning code generation

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/data/platform/hadoop-2.5.0-cdh5.3.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/opt/sqoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/data/platform/hbase0.96/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

16/08/23 17:42:19 INFO manager.SqlManager: Executing SQL statement: select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0 and is_illegal=0 and goods_id!='' and sk

16/08/23 17:42:19 INFO manager.SqlManager: Executing SQL statement: select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0 and is_illegal=0 and goods_id!='' and sk

16/08/23 17:42:19 INFO manager.SqlManager: Executing SQL statement: select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0 and is_illegal=0 and goods_id!='' and sk

16/08/23 17:42:19 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/hadoop

Note: /tmp/sqoop-hadoop/compile/6706482c720350bc7070e62bd34ee215/QueryResult.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

16/08/23 17:42:22 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/6706482c720350bc7070e62bd34ee215/QueryResult.jar

16/08/23 17:42:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

16/08/23 17:42:23 INFO tool.ImportTool: Destination directory /user/hive/tmp1 is not present, hence not deleting.

16/08/23 17:42:23 INFO mapreduce.ImportJobBase: Beginning query import.

16/08/23 17:42:23 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar

16/08/23 17:42:23 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps

16/08/23 17:42:23 INFO client.RMProxy: Connecting to ResourceManager at cloud1/10.200.42.1:8032

16/08/23 17:42:33 INFO db.DBInputFormat: Using read commited transaction isolation

16/08/23 17:42:33 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(sku_id), MAX(sku_id) FROM (select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0

16/08/23 17:42:33 WARN db.TextSplitter: Generating splits for a textual index column.

16/08/23 17:42:33 WARN db.TextSplitter: If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records.

16/08/23 17:42:33 WARN db.TextSplitter: You are strongly encouraged to choose an integral split column.

16/08/23 17:42:33 INFO mapreduce.JobSubmitter: number of splits:5

16/08/23 17:42:34 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1468754987139_0278

16/08/23 17:42:35 INFO impl.YarnClientImpl: Submitted application application_1468754987139_0278

16/08/23 17:42:35 INFO mapreduce.Job: The url to track the job: http://cloud1:8088/proxy/application_1468754987139_0278/

16/08/23 17:42:35 INFO mapreduce.Job: Running job: job_1468754987139_0278

16/08/23 17:42:56 INFO mapreduce.Job: Job job_1468754987139_0278 running in uber mode : false

16/08/23 17:42:56 INFO mapreduce.Job: map 0% reduce 0%

16/08/23 17:43:11 INFO mapreduce.Job: map 40% reduce 0%

16/08/23 17:43:14 INFO mapreduce.Job: map 80% reduce 0%

16/08/23 17:43:15 INFO mapreduce.Job: map 100% reduce 0%

16/08/23 17:43:15 INFO mapreduce.Job: Job job_1468754987139_0278 completed successfully

16/08/23 17:43:15 INFO mapreduce.Job: Counters: 30

File System Counters

FILE: Number of bytes read=0

FILE: Number of bytes written=889220

FILE: Number of read operations=0

FILE: Number of large read operations=0

FILE: Number of write operations=0

HDFS: Number of bytes read=641

HDFS: Number of bytes written=20619

HDFS: Number of read operations=20

HDFS: Number of large read operations=0

HDFS: Number of write operations=10

Job Counters

Launched map tasks=5

Other local map tasks=5

Total time spent by all maps in occupied slots (ms)=66672

Total time spent by all reduces in occupied slots (ms)=0

Total time spent by all map tasks (ms)=66672

Total vcore-seconds taken by all map tasks=66672

Total megabyte-seconds taken by all map tasks=68272128

Map-Reduce Framework

Map input records=1045

Map output records=1045

Input split bytes=641

Spilled Records=0

Failed Shuffles=0

Merged Map outputs=0

GC time elapsed (ms)=310

CPU time spent (ms)=11400

Physical memory (bytes) snapshot=1264214016

Virtual memory (bytes) snapshot=8973721600

Total committed heap usage (bytes)=2521825280

File Input Format Counters

Bytes Read=0

File Output Format Counters

Bytes Written=20619

16/08/23 17:43:15 INFO mapreduce.ImportJobBase: Transferred 20.1357 KB in 52.1306 seconds (395.5258 bytes/sec)

16/08/23 17:43:15 INFO mapreduce.ImportJobBase: Retrieved 1045 records.

16/08/23 17:43:16 INFO manager.SqlManager: Executing SQL statement: select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0 and is_illegal=0 and goods_id!='' and sk

16/08/23 17:43:16 INFO manager.SqlManager: Executing SQL statement: select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0 and is_illegal=0 and goods_id!='' and sk

16/08/23 17:43:16 INFO hive.HiveImport: Loading uploaded data into Hive

16/08/23 17:43:16 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive

16/08/23 17:43:16 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize

16/08/23 17:43:16 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize

16/08/23 17:43:16 INFO Configuration.deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack

16/08/23 17:43:16 INFO Configuration.deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node

16/08/23 17:43:16 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces

16/08/23 17:43:16 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative

Logging initialized using configuration in jar:file:/opt/sqoop/lib/hive-common-0.12.0-cdh5.0.2.jar!/hive-log4j.properties

OK

Time taken: 6.115 seconds

Loading data to table fnsearch.comment_data_store_tmp5 partition (dt=2016-08-23)

Partition fnsearch.comment_data_store_tmp5{dt=2016-08-23} stats: [num_files: 6, num_rows: 0, total_size: 20619, raw_data_size: 0]

Table fnsearch.comment_data_store_tmp5 stats: [num_partitions: 1, num_files: 6, num_rows: 0, total_size: 20619, raw_data_size: 0]

OK

Time taken: 1.078 seconds
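Two warnings in the run above are worth addressing before re-running the import: Sqoop flags --password on the command line as insecure (it suggests -P), and the TextSplitter warns that splitting on the textual sku_id column can produce a partial import or duplicate records under a case-insensitive collation. Below is a minimal sketch of just the affected options, assuming your Sqoop build supports --password-file and that feiniu_comment has a numeric key; the path /user/hadoop/.mysql.pwd and the column name id are hypothetical placeholders, not taken from the log.

# Sketch only: replace the plaintext password and the textual split column.
# Use -P to be prompted interactively, or --password-file pointing at a file
# readable only by the importing user.
# "id" stands in for a real integral column in feiniu_comment.
sqoop import \
  --connect "jdbc:mysql://localhost:3306/comment" \
  --username "comment_usr" \
  --password-file /user/hadoop/.mysql.pwd \
  --split-by "id" \
  ...

The remaining options (query, target directory, Hive table and partition) stay as in the commands shown here.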

[hadoop@cloud1 conf]$ cd ..

[hadoop@cloud1 hive]$ sqoop import --connect "jdbc:mysql://localhost:3306/comment" --target-dir "/user/hive/tmp1" --username "comment_usr" --password "comment" --query "select s..." --split-by "sku_id" --hive-overwrite --hive-import --create-hive-table --hive-table "dw.comment_data_store_tmp6" --hive-partition-key "dt" --hive-partition-value $(date +%Y-%m-%d) --

Warning: /opt/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.

Please set $HCAT_HOME to the root of your HCatalog installation.

Warning: /opt/sqoop/../accumulo does not exist! Accumulo imports will fail.

Please set $ACCUMULO_HOME to the root of your Accumulo installation.

16/08/23 17:43:39 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.2

16/08/23 17:43:39 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.

16/08/23 17:43:39 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override

16/08/23 17:43:39 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.

16/08/23 17:43:39 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.

16/08/23 17:43:39 INFO tool.CodeGenTool: Beginning code generation

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/data/platform/hadoop-2.5.0-cdh5.3.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/opt/sqoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/data/platform/hbase0.96/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

16/08/23 17:43:41 INFO manager.SqlManager: Executing SQL statement: select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0 and is_illegal=0 and goods_id!='' and sk

16/08/23 17:43:41 INFO manager.SqlManager: Executing SQL statement: select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0 and is_illegal=0 and goods_id!='' and sk

16/08/23 17:43:41 INFO manager.SqlManager: Executing SQL statement: select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0 and is_illegal=0 and goods_id!='' and sk

16/08/23 17:43:41 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/hadoop

Note: /tmp/sqoop-hadoop/compile/91bb6d8e8213c559ec3cc1a191f021fc/QueryResult.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

16/08/23 17:43:44 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/91bb6d8e8213c559ec3cc1a191f021fc/QueryResult.jar

16/08/23 17:43:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

16/08/23 17:43:45 INFO tool.ImportTool: Destination directory /user/hive/tmp1 is not present, hence not deleting.

16/08/23 17:43:45 INFO mapreduce.ImportJobBase: Beginning query import.

16/08/23 17:43:45 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar

16/08/23 17:43:45 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps

16/08/23 17:43:45 INFO client.RMProxy: Connecting to ResourceManager at cloud1/10.200.42.1:8032

16/08/23 17:43:54 INFO db.DBInputFormat: Using read commited transaction isolation

16/08/23 17:43:54 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(sku_id), MAX(sku_id) FROM (select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0

16/08/23 17:43:54 WARN db.TextSplitter: Generating splits for a textual index column.

16/08/23 17:43:54 WARN db.TextSplitter: If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records.

16/08/23 17:43:54 WARN db.TextSplitter: You are strongly encouraged to choose an integral split column.

16/08/23 17:43:54 INFO mapreduce.JobSubmitter: number of splits:5

16/08/23 17:43:55 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1468754987139_0279

16/08/23 17:43:56 INFO impl.YarnClientImpl: Submitted application application_1468754987139_0279

16/08/23 17:43:56 INFO mapreduce.Job: The url to track the job: http://cloud1:8088/proxy/application_1468754987139_0279/

16/08/23 17:43:56 INFO mapreduce.Job: Running job: job_1468754987139_0279

16/08/23 17:44:11 INFO mapreduce.Job: Job job_1468754987139_0279 running in uber mode : false

16/08/23 17:44:11 INFO mapreduce.Job: map 0% reduce 0%

16/08/23 17:44:20 INFO mapreduce.Job: map 20% reduce 0%

16/08/23 17:44:21 INFO mapreduce.Job: map 40% reduce 0%

16/08/23 17:44:42 INFO mapreduce.Job: map 60% reduce 0%

16/08/23 17:44:44 INFO mapreduce.Job: map 100% reduce 0%

16/08/23 17:44:45 INFO mapreduce.Job: Job job_1468754987139_0279 completed successfully

16/08/23 17:44:45 INFO mapreduce.Job: Counters: 31

File System Counters

FILE: Number of bytes read=0

FILE: Number of bytes written=889220

FILE: Number of read operations=0

FILE: Number of large read operations=0

FILE: Number of write operations=0

HDFS: Number of bytes read=641

HDFS: Number of bytes written=20619

HDFS: Number of read operations=20

HDFS: Number of large read operations=0

HDFS: Number of write operations=10

Job Counters

Killed map tasks=1

Launched map tasks=6

Other local map tasks=6

Total time spent by all maps in occupied slots (ms)=112628

Total time spent by all reduces in occupied slots (ms)=0

Total time spent by all map tasks (ms)=112628

Total vcore-seconds taken by all map tasks=112628

Total megabyte-seconds taken by all map tasks=115331072

Map-Reduce Framework

Map input records=1045

Map output records=1045

Input split bytes=641

Spilled Records=0

Failed Shuffles=0

Merged Map outputs=0

GC time elapsed (ms)=364

CPU time spent (ms)=13420

Physical memory (bytes) snapshot=1256665088

Virtual memory (bytes) snapshot=8962592768

Total committed heap usage (bytes)=2521825280

File Input Format Counters

Bytes Read=0

File Output Format Counters

Bytes Written=20619

16/08/23 17:44:45 INFO mapreduce.ImportJobBase: Transferred 20.1357 KB in 60.2529 seconds (342.2078 bytes/sec)

16/08/23 17:44:45 INFO mapreduce.ImportJobBase: Retrieved 1045 records.

16/08/23 17:44:46 INFO manager.SqlManager: Executing SQL statement: select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0 and is_illegal=0 and goods_id!='' and sk

16/08/23 17:44:46 INFO manager.SqlManager: Executing SQL statement: select sku_id,goods_id from feiniu_comment where store_type = 2 and is_deleted=0 and is_illegal=0 and goods_id!='' and sk

16/08/23 17:44:46 INFO hive.HiveImport: Loading uploaded data into Hive

16/08/23 17:44:46 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive

16/08/23 17:44:46 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize

16/08/23 17:44:46 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize

16/08/23 17:44:46 INFO Configuration.deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack

16/08/23 17:44:46 INFO Configuration.deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node

16/08/23 17:44:46 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces

16/08/23 17:44:46 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative

Logging initialized using configuration in jar:file:/opt/sqoop/lib/hive-common-0.12.0-cdh5.0.2.jar!/hive-log4j.properties

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. InvalidObjectException(message:There is no database named dw)
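The Hive load step is where the second run fails: --hive-table "dw.comment_data_store_tmp6" refers to a Hive database named dw, and Sqoop's --create-hive-table only generates the CREATE TABLE statement, not the database itself, so Hive's DDLTask aborts with InvalidObjectException: There is no database named dw. A minimal sketch of one way to unblock the import, assuming the dw database is simply meant to be created first:

# Create the missing Hive database, then re-run the same Sqoop command.
hive -e "CREATE DATABASE IF NOT EXISTS dw;"

# Confirm the metastore can see it before retrying the import:
hive -e "SHOW DATABASES LIKE 'dw';"

Alternatively, point --hive-table at a database that already exists; note that the first run above loaded its data into fnsearch.comment_data_store_tmp5, an existing database, and completed without error.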
