Problem with Sqoop export

[root@master ~]# hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/opt/hive/lib/hive-common-2.3.0.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive> use mydb2;
OK
Time taken: 3.743 seconds
hive> show tables;
OK
tb_staff
Time taken: 0.221 seconds, Fetched: 1 row(s)
hive> drop table tb_staff;
OK
Time taken: 1.589 seconds
hive> CREATE TABLE tb_staff(
    >         id INT,
    >         `name` string,
    >         birthday  date
    > ) row format delimited
    >  fields terminated by ',';
OK
Time taken: 0.366 seconds
hive> select * from tb_staff
    > ;
OK
1       宋江    1979-08-12
2       林冲    1976-04-12
3       鲁智深  1979-08-12
4       公孙胜  1976-09-13
5       秦明    1980-07-29
6       李逵    1980-06-28
7       武松    1979-06-02
8       朱武    1975-07-06
9       黄信    1969-08-09
10      孙立    1974-07-08
Time taken: 0.18 seconds, Fetched: 10 row(s)
hive>  show create table tb_staff;
OK
CREATE TABLE `tb_staff`(
  `id` int, 
  `name` string, 
  `birthday` date)
ROW FORMAT SERDE 
  'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' 
WITH SERDEPROPERTIES ( 
  'field.delim'=',', 
  'serialization.format'=',') 
STORED AS INPUTFORMAT 
  'org.apache.hadoop.mapred.TextInputFormat' 
OUTPUTFORMAT 
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://ns1/user/hive/warehouse/mydb2.db/tb_staff'
TBLPROPERTIES (
  'transient_lastDdlTime'='1618919821')
Time taken: 0.125 seconds, Fetched: 17 row(s)
hive> 

A table with this name already existed, so I dropped it first. The session above shows the Hive table being recreated with the data already loaded; the file can be downloaded and viewed on HDFS.
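To double-check the on-disk format before exporting, it helps to look at the raw bytes of the file Sqoop will read. A minimal sketch (the path is taken from the SHOW CREATE TABLE output above):

```bash
# Show the first rows of the Hive table file as plain text
hdfs dfs -cat /user/hive/warehouse/mydb2.db/tb_staff/part-m-00000 | head -n 3

# Dump the bytes of one row to confirm the field separator really is ','
# (0x2c) and not the Hive default '\001'
hdfs dfs -cat /user/hive/warehouse/mydb2.db/tb_staff/part-m-00000 | head -n 1 | od -c
```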

I also created an empty table in MySQL (MariaDB) on the Linux machine:

mariadb> CREATE TABLE tb_emp_bak (
	id INT PRIMARY KEY AUTO_INCREMENT COMMENT '编号', 
	`name`  VARCHAR(20) NOT NULL COMMENT '姓名',
	birthday DATE COMMENT '生日'
);
Query OK, 0 rows affected (0.01 sec)

mariadb> SELECT * FROM tb_emp_bak;
Empty set

mariadb> DESC tb_emp_bak;
+----------+-------------+------+-----+---------+----------------+
| Field    | Type        | Null | Key | Default | Extra          |
+----------+-------------+------+-----+---------+----------------+
| id       | int(11)     | NO   | PRI | NULL    | auto_increment |
| name     | varchar(20) | NO   |     | NULL    |                |
| birthday | date        | YES  |     | NULL    |                |
+----------+-------------+------+-----+---------+----------------+
3 rows in set (0.06 sec)

mariadb> 

Everything above is only meant to show that, as far as I can tell, these steps were done correctly.
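One thing the DESC output does not show is the character set. Since the rows contain Chinese names and the JDBC URL forces utf-8, it is worth confirming that the target table can actually store them; a sketch (assuming the mysql client is on the PATH):

```bash
# Show server/database defaults; on MariaDB 5.5 these are often latin1,
# which cannot hold Chinese characters
mysql -uroot -p123456 sqoop -e "SHOW VARIABLES LIKE 'character_set%';"
mysql -uroot -p123456 sqoop -e "SHOW CREATE TABLE tb_emp_bak\G"

# If the table turns out to be latin1, converting it is one option
# (safe here because the table is still empty)
mysql -uroot -p123456 sqoop -e "ALTER TABLE tb_emp_bak CONVERT TO CHARACTER SET utf8;"
```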

Also, for reference: the VM runs MariaDB 5.5.68, hadoop-2.7.6, and sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz.

mariadb> select version();
+----------------+
| version()      |
+----------------+
| 5.5.68-MariaDB |
+----------------+
1 row in set (0.01 sec)

Below is the lib directory of my Sqoop installation; the MySQL driver JAR has already been added:

[root@master lib]# ll
total 17616
-rw-r--r-- 1 root root  224277 Jan  6 18:31 ant-contrib-1.0b3.jar
-rw-r--r-- 1 root root   36455 Jan  6 18:31 ant-eclipse-1.0-jvm1.2.jar
-rw-r--r-- 1 root root 1344870 Jan  6 18:31 avro-1.8.1.jar
-rw-r--r-- 1 root root  186260 Jan  6 18:31 avro-mapred-1.8.1-hadoop2.jar
-rw-r--r-- 1 root root   58160 Jan  6 18:31 commons-codec-1.4.jar
-rw-r--r-- 1 root root  365552 Jan  6 18:31 commons-compress-1.8.1.jar
-rw-r--r-- 1 root root  109043 Jan  6 18:31 commons-io-1.4.jar
-rw-r--r-- 1 root root  267634 Jan  6 18:31 commons-jexl-2.1.1.jar
-rw-r--r-- 1 root root  434678 Jan  6 18:31 commons-lang3-3.4.jar
-rw-r--r-- 1 root root   60686 Jan  6 18:31 commons-logging-1.1.1.jar
-rw-r--r-- 1 root root  437405 Jan  6 18:31 hive-common-2.3.0.jar
-rw-r--r-- 1 root root  706710 Jan  6 18:31 hsqldb-1.8.0.10.jar
-rw-r--r-- 1 root root   46968 Jan  6 18:35 jackson-annotations-2.6.0.jar
-rw-r--r-- 1 root root  258876 Jan  6 18:35 jackson-core-2.6.5.jar
-rw-r--r-- 1 root root 1171380 Jan  6 18:36 jackson-databind-2.6.5.jar
-rw-r--r-- 1 root root   84697 Apr 20 19:34 java-json.jar
-rw-r--r-- 1 root root 2178774 Jan  6 18:31 kite-data-core-1.1.0.jar
-rw-r--r-- 1 root root 1801469 Jan  6 18:31 kite-data-hive-1.1.0.jar
-rw-r--r-- 1 root root 1768012 Jan  6 18:31 kite-data-mapreduce-1.1.0.jar
-rw-r--r-- 1 root root 1765905 Jan  6 18:31 kite-hadoop-compatibility-1.1.0.jar
-rw-r--r-- 1 root root 1004838 Apr 20 19:18 mysql-connector-java-5.1.46.jar
-rw-r--r-- 1 root root   19827 Jan  6 18:31 opencsv-2.3.jar
-rw-r--r-- 1 root root   34604 Jan  6 18:31 paranamer-2.7.jar
-rw-r--r-- 1 root root   53464 Jan  6 18:31 parquet-avro-1.6.0.jar
-rw-r--r-- 1 root root  892808 Jan  6 18:31 parquet-column-1.6.0.jar
-rw-r--r-- 1 root root   20998 Jan  6 18:31 parquet-common-1.6.0.jar
-rw-r--r-- 1 root root  279012 Jan  6 18:31 parquet-encoding-1.6.0.jar
-rw-r--r-- 1 root root  375618 Jan  6 18:31 parquet-format-2.2.0-rc1.jar
-rw-r--r-- 1 root root   20744 Jan  6 18:31 parquet-generator-1.6.0.jar
-rw-r--r-- 1 root root  205389 Jan  6 18:31 parquet-hadoop-1.6.0.jar
-rw-r--r-- 1 root root 1033299 Jan  6 18:31 parquet-jackson-1.6.0.jar
-rw-r--r-- 1 root root   25496 Jan  6 18:31 slf4j-api-1.6.1.jar
-rw-r--r-- 1 root root  592319 Jan  6 18:31 snappy-java-1.1.1.6.jar
-rw-r--r-- 1 root root   99555 Jan  6 18:31 xz-1.5.jar
[root@master lib]# 

And here is the sqoop-env.sh from my Sqoop installation:

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# included in all the hadoop scripts with source command
# should not be executable directly
# also should not be passed any arguments, since we need original $*

# Set Hadoop-specific environment variables here.

#Set path to where bin/hadoop is available
export HADOOP_COMMON_HOME=/opt/hadoop

#Set path to where hadoop-*-core.jar is available
#export HADOOP_MAPRED_HOME=

#set the path to where bin/hbase is available
export HBASE_HOME=/opt/sqoop/hbase-1.6.0-jars

#Set the path to where bin/hive is available
export HIVE_HOME=/opt/hive

#Set the path for where zookeper config dir is
export ZOOCFGDIR=/opt/zk/conf
export ZOOKEEPER_HOME=/opt/zk

export HCAT_HOME=/opt/hive/hcatalog

What I actually ran:
First attempt, which did not succeed:


[root@master sqoop]# sqoop export\
>  --connect 'jdbc:mysql://master:3306/sqoop?useUnicode=true&characterEncoding=utf-8'\
>  --username root\
>  --password 123456\
>  --table tb_emp_bak\
>  --input-fields-terminated-by ','\
>  --export-dir /user/hive/warehouse/mydb2.db/tb_staff/part-m-00000
Warning: /opt/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
21/04/20 20:04:06 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
21/04/20 20:04:06 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
21/04/20 20:04:07 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
21/04/20 20:04:07 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/sqoop/hbase-1.6.0-jars/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
21/04/20 20:04:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_emp_bak` AS t LIMIT 1
21/04/20 20:04:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_emp_bak` AS t LIMIT 1
21/04/20 20:04:07 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/hadoop
Note: /tmp/sqoop-root/compile/f9b507627fd47a634bdf9e7ef47fcdca/tb_emp_bak.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
21/04/20 20:04:10 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/f9b507627fd47a634bdf9e7ef47fcdca/tb_emp_bak.jar
21/04/20 20:04:10 INFO mapreduce.ExportJobBase: Beginning export of tb_emp_bak
21/04/20 20:04:11 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
21/04/20 20:04:12 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
21/04/20 20:04:12 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/04/20 20:04:12 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
21/04/20 20:04:16 INFO input.FileInputFormat: Total input paths to process : 1
21/04/20 20:04:16 INFO input.FileInputFormat: Total input paths to process : 1
21/04/20 20:04:16 INFO mapreduce.JobSubmitter: number of splits:4
21/04/20 20:04:16 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/04/20 20:04:16 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1618917590412_0010
21/04/20 20:04:17 INFO impl.YarnClientImpl: Submitted application application_1618917590412_0010
21/04/20 20:04:17 INFO mapreduce.Job: The url to track the job: http://slave1:8088/proxy/application_1618917590412_0010/
21/04/20 20:04:17 INFO mapreduce.Job: Running job: job_1618917590412_0010
21/04/20 20:04:24 INFO mapreduce.Job: Job job_1618917590412_0010 running in uber mode : false
21/04/20 20:04:24 INFO mapreduce.Job:  map 0% reduce 0%
21/04/20 20:04:32 INFO mapreduce.Job:  map 100% reduce 0%
21/04/20 20:04:33 INFO mapreduce.Job: Job job_1618917590412_0010 failed with state FAILED due to: Task failed task_1618917590412_0010_m_000003
Job failed as tasks failed. failedMaps:1 failedReduces:0

21/04/20 20:04:33 INFO mapreduce.Job: Counters: 12
        Job Counters 
                Failed map tasks=1
                Killed map tasks=3
                Launched map tasks=4
                Data-local map tasks=4
                Total time spent by all maps in occupied slots (ms)=21886
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=21886
                Total vcore-milliseconds taken by all map tasks=21886
                Total megabyte-milliseconds taken by all map tasks=22411264
        Map-Reduce Framework
                CPU time spent (ms)=0
                Physical memory (bytes) snapshot=0
                Virtual memory (bytes) snapshot=0
21/04/20 20:04:33 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
21/04/20 20:04:33 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 21.2356 seconds (0 bytes/sec)
21/04/20 20:04:33 INFO mapreduce.ExportJobBase: Exported 0 records.
21/04/20 20:04:33 ERROR mapreduce.ExportJobBase: Export job failed!
21/04/20 20:04:33 ERROR tool.ExportTool: Error during export: 
Export job failed!
        at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:445)
        at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
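Sqoop's "Export job failed!" is only a wrapper; the actual exception sits in the log of the failed map task (task_1618917590412_0010_m_000003 here). A sketch of how to pull it, assuming YARN log aggregation is enabled; otherwise the same stack trace is reachable through the tracking URL printed above:

```bash
# Fetch the aggregated container logs and search for the root cause
yarn logs -applicationId application_1618917590412_0010 | grep -B 2 -A 20 'Caused by'
```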

Second attempt: I changed the delimiter, because most posts online say this kind of failure is usually a delimiter problem, but it still did not succeed. It was probably just wishful thinking anyway, since I had explicitly created the table with ',' as the field delimiter, so the files should not be using '\001' at all.

[root@master sqoop]# sqoop export\
>  --connect 'jdbc:mysql://master:3306/sqoop?useUnicode=true&characterEncoding=utf-8'\
>  --username root\
>  --password 123456\
>  --table tb_emp_bak\
>  --input-fields-terminated-by '\001'\
>  --export-dir /user/hive/warehouse/mydb2.db/tb_staff/
Warning: /opt/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
21/04/20 20:31:10 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
21/04/20 20:31:10 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
21/04/20 20:31:10 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
21/04/20 20:31:10 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/sqoop/hbase-1.6.0-jars/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
21/04/20 20:31:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_emp_bak` AS t LIMIT 1
21/04/20 20:31:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_emp_bak` AS t LIMIT 1
21/04/20 20:31:11 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/hadoop
Note: /tmp/sqoop-root/compile/c2eaacaec71a04fe9aee710f89b4153f/tb_emp_bak.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
21/04/20 20:31:19 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/c2eaacaec71a04fe9aee710f89b4153f/tb_emp_bak.jar
21/04/20 20:31:19 INFO mapreduce.ExportJobBase: Beginning export of tb_emp_bak
21/04/20 20:31:20 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
21/04/20 20:31:22 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
21/04/20 20:31:22 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/04/20 20:31:22 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
21/04/20 20:31:29 INFO input.FileInputFormat: Total input paths to process : 2
21/04/20 20:31:29 INFO input.FileInputFormat: Total input paths to process : 2
21/04/20 20:31:29 INFO mapreduce.JobSubmitter: number of splits:3
21/04/20 20:31:29 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/04/20 20:31:29 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1618917590412_0012
21/04/20 20:31:30 INFO impl.YarnClientImpl: Submitted application application_1618917590412_0012
21/04/20 20:31:30 INFO mapreduce.Job: The url to track the job: http://slave1:8088/proxy/application_1618917590412_0012/
21/04/20 20:31:30 INFO mapreduce.Job: Running job: job_1618917590412_0012
21/04/20 20:31:41 INFO mapreduce.Job: Job job_1618917590412_0012 running in uber mode : false
21/04/20 20:31:41 INFO mapreduce.Job:  map 0% reduce 0%
21/04/20 20:31:57 INFO mapreduce.Job:  map 100% reduce 0%
21/04/20 20:31:58 INFO mapreduce.Job: Job job_1618917590412_0012 failed with state FAILED due to: Task failed task_1618917590412_0012_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

21/04/20 20:31:58 INFO mapreduce.Job: Counters: 8
        Job Counters 
                Failed map tasks=3
                Launched map tasks=3
                Data-local map tasks=3
                Total time spent by all maps in occupied slots (ms)=36131
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=36131
                Total vcore-milliseconds taken by all map tasks=36131
                Total megabyte-milliseconds taken by all map tasks=36998144
21/04/20 20:31:58 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
21/04/20 20:31:58 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 35.4898 seconds (0 bytes/sec)
21/04/20 20:31:58 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
21/04/20 20:31:58 INFO mapreduce.ExportJobBase: Exported 0 records.
21/04/20 20:31:58 ERROR mapreduce.ExportJobBase: Export job failed!
21/04/20 20:31:58 ERROR tool.ExportTool: Error during export: 
Export job failed!
        at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:445)
        at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
[root@master sqoop]# 
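Worth noting in this run: because --export-dir now points at the table directory, the log reports "Total input paths to process : 2", so there are two files under tb_staff, not just part-m-00000. Listing the directory would show what the second file is; a sketch:

```bash
# List everything under the table directory; a leftover file with a
# different format would also be fed into the export
hdfs dfs -ls -R /user/hive/warehouse/mydb2.db/tb_staff/
```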

Third attempt: I changed the delimiter back to ','. Even though the data contains no NULL values, I added the null-handling options anyway.

[root@master sqoop]# sqoop export\
>  --connect 'jdbc:mysql://master:3306/sqoop?useUnicode=true&characterEncoding=utf-8'\
>  --username root\
>  --password 123456\
>  --table tb_emp_bak\
>  --input-fields-terminated-by ','\
>  --export-dir /user/hive/warehouse/mydb2.db/tb_staff/part-m-00000\
>  --input-null-string '\\N' \
>  --input-null-non-string '\\N'
Warning: /opt/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
21/04/20 20:07:01 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
21/04/20 20:07:01 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
21/04/20 20:07:01 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
21/04/20 20:07:01 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/sqoop/hbase-1.6.0-jars/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
21/04/20 20:07:02 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_emp_bak` AS t LIMIT 1
21/04/20 20:07:02 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_emp_bak` AS t LIMIT 1
21/04/20 20:07:02 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/hadoop
Note: /tmp/sqoop-root/compile/494c9dd47e075ae19fccc77d7a680fb9/tb_emp_bak.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
21/04/20 20:07:04 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/494c9dd47e075ae19fccc77d7a680fb9/tb_emp_bak.jar
21/04/20 20:07:04 INFO mapreduce.ExportJobBase: Beginning export of tb_emp_bak
21/04/20 20:07:04 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
21/04/20 20:07:05 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
21/04/20 20:07:05 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/04/20 20:07:05 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
21/04/20 20:07:09 INFO input.FileInputFormat: Total input paths to process : 1
21/04/20 20:07:09 INFO input.FileInputFormat: Total input paths to process : 1
21/04/20 20:07:09 INFO mapreduce.JobSubmitter: number of splits:4
21/04/20 20:07:09 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/04/20 20:07:09 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1618917590412_0011
21/04/20 20:07:09 INFO impl.YarnClientImpl: Submitted application application_1618917590412_0011
21/04/20 20:07:09 INFO mapreduce.Job: The url to track the job: http://slave1:8088/proxy/application_1618917590412_0011/
21/04/20 20:07:09 INFO mapreduce.Job: Running job: job_1618917590412_0011
21/04/20 20:07:16 INFO mapreduce.Job: Job job_1618917590412_0011 running in uber mode : false
21/04/20 20:07:16 INFO mapreduce.Job:  map 0% reduce 0%
21/04/20 20:07:25 INFO mapreduce.Job:  map 100% reduce 0%
21/04/20 20:07:26 INFO mapreduce.Job: Job job_1618917590412_0011 failed with state FAILED due to: Task failed task_1618917590412_0011_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

21/04/20 20:07:27 INFO mapreduce.Job: Counters: 12
        Job Counters 
                Failed map tasks=2
                Killed map tasks=2
                Launched map tasks=4
                Data-local map tasks=4
                Total time spent by all maps in occupied slots (ms)=25685
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=25685
                Total vcore-milliseconds taken by all map tasks=25685
                Total megabyte-milliseconds taken by all map tasks=26301440
        Map-Reduce Framework
                CPU time spent (ms)=0
                Physical memory (bytes) snapshot=0
                Virtual memory (bytes) snapshot=0
21/04/20 20:07:27 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
21/04/20 20:07:27 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 21.3258 seconds (0 bytes/sec)
21/04/20 20:07:27 INFO mapreduce.ExportJobBase: Exported 0 records.
21/04/20 20:07:27 ERROR mapreduce.ExportJobBase: Export job failed!
21/04/20 20:07:27 ERROR tool.ExportTool: Error during export: 
Export job failed!
        at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:445)
        at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
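To rule out the database side (character set, DATE parsing, the AUTO_INCREMENT id column), a quick manual check is to insert one row by hand with exactly the values from the file; if this fails, the problem is in MariaDB rather than in Sqoop. A sketch:

```bash
# Insert one sample row exactly as it appears in the export file
mysql -uroot -p123456 sqoop -e "INSERT INTO tb_emp_bak (id, name, birthday) VALUES (1, '宋江', '1979-08-12');"
mysql -uroot -p123456 sqoop -e "SELECT * FROM tb_emp_bak;"

# Clean up so a later export does not hit a duplicate-key error on id=1
mysql -uroot -p123456 sqoop -e "DELETE FROM tb_emp_bak WHERE id = 1;"
```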

Fourth attempt: I added the -m 1 option (still using the '\001' delimiter).

[root@master sqoop]# sqoop export\
>  --connect 'jdbc:mysql://master:3306/sqoop?useUnicode=true&characterEncoding=utf-8'\
>  --username root\
>  --password 123456\
>  --table tb_emp_bak\
>  --input-fields-terminated-by '\001'\
>  -m 1\
>  --export-dir /user/hive/warehouse/mydb2.db/tb_staff/
Warning: /opt/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
21/04/20 20:36:31 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
21/04/20 20:36:31 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
21/04/20 20:36:31 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
21/04/20 20:36:31 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/sqoop/hbase-1.6.0-jars/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
21/04/20 20:36:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_emp_bak` AS t LIMIT 1
21/04/20 20:36:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_emp_bak` AS t LIMIT 1
21/04/20 20:36:32 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/hadoop
Note: /tmp/sqoop-root/compile/60fc6919f80c8e7f23ee1a464151bbf5/tb_emp_bak.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
21/04/20 20:36:36 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/60fc6919f80c8e7f23ee1a464151bbf5/tb_emp_bak.jar
21/04/20 20:36:36 INFO mapreduce.ExportJobBase: Beginning export of tb_emp_bak
21/04/20 20:36:37 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
21/04/20 20:36:39 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
21/04/20 20:36:39 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/04/20 20:36:39 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
21/04/20 20:36:45 INFO input.FileInputFormat: Total input paths to process : 2
21/04/20 20:36:45 INFO input.FileInputFormat: Total input paths to process : 2
21/04/20 20:36:45 INFO mapreduce.JobSubmitter: number of splits:1
21/04/20 20:36:45 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/04/20 20:36:45 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1618917590412_0014
21/04/20 20:36:46 INFO impl.YarnClientImpl: Submitted application application_1618917590412_0014
21/04/20 20:36:46 INFO mapreduce.Job: The url to track the job: http://slave1:8088/proxy/application_1618917590412_0014/
21/04/20 20:36:46 INFO mapreduce.Job: Running job: job_1618917590412_0014
21/04/20 20:36:57 INFO mapreduce.Job: Job job_1618917590412_0014 running in uber mode : false
21/04/20 20:36:57 INFO mapreduce.Job:  map 0% reduce 0%
21/04/20 20:37:09 INFO mapreduce.Job:  map 100% reduce 0%
21/04/20 20:37:10 INFO mapreduce.Job: Job job_1618917590412_0014 failed with state FAILED due to: Task failed task_1618917590412_0014_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

21/04/20 20:37:10 INFO mapreduce.Job: Counters: 8
        Job Counters 
                Failed map tasks=1
                Launched map tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=7716
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=7716
                Total vcore-milliseconds taken by all map tasks=7716
                Total megabyte-milliseconds taken by all map tasks=7901184
21/04/20 20:37:10 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
21/04/20 20:37:10 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 30.9485 seconds (0 bytes/sec)
21/04/20 20:37:10 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
21/04/20 20:37:10 INFO mapreduce.ExportJobBase: Exported 0 records.
21/04/20 20:37:10 ERROR mapreduce.ExportJobBase: Export job failed!
21/04/20 20:37:10 ERROR tool.ExportTool: Error during export: 
Export job failed!
        at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:445)
        at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
[root@master sqoop]# 

I don't know what is going wrong.
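If it helps anyone diagnose this, the next run I plan to try is the sketch below: it matches the delimiter to the table definition, exports only the known-good file, uses a single mapper, and adds --verbose so Sqoop logs more detail (-P prompts for the password instead of passing it on the command line, as the warning in the logs suggests):

```bash
sqoop export \
 --connect 'jdbc:mysql://master:3306/sqoop?useUnicode=true&characterEncoding=utf-8' \
 --username root \
 -P \
 --table tb_emp_bak \
 --columns id,name,birthday \
 --input-fields-terminated-by ',' \
 --export-dir /user/hive/warehouse/mydb2.db/tb_staff/part-m-00000 \
 -m 1 \
 --verbose
```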
