Sqoop: importing MySQL data into a Hive table (sqoop import)
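The source is the `sqoop_test` table in a local MySQL database named `sqooptest`. The post does not show the table's schema, so the sketch below is a hypothetical two-column table with six rows, matching the six records reported in the job log further down:

```
# Hypothetical source table -- schema and values are assumptions;
# only the database/table names and the row count come from the post.
mysql -u root -p sqooptest -e "
CREATE TABLE sqoop_test (
  id   INT PRIMARY KEY,
  name VARCHAR(32)
);
INSERT INTO sqoop_test VALUES
  (1,'aa'),(2,'bb'),(3,'cc'),(4,'dd'),(5,'ee'),(6,'ff');"
```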

EFdeMacBook-Pro:jarfile FengZhen$ sqoop import --connect jdbc:mysql://localhost:3306/sqooptest --username root --password 123qwe --table sqoop_test --hive-import --hive-overwrite --hive-table sqoop_test_table --fields-terminated-by ',' -m 1

Warning: /Users/FengZhen/Desktop/Hadoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /Users/FengZhen/Desktop/Hadoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/FengZhen/Desktop/Hadoop/hadoop-2.8.0/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/FengZhen/Desktop/Hadoop/hbase-1.3.0/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/09/15 09:33:42 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
17/09/15 09:33:42 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/09/15 09:33:42 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/09/15 09:33:42 INFO tool.CodeGenTool: Beginning code generation
17/09/15 09:33:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `sqoop_test` AS t LIMIT 1
17/09/15 09:33:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `sqoop_test` AS t LIMIT 1
17/09/15 09:33:43 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /Users/FengZhen/Desktop/Hadoop/hadoop-2.8.0
17/09/15 09:33:48 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-FengZhen/compile/241f28a04b0ece18cd4a07bd6939d50a/sqoop_test.jar
17/09/15 09:33:48 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/09/15 09:33:48 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/09/15 09:33:48 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/09/15 09:33:48 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/09/15 09:33:48 INFO mapreduce.ImportJobBase: Beginning import of sqoop_test
17/09/15 09:33:48 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
17/09/15 09:33:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/09/15 09:33:49 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/09/15 09:33:50 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/09/15 09:33:50 INFO client.RMProxy: Connecting to ResourceManager at localhost/127.0.0.1:8032
17/09/15 09:33:52 INFO db.DBInputFormat: Using read commited transaction isolation
17/09/15 09:33:52 INFO mapreduce.JobSubmitter: number of splits:1
17/09/15 09:33:52 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1505439184374_0001
17/09/15 09:33:53 INFO impl.YarnClientImpl: Submitted application application_1505439184374_0001
17/09/15 09:33:53 INFO mapreduce.Job: The url to track the job: http://192.168.1.64:8088/proxy/application_1505439184374_0001/
17/09/15 09:33:53 INFO mapreduce.Job: Running job: job_1505439184374_0001
17/09/15 09:34:04 INFO mapreduce.Job: Job job_1505439184374_0001 running in uber mode : false
17/09/15 09:34:04 INFO mapreduce.Job: map 0% reduce 0%
17/09/15 09:34:11 INFO mapreduce.Job: map 100% reduce 0%
17/09/15 09:34:11 INFO mapreduce.Job: Job job_1505439184374_0001 completed successfully
17/09/15 09:34:12 INFO mapreduce.Job: Counters: 30
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=156812
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=87
		HDFS: Number of bytes written=72
		HDFS: Number of read operations=4
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
	Job Counters
		Launched map tasks=1
		Other local map tasks=1
		Total time spent by all maps in occupied slots (ms)=4657
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=4657
		Total vcore-milliseconds taken by all map tasks=4657
		Total megabyte-milliseconds taken by all map tasks=4768768
	Map-Reduce Framework
		Map input records=6
		Map output records=6
		Input split bytes=87
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=40
		CPU time spent (ms)=0
		Physical memory (bytes) snapshot=0
		Virtual memory (bytes) snapshot=0
		Total committed heap usage (bytes)=154140672
	File Input Format Counters
		Bytes Read=0
	File Output Format Counters
		Bytes Written=72
17/09/15 09:34:12 INFO mapreduce.ImportJobBase: Transferred 72 bytes in 22.0614 seconds (3.2636 bytes/sec)
17/09/15 09:34:12 INFO mapreduce.ImportJobBase: Retrieved 6 records.
17/09/15 09:34:12 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `sqoop_test` AS t LIMIT 1
17/09/15 09:34:12 INFO hive.HiveImport: Loading uploaded data into Hive
17/09/15 09:34:19 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
17/09/15 09:34:19 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/Users/FengZhen/Desktop/Hadoop/hadoop-2.8.0/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
17/09/15 09:34:19 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/Users/FengZhen/Desktop/Hadoop/hbase-1.3.0/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
17/09/15 09:34:19 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
17/09/15 09:34:19 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/09/15 09:34:20 INFO hive.HiveImport: 17/09/15 09:34:20 WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist
17/09/15 09:34:20 INFO hive.HiveImport:
17/09/15 09:34:20 INFO hive.HiveImport: Logging initialized using configuration in file:/Users/FengZhen/Desktop/Hadoop/hive/apache-hive-1.2.2-bin/conf/hive-log4j.properties
17/09/15 09:34:31 INFO hive.HiveImport: OK
17/09/15 09:34:31 INFO hive.HiveImport: Time taken: 1.502 seconds
17/09/15 09:34:31 INFO hive.HiveImport: Loading data to table default.sqoop_test_table
17/09/15 09:34:31 INFO hive.HiveImport: Table default.sqoop_test_table stats: [numFiles=1, numRows=0, totalSize=72, rawDataSize=0]
17/09/15 09:34:31 INFO hive.HiveImport: OK
17/09/15 09:34:31 INFO hive.HiveImport: Time taken: 0.762 seconds
17/09/15 09:34:32 INFO hive.HiveImport: Hive import complete.
17/09/15 09:34:32 INFO hive.HiveImport: Export directory is contains the _SUCCESS file only, removing the directory.
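Once the job finishes, the result can be checked from the Hive CLI. A minimal sanity check (a sketch; `sqoop_test_table` lands in the `default` database because `--hive-table` was given without a database prefix):

```
# Expect 6 rows, matching 'Retrieved 6 records' in the log above.
hive -e "SELECT COUNT(*) FROM default.sqoop_test_table;"
hive -e "SELECT * FROM default.sqoop_test_table LIMIT 10;"
```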

To import MySQL data into Hive with Sqoop, follow these steps:

1. Make sure Sqoop and Hive are installed.

2. Create a database in Hive to hold the imported table (a helper command is sketched after this list).

3. Make sure the parent path for the import exists in HDFS. Note that Sqoop creates the `--target-dir` directory itself and fails if it already exists, so create only the parent (or pass `--delete-target-dir` to overwrite).

4. Use the `sqoop import` command to import the MySQL data:

```
sqoop import \
  --connect jdbc:mysql://mysql-server:3306/mydatabase \
  --username mysqluser \
  --password mysqlpassword \
  --table mytable \
  --target-dir /user/hive/warehouse/mydatabase.db/mytable \
  --fields-terminated-by ',' \
  --hive-import \
  --hive-table mydatabase.mytable
```

Here `jdbc:mysql://mysql-server:3306/mydatabase` is the JDBC URL of the MySQL database, `mysqluser` and `mysqlpassword` are the MySQL username and password, `mytable` is the table to import, `/user/hive/warehouse/mydatabase.db/mytable` is the HDFS directory the data is written to, `--fields-terminated-by ','` sets the field delimiter to a comma, `--hive-import` tells Sqoop to load the data into Hive, and `--hive-table mydatabase.mytable` names the target Hive database and table. With `--hive-import`, Sqoop creates the Hive table and loads it in this one step; steps 5 and 6 are only needed if you run the command without the two `--hive-*` options and import to HDFS alone.

5. Create an external table in Hive that maps the data in the HDFS directory onto a table (named `mytable_ext` here to distinguish it from the managed table in step 6):

```
CREATE EXTERNAL TABLE mydatabase.mytable_ext (
  column1 datatype1,
  column2 datatype2,
  ...
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/user/hive/warehouse/mydatabase.db/mytable';
```

Note that the column list must match the structure of the MySQL table.

6. Use Hive's INSERT INTO statement to copy the data from the external table into a managed Hive table:

```
INSERT INTO mydatabase.mytable SELECT * FROM mydatabase.mytable_ext;
```

This completes the import of MySQL data into Hive.
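Helper commands for steps 2 and 3 above (a sketch; `mydatabase` and the warehouse path are the illustrative names used in the list):

```
# Step 2: create the target database in Hive.
hive -e "CREATE DATABASE IF NOT EXISTS mydatabase;"

# Step 3: ensure the parent path exists. Do not pre-create the
# --target-dir itself: sqoop import fails if that directory exists.
hdfs dfs -mkdir -p /user/hive/warehouse/mydatabase.db
```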
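The job log above also warns that `--password` on the command line is insecure and suggests `-P`. Two safer variants of the original command (a sketch; the password-file path is hypothetical):

```
# Prompt interactively for the password:
sqoop import --connect jdbc:mysql://localhost:3306/sqooptest \
  --username root -P \
  --table sqoop_test --hive-import --hive-table sqoop_test_table -m 1

# Or read it from a file readable only by you:
echo -n '123qwe' > $HOME/.mysql.pw && chmod 400 $HOME/.mysql.pw
sqoop import --connect jdbc:mysql://localhost:3306/sqooptest \
  --username root --password-file file://$HOME/.mysql.pw \
  --table sqoop_test --hive-import --hive-table sqoop_test_table -m 1
```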
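The log additionally suggests `--direct`, which uses a MySQL-specific fast path (mysqldump) instead of plain JDBC for the transfer; mysqldump must be available on the nodes that run the map tasks. A sketch of the same import with the fast path enabled:

```
sqoop import --connect jdbc:mysql://localhost:3306/sqooptest \
  --username root -P --table sqoop_test \
  --direct \
  --hive-import --hive-overwrite --hive-table sqoop_test_table \
  --fields-terminated-by ',' -m 1
```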
