Exporting Hive Data to MySQL with Sqoop

1. Query the student table's data and metadata in Hive

hive> select * from student;
OK
001	xiaohong	002
002	xiaolan	002
001	xiaohong	003
002	xiaolan	003
Time taken: 0.114 seconds, Fetched: 4 row(s)
hive> desc formatted student;
OK
# col_name            	data_type           	comment             
	 	 
id                  	string              	                    
name                	string              	                    
	 	 
# Partition Information	 	 
# col_name            	data_type           	comment             
	 	 
classroom           	string              	                    
	 	 
# Detailed Table Information	 	 
Database:           	default             	 
Owner:              	root                	 
CreateTime:         	Thu Mar 19 14:00:19 CST 2020	 
LastAccessTime:     	UNKNOWN             	 
Retention:          	0                   	 
Location:           	hdfs://master:9000/user/hive/warehouse/student	 
Table Type:         	MANAGED_TABLE       	 
Table Parameters:	 	 
	transient_lastDdlTime	1584597619          
	 	 
# Storage Information	 	 
SerDe Library:      	org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe	 
InputFormat:        	org.apache.hadoop.mapred.TextInputFormat	 
OutputFormat:       	org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat	 
Compressed:         	No                  	 
Num Buckets:        	-1                  	 
Bucket Columns:     	[]                  	 
Sort Columns:       	[]                  	 
Storage Desc Params:	 	 
	field.delim         	,                   
	serialization.format	,                   
Time taken: 0.065 seconds, Fetched: 32 row(s)

From the output above, we can see that the student table is partitioned by classroom, and its data is stored under /user/hive/warehouse/student.
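The post does not show the table's DDL, but given the schema and the comma field delimiter above, the table could have been created with something like this (a sketch; the actual statement may differ):

hive> create table student(id string, name string)
    > partitioned by (classroom string)
    > row format delimited fields terminated by ',';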

2. Inspect where the student table's data is stored in HDFS

[root@master ~]# hdfs dfs -ls /user/hive/warehouse/student/
Found 2 items
drwxr-xr-x   - root supergroup          0 2020-03-19 14:04 /user/hive/warehouse/student/classroom=002
drwxr-xr-x   - root supergroup          0 2020-03-19 14:10 /user/hive/warehouse/student/classroom=003
[root@master ~]# hdfs dfs -ls /user/hive/warehouse/student/classroom=002
Found 1 items
-rwxr-xr-x   1 root supergroup         25 2020-03-19 14:04 /user/hive/warehouse/student/classroom=002/stu.txt
[root@master ~]# hdfs dfs -cat /user/hive/warehouse/student/classroom=002/stu.txt
001,xiaohong
002,xiaolan

This confirms that the fields are comma-delimited, matching the field.delim of , reported by desc formatted above.
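Hive can also list the partitions directly, without reading the HDFS layout; given the directories above, this should report classroom=002 and classroom=003:

hive> show partitions student;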

3. First, create a table in MySQL with matching columns and types

mysql> create table student(id varchar(255),name varchar(255));
Query OK, 0 rows affected (0.02 sec)

mysql> select * from student;
Empty set (0.00 sec)

mysql> 
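Note that the Sqoop commands below connect to jdbc:mysql://192.168.230.21:3306/mysql, so the student table must live in the database named in that JDBC URL; this post uses MySQL's built-in mysql schema. In practice a dedicated database is cleaner, e.g. (the name school here is just an example):

mysql> create database school;
mysql> use school;
mysql> create table student(id varchar(255), name varchar(255));

with --connect then pointing at jdbc:mysql://192.168.230.21:3306/school instead.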

4. Export the data with Sqoop

[root@master bin]# ./sqoop export --connect "jdbc:mysql://192.168.230.21:3306/mysql" --username root --password 123456 --table student  --fields-terminated-by "," --export-dir "/user/hive/warehouse/student/classroom=002";
Warning: /opt/softWare/sqoop/sqoop-1.4.7/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/softWare/sqoop/sqoop-1.4.7/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /opt/softWare/sqoop/sqoop-1.4.7/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
20/06/29 17:27:57 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
20/06/29 17:27:58 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
20/06/29 17:27:58 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
20/06/29 17:27:58 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/softWare/hadoop/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/softWare/hbase/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/softWare/hive/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
20/06/29 17:27:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `student` AS t LIMIT 1
20/06/29 17:27:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `student` AS t LIMIT 1
20/06/29 17:27:58 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/softWare/hadoop/hadoop-2.7.3
Note: /tmp/sqoop-root/compile/ee585e722d7b5a438d67d53c369ed36e/student.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
20/06/29 17:28:00 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/ee585e722d7b5a438d67d53c369ed36e/student.jar
20/06/29 17:28:00 INFO mapreduce.ExportJobBase: Beginning export of student
20/06/29 17:28:00 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
20/06/29 17:28:01 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
20/06/29 17:28:01 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
20/06/29 17:28:01 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
20/06/29 17:28:01 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.230.21:8032
20/06/29 17:28:06 INFO input.FileInputFormat: Total input paths to process : 1
20/06/29 17:28:06 INFO input.FileInputFormat: Total input paths to process : 1
20/06/29 17:28:06 INFO mapreduce.JobSubmitter: number of splits:4
20/06/29 17:28:06 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
20/06/29 17:28:07 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1593394152340_0027
20/06/29 17:28:07 INFO impl.YarnClientImpl: Submitted application application_1593394152340_0027
20/06/29 17:28:07 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1593394152340_0027/
20/06/29 17:28:07 INFO mapreduce.Job: Running job: job_1593394152340_0027
20/06/29 17:28:16 INFO mapreduce.Job: Job job_1593394152340_0027 running in uber mode : false
20/06/29 17:28:16 INFO mapreduce.Job:  map 0% reduce 0%
20/06/29 17:28:37 INFO mapreduce.Job:  map 50% reduce 0%
20/06/29 17:28:38 INFO mapreduce.Job:  map 100% reduce 0%
20/06/29 17:28:38 INFO mapreduce.Job: Job job_1593394152340_0027 completed successfully
20/06/29 17:28:38 INFO mapreduce.Job: Counters: 30
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=551488
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=764
		HDFS: Number of bytes written=0
		HDFS: Number of read operations=19
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=0
	Job Counters 
		Launched map tasks=4
		Data-local map tasks=4
		Total time spent by all maps in occupied slots (ms)=71778
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=71778
		Total vcore-milliseconds taken by all map tasks=71778
		Total megabyte-milliseconds taken by all map tasks=73500672
	Map-Reduce Framework
		Map input records=2
		Map output records=2
		Input split bytes=681
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=799
		CPU time spent (ms)=3130
		Physical memory (bytes) snapshot=445382656
		Virtual memory (bytes) snapshot=8325513216
		Total committed heap usage (bytes)=170655744
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=0
20/06/29 17:28:38 INFO mapreduce.ExportJobBase: Transferred 764 bytes in 36.6371 seconds (20.8532 bytes/sec)
20/06/29 17:28:38 INFO mapreduce.ExportJobBase: Exported 2 records.
[root@master bin]# ./sqoop export --connect "jdbc:mysql://192.168.230.21:3306/mysql" --username root --password 123456 --table student  --fields-terminated-by "," --export-dir "/user/hive/warehouse/student/classroom=003";
Warning: /opt/softWare/sqoop/sqoop-1.4.7/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/softWare/sqoop/sqoop-1.4.7/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /opt/softWare/sqoop/sqoop-1.4.7/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
20/06/29 17:29:21 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
20/06/29 17:29:21 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
20/06/29 17:29:22 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
20/06/29 17:29:22 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/softWare/hadoop/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/softWare/hbase/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/softWare/hive/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
20/06/29 17:29:22 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `student` AS t LIMIT 1
20/06/29 17:29:22 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `student` AS t LIMIT 1
20/06/29 17:29:22 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/softWare/hadoop/hadoop-2.7.3
Note: /tmp/sqoop-root/compile/b4bafb65b47b2cded3caa3989b6e4d00/student.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
20/06/29 17:29:24 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/b4bafb65b47b2cded3caa3989b6e4d00/student.jar
20/06/29 17:29:24 INFO mapreduce.ExportJobBase: Beginning export of student
20/06/29 17:29:24 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
20/06/29 17:29:25 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
20/06/29 17:29:25 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
20/06/29 17:29:25 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
20/06/29 17:29:25 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.230.21:8032
20/06/29 17:29:31 INFO input.FileInputFormat: Total input paths to process : 1
20/06/29 17:29:31 INFO input.FileInputFormat: Total input paths to process : 1
20/06/29 17:29:31 INFO mapreduce.JobSubmitter: number of splits:4
20/06/29 17:29:31 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
20/06/29 17:29:32 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1593394152340_0028
20/06/29 17:29:32 INFO impl.YarnClientImpl: Submitted application application_1593394152340_0028
20/06/29 17:29:32 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1593394152340_0028/
20/06/29 17:29:32 INFO mapreduce.Job: Running job: job_1593394152340_0028
20/06/29 17:29:41 INFO mapreduce.Job: Job job_1593394152340_0028 running in uber mode : false
20/06/29 17:29:41 INFO mapreduce.Job:  map 0% reduce 0%
20/06/29 17:29:52 INFO mapreduce.Job:  map 50% reduce 0%
20/06/29 17:29:54 INFO mapreduce.Job:  map 75% reduce 0%
20/06/29 17:29:55 INFO mapreduce.Job:  map 100% reduce 0%
20/06/29 17:29:55 INFO mapreduce.Job: Job job_1593394152340_0028 completed successfully
20/06/29 17:29:56 INFO mapreduce.Job: Counters: 31
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=551488
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=764
		HDFS: Number of bytes written=0
		HDFS: Number of read operations=19
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=0
	Job Counters 
		Launched map tasks=4
		Data-local map tasks=2
		Rack-local map tasks=2
		Total time spent by all maps in occupied slots (ms)=35902
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=35902
		Total vcore-milliseconds taken by all map tasks=35902
		Total megabyte-milliseconds taken by all map tasks=36763648
	Map-Reduce Framework
		Map input records=2
		Map output records=2
		Input split bytes=681
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=500
		CPU time spent (ms)=2940
		Physical memory (bytes) snapshot=448909312
		Virtual memory (bytes) snapshot=8319729664
		Total committed heap usage (bytes)=166592512
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=0
20/06/29 17:29:56 INFO mapreduce.ExportJobBase: Transferred 764 bytes in 31.064 seconds (24.5944 bytes/sec)
20/06/29 17:29:56 INFO mapreduce.ExportJobBase: Exported 2 records.
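Because --export-dir must point at the directory holding the data files, the export is run once per partition directory. With many partitions, a shell loop over the partition directories is one way to avoid repeating the command by hand (a minimal sketch, not tested against this cluster):

#!/bin/bash
# Export every classroom=* partition of the Hive student table to MySQL.
for dir in $(hdfs dfs -ls /user/hive/warehouse/student/ | awk '/classroom=/{print $NF}'); do
  ./sqoop export \
    --connect "jdbc:mysql://192.168.230.21:3306/mysql" \
    --username root --password 123456 \
    --table student \
    --fields-terminated-by "," \
    --export-dir "$dir"
done

As the Sqoop log itself warns, passing --password on the command line is insecure; -P (interactive prompt) or --password-file are the usual alternatives.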

5. Check the exported data in the MySQL student table

mysql> select * from student;
+------+----------+
| id   | name     |
+------+----------+
| 001  | xiaohong |
| 002  | xiaolan  |
| 001  | xiaohong |
| 002  | xiaolan  |
+------+----------+
4 rows in set (0.00 sec)

mysql> 
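Two things are worth noting about this result. First, the repeated (001, xiaohong) and (002, xiaolan) rows are expected: the partition column classroom lives only in the directory names, not in the data files, so it is not exported, and the MySQL table has no key to distinguish or reject the rows. Second, re-running either export would insert every row again; if the target table had a suitable primary or unique key, Sqoop's update mode could make the export idempotent (a sketch only; with this sample data a key on id alone would not actually work, since the same id appears in two classrooms):

./sqoop export --connect "jdbc:mysql://192.168.230.21:3306/mysql" \
  --username root --password 123456 \
  --table student \
  --fields-terminated-by "," \
  --update-key id --update-mode allowinsert \
  --export-dir "/user/hive/warehouse/student/classroom=002"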
