I clearly created a new MySQL database, but when I tried to insert data into a table in that database, MySQL complained that the database could not be found.

In the listing below, fluxdb is the database I just created:

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| fluxdb             |
| hive               |
| mysql              |
| performance_schema |
| test               |
+--------------------+
mysql> use fluxdb;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> show tables;
+------------------+
| Tables_in_fluxdb |
+------------------+
| tongji1          |
+------------------+
1 row in set (0.00 sec)

A table has already been created inside it.
However, when I used Sqoop to export data from a Hive table into this table, it reported that the database could not be recognized:

[root@hadoop02 bin]# sh sqoop export --connect jdbc:mysql://192.168.43.166:3306/fluxdb --username root --password root --export-dir 'hdfs://ns/user/hive/warehouse/fluxdb.db/tongji1' --table tongji1 -m 1 --fields-terminated-by '/t';
20/03/26 19:45:20 ERROR manager.SqlManager: Error executing statement: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown database 'fluxdb'
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown database 'fluxdb'

Cause:
sh sqoop export --connect jdbc:mysql://192.168.43.166:3306/fluxdb --username root --password root …
fails because the root account, when connecting from 192.168.43.166, has no privileges on the fluxdb database. MySQL surfaces this as "Unknown database 'fluxdb'".
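To confirm this, you can check which host patterns the root account may connect from, and what each entry is granted. The statements below are a suggested diagnostic, run in a mysql session on the database server itself (the host value in SHOW GRANTS must match a row returned by the first query):

```sql
-- Which (user, host) account entries exist? MySQL treats each row as a
-- separate account: 'root'@'localhost' and 'root'@'%' are NOT the same.
SELECT user, host FROM mysql.user WHERE user = 'root';

-- What is a given entry actually granted?
SHOW GRANTS FOR 'root'@'localhost';
```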

Solution:
Connect via a hostname or IP from which the root account is actually allowed to connect. Here Sqoop runs on the same machine as MySQL, so the loopback address works (presumably matching the 'root'@'localhost' account, which does have the privileges):
sh sqoop export --connect jdbc:mysql://127.0.0.1:3306/fluxdb --username root --password root …
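Alternatively, instead of switching the JDBC URL to 127.0.0.1, you can grant the account access from the original address so the first command works as-is. A sketch in MySQL 5.x syntax, assuming the password really is root:

```sql
-- Allow root connecting from 192.168.43.166 full rights on fluxdb.
GRANT ALL PRIVILEGES ON fluxdb.* TO 'root'@'192.168.43.166' IDENTIFIED BY 'root';
-- Or open it to any host (convenient on a test cluster, risky in production):
-- GRANT ALL PRIVILEGES ON fluxdb.* TO 'root'@'%' IDENTIFIED BY 'root';
FLUSH PRIVILEGES;
```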

OK! Sqoop successfully exported the data to MySQL:

[root@hadoop02 bin]# sh sqoop export --connect jdbc:mysql://127.0.0.1:3306/fluxdb --username root --password root --export-dir 'hdfs://ns/user/hive/warehouse/fluxdb.db/tongji1' --table tongji1 -m 1 --fields-terminated-by '|';
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
20/03/26 19:52:00 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
20/03/26 19:52:01 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
20/03/26 19:52:01 INFO tool.CodeGenTool: Beginning code generation
20/03/26 19:52:02 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tongji1` AS t LIMIT 1
20/03/26 19:52:02 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tongji1` AS t LIMIT 1
20/03/26 19:52:02 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/01/software/hadoop-2.7.1
Note: /tmp/sqoop-root/compile/0422cfac89eea2a543cefbfe0ec36e20/tongji1.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
20/03/26 19:52:06 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/0422cfac89eea2a543cefbfe0ec36e20/tongji1.jar
20/03/26 19:52:06 INFO mapreduce.ExportJobBase: Beginning export of tongji1
20/03/26 19:52:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/03/26 19:52:07 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
20/03/26 19:52:11 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
20/03/26 19:52:11 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
20/03/26 19:52:11 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
20/03/26 19:52:12 INFO client.ConfiguredRMFailoverProxyProvider: Failing over to rm2
20/03/26 19:52:19 INFO input.FileInputFormat: Total input paths to process : 1
20/03/26 19:52:19 INFO input.FileInputFormat: Total input paths to process : 1
20/03/26 19:52:20 INFO mapreduce.JobSubmitter: number of splits:1
20/03/26 19:52:22 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1584430211185_0041
20/03/26 19:52:26 INFO impl.YarnClientImpl: Submitted application application_1584430211185_0041
20/03/26 19:52:26 INFO mapreduce.Job: The url to track the job: http://hadoop02:8088/proxy/application_1584430211185_0041/
20/03/26 19:52:26 INFO mapreduce.Job: Running job: job_1584430211185_0041
20/03/26 19:52:58 INFO mapreduce.Job: Job job_1584430211185_0041 running in uber mode : false
20/03/26 19:52:58 INFO mapreduce.Job:  map 0% reduce 0%
20/03/26 19:53:15 INFO mapreduce.Job:  map 100% reduce 0%
20/03/26 19:53:17 INFO mapreduce.Job: Job job_1584430211185_0041 completed successfully
20/03/26 19:53:17 INFO mapreduce.Job: Counters: 30
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=126188
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=178
		HDFS: Number of bytes written=0
		HDFS: Number of read operations=4
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=0
	Job Counters 
		Launched map tasks=1
		Rack-local map tasks=1
		Total time spent by all maps in occupied slots (ms)=14301
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=14301
		Total vcore-seconds taken by all map tasks=14301
		Total megabyte-seconds taken by all map tasks=14644224
	Map-Reduce Framework
		Map input records=1
		Map output records=1
		Input split bytes=137
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=88
		CPU time spent (ms)=1040
		Physical memory (bytes) snapshot=98652160
		Virtual memory (bytes) snapshot=2061893632
		Total committed heap usage (bytes)=17846272
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=0
20/03/26 19:53:17 INFO mapreduce.ExportJobBase: Transferred 178 bytes in 66.3869 seconds (2.6813 bytes/sec)
20/03/26 19:53:18 INFO mapreduce.ExportJobBase: Exported 1 records.

Check the data:

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| fluxdb             |
| hive               |
| mysql              |
| performance_schema |
| test               |
+--------------------+
6 rows in set (0.10 sec)

mysql> use fluxdb
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> show tables;
+------------------+
| Tables_in_fluxdb |
+------------------+
| tongji1          |
+------------------+
1 row in set (0.00 sec)

mysql> select * from tongji1;
+------------+------+------+------+------+-------+---------+---------+---------+
| reportTime | pv   | uv   | vv   | br   | newip | newcust | avgtime | avgdeep |
+------------+------+------+------+------+-------+---------+---------+---------+
| 2020-03-25 |   13 |    1 |    1 |    0 |     1 |       1 |   41613 |       2 |
| 2020-03-25 |   13 |    1 |    1 |    0 |     1 |       1 |   41613 |       2 |
+------------+------+------+------+------+-------+---------+---------+---------+
2 rows in set (0.00 sec)
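Why did 127.0.0.1 work when 192.168.43.166 did not? MySQL authenticates the pair (user, connecting host), so 'root'@'localhost' and 'root'@'192.168.43.166' are independent accounts with independent privileges. A toy Python sketch of that matching logic (purely illustrative, not MySQL's actual code; the grant table below is hypothetical):

```python
from fnmatch import fnmatchcase

# Hypothetical grant table: root may only log in from the local machine.
grant_table = [
    ("root", "localhost"),
    ("root", "127.0.0.1"),
]

def can_connect(user, host):
    """Return True if some grant-table row matches user@host.
    MySQL uses % as its wildcard; translate it to fnmatch's *."""
    return any(
        u == user and fnmatchcase(host, h.replace("%", "*"))
        for (u, h) in grant_table
    )

print(can_connect("root", "127.0.0.1"))       # True  -> connection accepted
print(can_connect("root", "192.168.43.166"))  # False -> rejected, no privileges
```

Adding a ('root', '192.168.43.166') or ('root', '%') row to the real grant tables (via GRANT) is exactly what makes the original JDBC URL usable.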