2022-12-06 04:00:22,503 ERROR tool.ExportTool: Encountered IOException running export job:
java.net.ConnectException: Call From hadoop1/192.168.69.137 to hadoop1:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:755)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1495)
at org.apache.hadoop.ipc.Client.call(Client.java:1437)
at org.apache.hadoop.ipc.Client.call(Client.java:1347)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:883)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1652)
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1569)
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1566)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1581)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:136)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:113)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:151)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
at org.apache.sqoop.mapreduce.ExportJobBase.doSubmitJob(ExportJobBase.java:324)
at org.apache.sqoop.mapreduce.ExportJobBase.runJob(ExportJobBase.java:301)
at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:442)
at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:685)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:788)
at org.apache.hadoop.ipc.Client$Connection.access$3500(Client.java:409)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1552)
at org.apache.hadoop.ipc.Client.call(Client.java:1383)
... 42 more
The "Connection refused" on hadoop1:9000 means the Hadoop daemons (the NameNode in particular) were not running; restart them:
start-all.sh
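Before (or after) restarting, a quick way to confirm the symptom is to probe the RPC port directly. This is only a sketch in plain bash, assuming the hadoop1:9000 address from the error message above:

```shell
#!/usr/bin/env bash
# Probe a TCP port via bash's /dev/tcp redirection; returns success only if
# a connection can be opened. Sketch only -- hadoop1:9000 is the NameNode
# address taken from the ConnectException above.
port_open() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

if port_open hadoop1 9000; then
  echo "NameNode RPC port is reachable"
else
  echo "refused/unreachable: start the daemons (start-all.sh) and check with jps"
fi
```

If the port stays closed after start-all.sh, `jps` on hadoop1 will show whether the NameNode process actually came up.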
2022-12-06 04:13:58,377 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/root/.staging/job_1670317331429_0001
2022-12-06 04:13:58,402 ERROR tool.ExportTool: Encountered IOException running export job:
org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://hadoop1:9000/app/text.txt
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:330)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:272)
at org.apache.sqoop.mapreduce.ExportInputFormat.getJobSize(ExportInputFormat.java:51)
at org.apache.sqoop.mapreduce.ExportInputFormat.getSplits(ExportInputFormat.java:64)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:313)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:330)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:203)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
at org.apache.sqoop.mapreduce.ExportJobBase.doSubmitJob(ExportJobBase.java:324)
at org.apache.sqoop.mapreduce.ExportJobBase.runJob(ExportJobBase.java:301)
at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:442)
at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
[root@hadoop1 sqoop]#
This error means the text.txt file that was created cannot be found; recreating the file under the sqoop directory and re-uploading it fixes it.
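One way to avoid the InvalidInputException is to make sure the file exists at the exact HDFS path that --export-dir names before re-running the export. A hedged sketch (the /app/text.txt path is taken from the error above; the guard keeps it a no-op on machines without the hadoop CLI):

```shell
# Re-upload the input file to the HDFS path the export job will read.
# /app/text.txt is the path from the InvalidInputException above.
export_dir=/app/text.txt
if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -mkdir -p "$(dirname "$export_dir")"   # ensure the parent dir exists
  hadoop fs -put -f text.txt "$export_dir"          # -f overwrites a stale copy
  hadoop fs -ls "$export_dir"                       # verify before re-running sqoop export
else
  echo "hadoop CLI not on PATH; run this on a cluster node"
fi
```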
[root@hadoop1 sqoop]#
[root@hadoop1 sqoop]# hadoop fs -put text.txt /sqoop
[root@hadoop1 sqoop]#
[root@hadoop1 sqoop]# sqoop export --connect jdbc:mysql://192.168.69.137:3306/hdfsdb --username root --password 9785wyb --table fruit --export-dir /app/text.txt -m 1
Warning: /app/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /app/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Error: Could not find or load main class org.apache.hadoop.hbase.util.GetJavaProperty
2022-12-06 04:23:21,472 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
2022-12-06 04:23:21,520 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
2022-12-06 04:23:21,650 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
2022-12-06 04:23:21,654 INFO tool.CodeGenTool: Beginning code generation
Tue Dec 06 04:23:21 EST 2022 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2022-12-06 04:23:22,452 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `fruit` AS t LIMIT 1
2022-12-06 04:23:22,480 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `fruit` AS t LIMIT 1
2022-12-06 04:23:22,493 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /app/hadoop
Note: /tmp/sqoop-root/compile/c8bb01707712bfaac967f07e1be52e28/fruit.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
2022-12-06 04:23:24,405 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/c8bb01707712bfaac967f07e1be52e28/fruit.jar
2022-12-06 04:23:24,469 INFO mapreduce.ExportJobBase: Beginning export of fruit
2022-12-06 04:23:24,470 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2022-12-06 04:23:24,688 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
2022-12-06 04:23:26,417 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
2022-12-06 04:23:26,424 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
2022-12-06 04:23:26,434 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
2022-12-06 04:23:26,688 INFO client.RMProxy: Connecting to ResourceManager at hadoop1/192.168.69.137:8032
2022-12-06 04:23:27,236 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/root/.staging/job_1670317331429_0002
2022-12-06 04:23:29,497 INFO input.FileInputFormat: Total input files to process : 1
2022-12-06 04:23:29,499 INFO input.FileInputFormat: Total input files to process : 1
2022-12-06 04:23:29,712 INFO mapreduce.JobSubmitter: number of splits:1
2022-12-06 04:23:29,776 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
2022-12-06 04:23:29,777 INFO Configuration.deprecation: yarn.resourcemanager.system-metrics-publisher.enabled is deprecated. Instead, use yarn.system-metrics-publisher.enabled
2022-12-06 04:23:29,945 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1670317331429_0002
2022-12-06 04:23:29,949 INFO mapreduce.JobSubmitter: Executing with tokens: []
2022-12-06 04:23:30,260 INFO conf.Configuration: resource-types.xml not found
2022-12-06 04:23:30,260 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2022-12-06 04:23:30,918 INFO impl.YarnClientImpl: Submitted application application_1670317331429_0002
2022-12-06 04:23:31,008 INFO mapreduce.Job: The url to track the job: http://hadoop1:8088/proxy/application_1670317331429_0002/
2022-12-06 04:23:31,009 INFO mapreduce.Job: Running job: job_1670317331429_0002
2022-12-06 04:23:44,286 INFO mapreduce.Job: Job job_1670317331429_0002 running in uber mode : false
2022-12-06 04:23:44,294 INFO mapreduce.Job: map 0% reduce 0%
2022-12-06 04:23:52,616 INFO mapreduce.Job: map 100% reduce 0%
2022-12-06 04:23:52,635 INFO mapreduce.Job: Job job_1670317331429_0002 completed successfully
2022-12-06 04:23:52,757 INFO mapreduce.Job: Counters: 32
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=220555
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=180
HDFS: Number of bytes written=0
HDFS: Number of read operations=4
HDFS: Number of large read operations=0
HDFS: Number of write operations=0
Job Counters
Launched map tasks=1
Rack-local map tasks=1
Total time spent by all maps in occupied slots (ms)=5100
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=5100
Total vcore-milliseconds taken by all map tasks=5100
Total megabyte-milliseconds taken by all map tasks=5222400
Map-Reduce Framework
Map input records=3
Map output records=3
Input split bytes=113
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=208
CPU time spent (ms)=2930
Physical memory (bytes) snapshot=239378432
Virtual memory (bytes) snapshot=2802765824
Total committed heap usage (bytes)=170393600
Peak Map Physical memory (bytes)=239587328
Peak Map Virtual memory (bytes)=2802765824
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=0
2022-12-06 04:23:52,767 INFO mapreduce.ExportJobBase: Transferred 180 bytes in 26.2651 seconds (6.8532 bytes/sec)
2022-12-06 04:23:52,771 INFO mapreduce.ExportJobBase: Exported 3 records.
The export completed successfully.
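The counters are consistent: 3 map input records in, 3 records exported, and the reported rate matches 180 bytes over 26.2651 seconds. A quick arithmetic cross-check of the log's bytes/sec figure:

```shell
# Cross-check the transfer rate the log reports: 180 bytes / 26.2651 s ~= 6.8532.
awk 'BEGIN { printf "%.4f bytes/sec\n", 180 / 26.2651 }'
```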