1. The error
The wordcount example does not produce a result; on the HDFS web page the output path only shows a temp directory.
Error log:
[atguigu@hadoop102 hadoop-3.1.3]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.1.3.jar wordcount /wcinput /wcoutput
2024-08-22 11:27:34,543 INFO client.RMProxy: Connecting to ResourceManager at hadoop103/192.168.188.103:8032
2024-08-22 11:27:35,312 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/atguigu/.staging/job_1724293494980_0002
2024-08-22 11:27:35,446 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 11:27:35,685 INFO input.FileInputFormat: Total input files to process : 1
2024-08-22 11:27:35,726 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 11:27:35,798 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 11:27:35,860 INFO mapreduce.JobSubmitter: number of splits:1
2024-08-22 11:27:36,025 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 11:27:36,168 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1724293494980_0002
2024-08-22 11:27:36,168 INFO mapreduce.JobSubmitter: Executing with tokens: []
2024-08-22 11:27:36,607 INFO conf.Configuration: resource-types.xml not found
2024-08-22 11:27:36,607 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2024-08-22 11:27:36,838 INFO impl.YarnClientImpl: Submitted application application_1724293494980_0002
2024-08-22 11:27:37,029 INFO mapreduce.Job: The url to track the job: http://hadoop103:8088/proxy/application_1724293494980_0002/
2024-08-22 11:27:37,029 INFO mapreduce.Job: Running job: job_1724293494980_0002
2024-08-22 11:27:43,238 INFO mapreduce.Job: Job job_1724293494980_0002 running in uber mode : false
2024-08-22 11:27:43,240 INFO mapreduce.Job: map 0% reduce 0%
2024-08-22 11:27:43,273 INFO mapreduce.Job: Job job_1724293494980_0002 failed with state FAILED due to: Application application_1724293494980_0002 failed 2 times due to AM Container for appattempt_1724293494980_0002_000002 exited with exitCode: 1
Failing this attempt.Diagnostics: [2024-08-22 11:27:43.061]Exception from container-launch.
Container id: container_1724293494980_0002_02_000001
Exit code: 1
[2024-08-22 11:27:43.064]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
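The key line is "Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster": the YARN container that is supposed to run the MapReduce ApplicationMaster cannot find the MapReduce classes, which on Hadoop 3.x usually means HADOOP_MAPRED_HOME (or mapreduce.application.classpath) is not configured in mapred-site.xml. A quick way to check the current state on the submitting node (a sketch; the grep is just one way to look for the setting):

# Print the classpath the Hadoop client resolves; the share/hadoop/mapreduce jars should appear in it.
hadoop classpath
# Check whether mapred-site.xml already defines HADOOP_MAPRED_HOME for the AM/map/reduce environments.
grep -n "HADOOP_MAPRED_HOME" /opt/module/hadoop-3.1.3/etc/hadoop/mapred-site.xml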
2. The fix
The MapReduce containers need to know where the MapReduce framework is installed. Add the following properties to /opt/module/hadoop-3.1.3/etc/hadoop/mapred-site.xml (the change then has to be distributed to the other nodes; see below):
<property>
    <name>yarn.app.mapreduce.am.env</name>
    <value>HADOOP_MAPRED_HOME=/opt/module/hadoop-3.1.3</value>
</property>
<property>
    <name>mapreduce.map.env</name>
    <value>HADOOP_MAPRED_HOME=/opt/module/hadoop-3.1.3</value>
</property>
<property>
    <name>mapreduce.reduce.env</name>
    <value>HADOOP_MAPRED_HOME=/opt/module/hadoop-3.1.3</value>
</property>
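For reference, the Hadoop documentation describes an equivalent alternative: setting mapreduce.application.classpath to the MapReduce jar directories instead of the three environment properties above. The value can be derived from the classpath the client prints (shown here only as background, not as a step that was applied):

# Prints the resolved classpath; its output can serve as the value of
# mapreduce.application.classpath if the HADOOP_MAPRED_HOME approach is not used.
hadoop classpath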
After editing, distribute the file to every node in the cluster:
[atguigu@hadoop102 hadoop-3.1.3]$ ./bin/xsync /opt/module/hadoop-3.1.3/etc/hadoop/mapred-site.xml
Then confirm on hadoop103 and hadoop104 that the new properties are present:
cat /opt/module/hadoop-3.1.3/etc/hadoop/mapred-site.xml
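The same check can also be run from hadoop102 in one pass, assuming passwordless SSH between the nodes as set up for this cluster (a sketch, not one of the original steps):

# Each node should report 3 matches after the fix (the AM, map, and reduce env properties).
ssh hadoop103 "grep -c HADOOP_MAPRED_HOME /opt/module/hadoop-3.1.3/etc/hadoop/mapred-site.xml"
ssh hadoop104 "grep -c HADOOP_MAPRED_HOME /opt/module/hadoop-3.1.3/etc/hadoop/mapred-site.xml"

With the configuration distributed, resubmit the job: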
[atguigu@hadoop102 hadoop-3.1.3]$ pwd
/opt/module/hadoop-3.1.3
[atguigu@hadoop102 hadoop-3.1.3]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.1.3.jar wordcount /wcinput /wcoutput
3. Stop the cluster
sbin/stop-yarn.sh
sbin/stop-dfs.sh
4. Restart the cluster
Start HDFS:
[atguigu@hadoop104 hadoop-3.1.3]$ pwd
/opt/module/hadoop-3.1.3
[atguigu@hadoop104 hadoop-3.1.3]$ sbin/start-dfs.sh
Start YARN on hadoop103 (the ResourceManager node):
cd /opt/module/hadoop-3.1.3
sbin/start-yarn.sh
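Before resubmitting, it is worth confirming the daemons came back up on every node. A sketch using jps (the ResourceManager on hadoop103 follows from the log above; the rest of the layout is assumed from this cluster's setup):

# Every node should show a DataNode and a NodeManager; hadoop102 should also show the NameNode
# and hadoop103 the ResourceManager.
for host in hadoop102 hadoop103 hadoop104; do
    echo "==== $host ===="
    ssh $host jps
done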
5. Delete the old output directory and rerun
The /wcoutput directory left over from the earlier attempt has to be removed before resubmitting; otherwise the job fails with FileAlreadyExistsException, as the first submission in the full log below shows.
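One way to remove it, a recursive delete of the output directory in HDFS:

hadoop fs -rm -r /wcoutput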
Key lines from the successful rerun:
2024-08-22 12:10:09,389 INFO mapreduce.Job: map 0% reduce 0%
2024-08-22 12:10:14,515 INFO mapreduce.Job: map 100% reduce 0%
2024-08-22 12:10:21,625 INFO mapreduce.Job: map 100% reduce 100%
Full log (the submission at 12:08 still fails with FileAlreadyExistsException because /wcoutput had not yet been deleted; the 12:09 submission completes successfully):
[atguigu@hadoop102 hadoop-3.1.3]$ pwd
/opt/module/hadoop-3.1.3
[atguigu@hadoop102 hadoop-3.1.3]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.1.3.jar wordcount /wcinput /wcoutput
2024-08-22 12:08:16,968 INFO client.RMProxy: Connecting to ResourceManager at hadoop103/192.168.188.103:8032
org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://hadoop102:8020/wcoutput already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:164)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:277)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:87)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
[atguigu@hadoop102 hadoop-3.1.3]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.1.3.jar wordcount /wcinput /wcoutput
2024-08-22 12:09:52,212 INFO client.RMProxy: Connecting to ResourceManager at hadoop103/192.168.188.103:8032
2024-08-22 12:09:52,990 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/atguigu/.staging/job_1724293494980_0004
2024-08-22 12:09:53,137 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 12:09:53,400 INFO input.FileInputFormat: Total input files to process : 1
2024-08-22 12:09:53,449 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 12:09:53,514 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 12:09:53,562 INFO mapreduce.JobSubmitter: number of splits:1
2024-08-22 12:09:53,727 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 12:09:53,896 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1724293494980_0004
2024-08-22 12:09:53,897 INFO mapreduce.JobSubmitter: Executing with tokens: []
2024-08-22 12:09:54,544 INFO conf.Configuration: resource-types.xml not found
2024-08-22 12:09:54,544 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2024-08-22 12:09:54,838 INFO impl.YarnClientImpl: Submitted application application_1724293494980_0004
2024-08-22 12:09:55,071 INFO mapreduce.Job: The url to track the job: http://hadoop103:8088/proxy/application_1724293494980_0004/
2024-08-22 12:09:55,072 INFO mapreduce.Job: Running job: job_1724293494980_0004
2024-08-22 12:10:09,386 INFO mapreduce.Job: Job job_1724293494980_0004 running in uber mode : false
2024-08-22 12:10:09,389 INFO mapreduce.Job: map 0% reduce 0%
2024-08-22 12:10:14,515 INFO mapreduce.Job: map 100% reduce 0%
2024-08-22 12:10:21,625 INFO mapreduce.Job: map 100% reduce 100%
2024-08-22 12:10:22,668 INFO mapreduce.Job: Job job_1724293494980_0004 completed successfully
2024-08-22 12:10:22,821 INFO mapreduce.Job: Counters: 53
    File System Counters
        FILE: Number of bytes read=64
        FILE: Number of bytes written=435581
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=139
        HDFS: Number of bytes written=38
        HDFS: Number of read operations=8
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Launched reduce tasks=1
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=3381
        Total time spent by all reduces in occupied slots (ms)=4495
        Total time spent by all map tasks (ms)=3381
        Total time spent by all reduce tasks (ms)=4495
        Total vcore-milliseconds taken by all map tasks=3381
        Total vcore-milliseconds taken by all reduce tasks=4495
        Total megabyte-milliseconds taken by all map tasks=3462144
        Total megabyte-milliseconds taken by all reduce tasks=4602880
    Map-Reduce Framework
        Map input records=6
        Map output records=7
        Map output bytes=63
        Map output materialized bytes=64
        Input split bytes=103
        Combine input records=7
        Combine output records=5
        Reduce input groups=5
        Reduce shuffle bytes=64
        Reduce input records=5
        Reduce output records=5
        Spilled Records=10
        Shuffled Maps =1
        Failed Shuffles=0
        Merged Map outputs=1
        GC time elapsed (ms)=130
        CPU time spent (ms)=1630
        Physical memory (bytes) snapshot=556175360
        Virtual memory (bytes) snapshot=5131911168
        Total committed heap usage (bytes)=465567744
        Peak Map Physical memory (bytes)=315908096
        Peak Map Virtual memory (bytes)=2561228800
        Peak Reduce Physical memory (bytes)=240267264
        Peak Reduce Virtual memory (bytes)=2570682368
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters
        Bytes Read=36
    File Output Format Counters
        Bytes Written=38
[atguigu@hadoop102 hadoop-3.1.3]$
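With the job finished, the word counts can be read back from HDFS; part-r-00000 is the default name of the single reducer's output file:

hadoop fs -ls /wcoutput
hadoop fs -cat /wcoutput/part-r-00000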