8. Hadoop 3 startup procedure and observed behavior

1. On hadoop102, run sbin/start-dfs.sh (in this run most HDFS daemons were already up, hence the "Stop it first" messages):

[atguigu@hadoop102 ~]$ cd /opt/module/hadoop-3.1.3
[atguigu@hadoop102 hadoop-3.1.3]$
[atguigu@hadoop102 hadoop-3.1.3]$
[atguigu@hadoop102 hadoop-3.1.3]$ sbin/start-dfs.sh
Starting namenodes on [hadoop102]
hadoop102: namenode is running as process 2013.  Stop it first.
Starting datanodes
hadoop102: datanode is running as process 2086.  Stop it first.
hadoop104: WARNING: /opt/module/hadoop-3.1.3/logs does not exist. Creating.
hadoop103: datanode is running as process 2020.  Stop it first.
hadoop104: datanode is running as process 2162.  Stop it first.
Starting secondary namenodes [hadoop104]

2. On hadoop103 (the ResourceManager node), run sbin/start-yarn.sh:

[atguigu@hadoop103 ~]$ cd /opt/module/hadoop-3.1.3
[atguigu@hadoop103 hadoop-3.1.3]$
[atguigu@hadoop103 hadoop-3.1.3]$
[atguigu@hadoop103 hadoop-3.1.3]$
[atguigu@hadoop103 hadoop-3.1.3]$ sbin/start-yarn.sh

After startup, check each node with jps (the original screenshots for hadoop102/103/104 are not included here). Based on the daemons seen in the logs above, the expected processes are roughly:

hadoop102: NameNode, DataNode, NodeManager

hadoop103: ResourceManager, DataNode, NodeManager

hadoop104: SecondaryNameNode, DataNode, NodeManager

Hadoop web UIs (the NameNode UI on port 9870 and the ResourceManager UI on port 8088):

http://hadoop102:9870/explorer.html#/

http://hadoop103:8088/cluster/nodes

Verify that MapReduce works by running the bundled wordcount example (this assumes /wcinput already exists in HDFS):

[atguigu@hadoop102 hadoop-3.1.3]$ pwd
/opt/module/hadoop-3.1.3
[atguigu@hadoop102 hadoop-3.1.3]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.1.3.jar wordcount  /wcinput  /wcoutput

Failure log (each map container exceeds its virtual-memory limit and is killed):

[atguigu@hadoop102 hadoop-3.1.3]$ pwd
/opt/module/hadoop-3.1.3
[atguigu@hadoop102 hadoop-3.1.3]$
[atguigu@hadoop102 hadoop-3.1.3]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.1.3.jar wordcount  /wcinput  /wcoutput
2024-08-22 15:47:02,220 INFO client.RMProxy: Connecting to ResourceManager at hadoop103/192.168.188.103:8032
2024-08-22 15:47:04,195 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/atguigu/.staging/job_1724310448291_0001
2024-08-22 15:47:04,613 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 15:47:06,077 INFO input.FileInputFormat: Total input files to process : 1
2024-08-22 15:47:06,141 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 15:47:06,214 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 15:47:06,262 INFO mapreduce.JobSubmitter: number of splits:1
2024-08-22 15:47:06,682 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 15:47:06,867 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1724310448291_0001
2024-08-22 15:47:06,867 INFO mapreduce.JobSubmitter: Executing with tokens: []
2024-08-22 15:47:07,792 INFO conf.Configuration: resource-types.xml not found
2024-08-22 15:47:07,792 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2024-08-22 15:47:08,302 INFO impl.YarnClientImpl: Submitted application application_1724310448291_0001
2024-08-22 15:47:08,619 INFO mapreduce.Job: The url to track the job: http://hadoop103:8088/proxy/application_1724310448291_0001/
2024-08-22 15:47:08,620 INFO mapreduce.Job: Running job: job_1724310448291_0001
2024-08-22 15:47:26,919 INFO mapreduce.Job: Job job_1724310448291_0001 running in uber mode : false
2024-08-22 15:47:26,920 INFO mapreduce.Job:  map 0% reduce 0%
2024-08-22 15:47:39,123 INFO mapreduce.Job: Task Id : attempt_1724310448291_0001_m_000000_0, Status : FAILED
[2024-08-22 15:47:36.966]Container [pid=3230,containerID=container_1724310448291_0001_01_000002] is running 301898240B beyond the 'VIRTUAL' memory limit. Current usage: 177.9 MB of 1 GB physical memory used; 2.4 GB of 2.1 GB virtual memory used. Killing container.
Dump of the process-tree for container_1724310448291_0001_01_000002 :
        |- PID PPID PGRPID SESSID CMD_NAME USER_MODE_TIME(MILLIS) SYSTEM_TIME(MILLIS) VMEM_USAGE(BYTES) RSSMEM_USAGE(PAGES) FULL_CMD_LINE
        |- 3230 3229 3230 3230 (bash) 0 0 9744384 288 /bin/bash -c /opt/module/jdk1.8.0_212/bin/java -Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN   -Xmx820m -Djava.io.tmpdir=/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1724310448291_0001/container_1724310448291_0001_01_000002/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000002 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog org.apache.hadoop.mapred.YarnChild 192.168.188.103 41084 attempt_1724310448291_0001_m_000000_0 2 1>/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000002/stdout 2>/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000002/stderr
        |- 3240 3230 3230 3230 (java) 389 399 2547011584 45262 /opt/module/jdk1.8.0_212/bin/java -Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN -Xmx820m -Djava.io.tmpdir=/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1724310448291_0001/container_1724310448291_0001_01_000002/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000002 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog org.apache.hadoop.mapred.YarnChild 192.168.188.103 41084 attempt_1724310448291_0001_m_000000_0 2

[2024-08-22 15:47:37.039]Container killed on request. Exit code is 143
[2024-08-22 15:47:37.046]Container exited with a non-zero exit code 143.

2024-08-22 15:47:47,280 INFO mapreduce.Job: Task Id : attempt_1724310448291_0001_m_000000_1, Status : FAILED
[2024-08-22 15:47:46.025]Container [pid=3284,containerID=container_1724310448291_0001_01_000003] is running 302950912B beyond the 'VIRTUAL' memory limit. Current usage: 287.0 MB of 1 GB physical memory used; 2.4 GB of 2.1 GB virtual memory used. Killing container.
Dump of the process-tree for container_1724310448291_0001_01_000003 :
        |- PID PPID PGRPID SESSID CMD_NAME USER_MODE_TIME(MILLIS) SYSTEM_TIME(MILLIS) VMEM_USAGE(BYTES) RSSMEM_USAGE(PAGES) FULL_CMD_LINE
        |- 3284 3283 3284 3284 (bash) 0 0 9744384 288 /bin/bash -c /opt/module/jdk1.8.0_212/bin/java -Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN   -Xmx820m -Djava.io.tmpdir=/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1724310448291_0001/container_1724310448291_0001_01_000003/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000003 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog org.apache.hadoop.mapred.YarnChild 192.168.188.103 41084 attempt_1724310448291_0001_m_000000_1 3 1>/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000003/stdout 2>/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000003/stderr
        |- 3294 3284 3284 3284 (java) 407 354 2548064256 73174 /opt/module/jdk1.8.0_212/bin/java -Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN -Xmx820m -Djava.io.tmpdir=/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1724310448291_0001/container_1724310448291_0001_01_000003/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000003 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog org.apache.hadoop.mapred.YarnChild 192.168.188.103 41084 attempt_1724310448291_0001_m_000000_1 3

[2024-08-22 15:47:46.145]Container killed on request. Exit code is 143
[2024-08-22 15:47:46.164]Container exited with a non-zero exit code 143.

2024-08-22 15:47:59,551 INFO mapreduce.Job: Task Id : attempt_1724310448291_0001_m_000000_2, Status : FAILED
[2024-08-22 15:47:58.499]Container [pid=3523,containerID=container_1724310448291_0001_01_000004] is running 291801600B beyond the 'VIRTUAL' memory limit. Current usage: 108.8 MB of 1 GB physical memory used; 2.4 GB of 2.1 GB virtual memory used. Killing container.
Dump of the process-tree for container_1724310448291_0001_01_000004 :
        |- PID PPID PGRPID SESSID CMD_NAME USER_MODE_TIME(MILLIS) SYSTEM_TIME(MILLIS) VMEM_USAGE(BYTES) RSSMEM_USAGE(PAGES) FULL_CMD_LINE
        |- 3523 3521 3523 3523 (bash) 0 2 9744384 289 /bin/bash -c /opt/module/jdk1.8.0_212/bin/java -Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN   -Xmx820m -Djava.io.tmpdir=/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1724310448291_0001/container_1724310448291_0001_01_000004/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000004 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog org.apache.hadoop.mapred.YarnChild 192.168.188.103 41084 attempt_1724310448291_0001_m_000000_2 4 1>/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000004/stdout 2>/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000004/stderr
        |- 3532 3523 3523 3523 (java) 188 405 2536914944 27574 /opt/module/jdk1.8.0_212/bin/java -Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN -Xmx820m -Djava.io.tmpdir=/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1724310448291_0001/container_1724310448291_0001_01_000004/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/opt/module/hadoop-3.1.3/logs/userlogs/application_1724310448291_0001/container_1724310448291_0001_01_000004 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog org.apache.hadoop.mapred.YarnChild 192.168.188.103 41084 attempt_1724310448291_0001_m_000000_2 4

[2024-08-22 15:47:58.597]Container killed on request. Exit code is 143
[2024-08-22 15:47:58.636]Container exited with a non-zero exit code 143.

2024-08-22 15:48:06,700 INFO mapreduce.Job:  map 100% reduce 100%
2024-08-22 15:48:07,740 INFO mapreduce.Job: Job job_1724310448291_0001 failed with state FAILED due to: Task failed task_1724310448291_0001_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0 killedMaps:0 killedReduces: 0

2024-08-22 15:48:07,837 INFO mapreduce.Job: Counters: 13
        Job Counters
                Failed map tasks=4
                Killed reduce tasks=1
                Launched map tasks=4
                Other local map tasks=3
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=29799
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=29799
                Total vcore-milliseconds taken by all map tasks=29799
                Total megabyte-milliseconds taken by all map tasks=30514176
        Map-Reduce Framework
                CPU time spent (ms)=0
                Physical memory (bytes) snapshot=0
                Virtual memory (bytes) snapshot=0
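The repeated "Exit code is 143" in the log above is not arbitrary: by the usual Unix convention it is 128 + 15, i.e. the container's JVM was terminated with signal 15 (SIGTERM) when the NodeManager killed it. A minimal shell demonstration of that convention (using `sleep` as a stand-in for the container process):

```shell
# Exit code 143 = 128 + 15: the process was terminated by SIGTERM,
# which is what the NodeManager sends when it kills a container.
sleep 30 &             # stand-in for the container's JVM
pid=$!
kill -TERM "$pid"      # what "Container killed on request" amounts to
wait "$pid"            # wait reports the child's exit status
echo "exit status: $?"
```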

Workaround: simply re-running the job succeeded (the root cause was not understood at the time). Reading the log, the failure is YARN's virtual-memory check: each map container was capped at 2.1 GB of virtual memory (1 GB of physical memory times the default vmem-to-pmem ratio of 2.1), but the JVM used about 2.4 GB, so the NodeManager killed the container; after four failed attempts (Failed map tasks=4) the task, and with it the job, was marked FAILED.
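A more deliberate fix, which I did not apply in the run above and offer only as a sketch, is to relax the virtual-memory check in yarn-site.xml on every NodeManager (either disable it or raise the ratio), then restart YARN:

```xml
<!-- yarn-site.xml (hypothetical snippet; apply one of the two and restart YARN) -->
<property>
  <!-- Option 1: disable the virtual-memory check entirely -->
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>
<property>
  <!-- Option 2: keep the check but raise the vmem-to-pmem ratio (default 2.1) -->
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <value>4</value>
</property>
```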

Log of the successful run (screenshots omitted):

[atguigu@hadoop102 hadoop-3.1.3]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.1.3.jar wordcount  /wcinput  /wcoutput
2024-08-22 15:50:03,430 INFO client.RMProxy: Connecting to ResourceManager at hadoop103/192.168.188.103:8032
2024-08-22 15:50:04,159 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/atguigu/.staging/job_1724310448291_0002
2024-08-22 15:50:04,289 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 15:50:04,542 INFO input.FileInputFormat: Total input files to process : 1
2024-08-22 15:50:04,582 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 15:50:04,645 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 15:50:04,691 INFO mapreduce.JobSubmitter: number of splits:1
2024-08-22 15:50:04,849 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2024-08-22 15:50:04,917 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1724310448291_0002
2024-08-22 15:50:04,917 INFO mapreduce.JobSubmitter: Executing with tokens: []
2024-08-22 15:50:05,134 INFO conf.Configuration: resource-types.xml not found
2024-08-22 15:50:05,135 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2024-08-22 15:50:05,214 INFO impl.YarnClientImpl: Submitted application application_1724310448291_0002
2024-08-22 15:50:05,285 INFO mapreduce.Job: The url to track the job: http://hadoop103:8088/proxy/application_1724310448291_0002/
2024-08-22 15:50:05,287 INFO mapreduce.Job: Running job: job_1724310448291_0002
2024-08-22 15:50:19,560 INFO mapreduce.Job: Job job_1724310448291_0002 running in uber mode : false
2024-08-22 15:50:19,561 INFO mapreduce.Job:  map 0% reduce 0%
2024-08-22 15:50:24,701 INFO mapreduce.Job:  map 100% reduce 0%
2024-08-22 15:50:30,754 INFO mapreduce.Job:  map 100% reduce 100%
2024-08-22 15:50:30,773 INFO mapreduce.Job: Job job_1724310448291_0002 completed successfully
2024-08-22 15:50:30,897 INFO mapreduce.Job: Counters: 53
        File System Counters
                FILE: Number of bytes read=64
                FILE: Number of bytes written=435581
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=139
                HDFS: Number of bytes written=38
                HDFS: Number of read operations=8
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Launched map tasks=1
                Launched reduce tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=3228
                Total time spent by all reduces in occupied slots (ms)=3229
                Total time spent by all map tasks (ms)=3228
                Total time spent by all reduce tasks (ms)=3229
                Total vcore-milliseconds taken by all map tasks=3228
                Total vcore-milliseconds taken by all reduce tasks=3229
                Total megabyte-milliseconds taken by all map tasks=3305472
                Total megabyte-milliseconds taken by all reduce tasks=3306496
        Map-Reduce Framework
                Map input records=6
                Map output records=7
                Map output bytes=63
                Map output materialized bytes=64
                Input split bytes=103
                Combine input records=7
                Combine output records=5
                Reduce input groups=5
                Reduce shuffle bytes=64
                Reduce input records=5
                Reduce output records=5
                Spilled Records=10
                Shuffled Maps =1
                Failed Shuffles=0
                Merged Map outputs=1
                GC time elapsed (ms)=141
                CPU time spent (ms)=1470
                Physical memory (bytes) snapshot=532729856
                Virtual memory (bytes) snapshot=5131214848
                Total committed heap usage (bytes)=454033408
                Peak Map Physical memory (bytes)=285954048
                Peak Map Virtual memory (bytes)=2557808640
                Peak Reduce Physical memory (bytes)=246775808
                Peak Reduce Virtual memory (bytes)=2573406208
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=36
        File Output Format Counters
                Bytes Written=38
[atguigu@hadoop102 hadoop-3.1.3]$

The output was generated under /wcoutput (screenshot omitted).
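For reference, the wordcount logic itself is just a group-by-word count: the map step splits lines into words, and the reduce step sums the counts per word. The same computation can be sketched with coreutils on a made-up input (the real contents of /wcinput are not shown in this post):

```shell
# Hypothetical input standing in for /wcinput; the real file is not shown above.
printf 'hadoop yarn\nhadoop mapreduce\natguigu atguigu\n' \
  | tr ' ' '\n' \
  | sort \
  | uniq -c          # count occurrences of each word, like the reduce step
```

This prints each distinct word with its count (e.g. a line containing "2 hadoop"), mirroring the word-to-count pairs the job wrote to /wcoutput.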

The underlying principles are still not entirely clear to me.
