p34 Log Aggregation Summary

1. Open the YARN web UI: http://hadoop103:8088/cluster/apps

2.

3.

4.

5. http://hadoop102:19888/jobhistory/logs/hadoop103:37930/container_1725590779063_0001_01_000001/job_1725590779063_0001/atguigu

   Clicking this link does not show the corresponding logs.

6. Modify yarn-site.xml under /opt/module/hadoop-3.1.3/etc/hadoop:

<?xml version="1.0"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
<configuration>
    <!-- Route MapReduce over shuffle -->
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <!-- ResourceManager address -->
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>hadoop103</value>
    </property>
    <!-- Environment variables inherited by containers -->
    <property>
        <name>yarn.nodemanager.env-whitelist</name>
        <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
    </property>

    <!-- Enable log aggregation -->
    <property>
        <name>yarn.log-aggregation-enable</name>
        <value>true</value>
    </property>
    <!-- Log server (JobHistoryServer) URL -->
    <property>
        <name>yarn.log.server.url</name>
        <value>http://hadoop102:19888/jobhistory/logs</value>
    </property>
    <!-- Retain aggregated logs for 7 days -->
    <property>
        <name>yarn.log-aggregation.retain-seconds</name>
        <value>604800</value>
    </property>
</configuration>
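The retention value `604800` is simply 7 days expressed in seconds; a quick sanity check of the arithmetic:

```shell
# 7 days * 24 hours * 3600 seconds = the value used for
# yarn.log-aggregation.retain-seconds above
echo $((7 * 24 * 3600))   # prints 604800
```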

[atguigu@hadoop102 hadoop]$ xsync yarn-site.xml

==================== hadoop102 ====================

sending incremental file list

sent 57 bytes  received 12 bytes  138.00 bytes/sec

total size is 1671  speedup is 24.22

==================== hadoop103 ====================

sending incremental file list

yarn-site.xml

sent 1075 bytes  received 43 bytes  2236.00 bytes/sec

total size is 1671  speedup is 1.49

==================== hadoop104 ====================

sending incremental file list

yarn-site.xml

sent 1075 bytes  received 43 bytes  2236.00 bytes/sec

total size is 1671  speedup is 1.49
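The `xsync` command above is a custom rsync-based distribution script whose source is not shown here. A minimal sketch of how such a script typically works (the host list and the `DRY_RUN` flag are assumptions for illustration, not the real script):

```shell
# Sketch of an xsync-style script: copy each argument to the same
# absolute path on every other host in the cluster.
xsync_sketch() {
    local hosts="hadoop103 hadoop104"   # assumed target hosts
    local file host pdir fname
    for file in "$@"; do
        if [ ! -e "$file" ]; then
            echo "$file does not exist!"
            continue
        fi
        pdir=$(cd -P "$(dirname "$file")" && pwd)   # absolute parent dir
        fname=$(basename "$file")
        for host in $hosts; do
            if [ -n "$DRY_RUN" ]; then
                # show what would be copied without touching the network
                echo "rsync -av $pdir/$fname $host:$pdir/"
            else
                ssh "$host" "mkdir -p '$pdir'"
                rsync -av "$pdir/$fname" "$host:$pdir/"
            fi
        done
    done
}

# Dry run: print the copies that distributing a file would perform
DRY_RUN=1 xsync_sketch /etc/hosts
```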

On hadoop102:

[atguigu@hadoop102 hadoop]$ pwd

/opt/module/hadoop-3.1.3/etc/hadoop

mapred --daemon stop historyserver

On hadoop103:

[atguigu@hadoop103 hadoop-3.1.3]$ pwd

/opt/module/hadoop-3.1.3

sbin/stop-yarn.sh

On hadoop102:

[atguigu@hadoop102 hadoop]$ pwd

/opt/module/hadoop-3.1.3/etc/hadoop

mapred --daemon start historyserver

On hadoop103:

[atguigu@hadoop103 hadoop-3.1.3]$ pwd

/opt/module/hadoop-3.1.3

sbin/start-yarn.sh

Then run the job again:

/opt/module/hadoop-3.1.3

[atguigu@hadoop102 hadoop-3.1.3]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.1.3.jar wordcount /input /output

[atguigu@hadoop102 hadoop-3.1.3]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.1.3.jar wordcount /input /output2

Now the job history and aggregated logs are visible. (The second run writes to /output2 because a MapReduce output directory must not already exist.)
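Besides the web UI, aggregated logs can also be fetched from the command line with `yarn logs` once a job finishes. The application id below is illustrative (it matches the run in the appendix); substitute the id printed by your own submission:

```shell
# Build the yarn logs invocation; run it on any cluster node.
# The application id is illustrative -- use your own job's id.
app_id="application_1725593577897_0004"
cmd="yarn logs -applicationId $app_id"
echo "$cmd"
```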

Cluster start/stop script

#!/bin/bash
if [ $# -lt 1 ]
then
    echo "No Args Input..."
    exit
fi
case $1 in
"start")
    echo " =================== Starting Hadoop cluster ==================="
    echo " --------------- starting hdfs ---------------"
    ssh hadoop102 "/opt/module/hadoop-3.1.3/sbin/start-dfs.sh"
    echo " --------------- starting yarn ---------------"
    ssh hadoop103 "/opt/module/hadoop-3.1.3/sbin/start-yarn.sh"
    echo " --------------- starting historyserver ---------------"
    ssh hadoop102 "/opt/module/hadoop-3.1.3/bin/mapred --daemon start historyserver"
;;
"stop")
    echo " =================== Stopping Hadoop cluster ==================="
    echo " --------------- stopping historyserver ---------------"
    ssh hadoop102 "/opt/module/hadoop-3.1.3/bin/mapred --daemon stop historyserver"
    echo " --------------- stopping yarn ---------------"
    ssh hadoop103 "/opt/module/hadoop-3.1.3/sbin/stop-yarn.sh"
    echo " --------------- stopping hdfs ---------------"
    ssh hadoop102 "/opt/module/hadoop-3.1.3/sbin/stop-dfs.sh"
;;
*)
    echo "Input Args Error..."
;;
esac
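To use the script, save it under a name of your choice (`myhadoop.sh` below is an assumption), make it executable, and invoke it with `start` or `stop`. The stub below reproduces only the argument guard, so it is safe to run without a cluster:

```shell
# Write a stub with the same argument guard as the full script, then
# verify its behavior ('start'/'stop' in the real script would ssh
# into the cluster nodes instead of echoing).
cat > myhadoop.sh <<'EOF'
#!/bin/bash
if [ $# -lt 1 ]
then
    echo "No Args Input..."
    exit
fi
echo "would dispatch: $1"
EOF
chmod +x myhadoop.sh
./myhadoop.sh          # prints: No Args Input...
./myhadoop.sh start    # prints: would dispatch: start
```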

Appendix:

Log Type: directory.info

Log Upload Time: Fri Sep 06 11:40:21 +0800 2024

Log Length: 2336

ls -l:
total 20
-rw-r--r--. 1 atguigu atguigu  100 Sep  6 11:39 container_tokens
-rwx------. 1 atguigu atguigu  778 Sep  6 11:39 default_container_executor.sh
-rwx------. 1 atguigu atguigu  723 Sep  6 11:39 default_container_executor_session.sh
lrwxrwxrwx. 1 atguigu atguigu  121 Sep  6 11:39 job.jar -> /opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1725593577897_0004/filecache/11/job.jar
lrwxrwxrwx. 1 atguigu atguigu  121 Sep  6 11:39 job.xml -> /opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1725593577897_0004/filecache/13/job.xml
drwxrwxr-x. 2 atguigu atguigu   48 Sep  6 11:39 jobSubmitDir
-rwx------. 1 atguigu atguigu 5112 Sep  6 11:39 launch_container.sh
drwx--x---. 2 atguigu atguigu    6 Sep  6 11:39 tmp
find -L . -maxdepth 5 -ls:
67679904    4 drwx--x---   4 atguigu  atguigu      4096 Sep  6 11:39 .
  7212    0 drwx--x---   2 atguigu  atguigu         6 Sep  6 11:39 ./tmp
67679905    4 -rw-r--r--   1 atguigu  atguigu       100 Sep  6 11:39 ./container_tokens
67679906    4 -rw-r--r--   1 atguigu  atguigu        12 Sep  6 11:39 ./.container_tokens.crc
67679907    8 -rwx------   1 atguigu  atguigu      5112 Sep  6 11:39 ./launch_container.sh
67679908    4 -rw-r--r--   1 atguigu  atguigu        48 Sep  6 11:39 ./.launch_container.sh.crc
67679909    4 -rwx------   1 atguigu  atguigu       723 Sep  6 11:39 ./default_container_executor_session.sh
67679910    4 -rw-r--r--   1 atguigu  atguigu        16 Sep  6 11:39 ./.default_container_executor_session.sh.crc
67679911    4 -rwx------   1 atguigu  atguigu       778 Sep  6 11:39 ./default_container_executor.sh
67679912    4 -rw-r--r--   1 atguigu  atguigu        16 Sep  6 11:39 ./.default_container_executor.sh.crc
40183862    0 drwx------   2 atguigu  atguigu        21 Sep  6 11:39 ./job.jar
40183863  312 -r-x------   1 atguigu  atguigu    316382 Sep  6 11:39 ./job.jar/job.jar
40183870    0 drwxrwxr-x   2 atguigu  atguigu        48 Sep  6 11:39 ./jobSubmitDir
67679900    4 -r-x------   1 atguigu  atguigu       108 Sep  6 11:39 ./jobSubmitDir/job.split
102923475    4 -r-x------   1 atguigu  atguigu        43 Sep  6 11:39 ./jobSubmitDir/job.splitmetainfo
102923478  184 -r-x------   1 atguigu  atguigu    185670 Sep  6 11:39 ./job.xml
broken symlinks(find -L . -maxdepth 5 -type l -ls):

Log Type: launch_container.sh

Log Upload Time: Fri Sep 06 11:40:21 +0800 2024

Log Length: 5112

Showing 4096 bytes of 5112 total.

export LOCAL_DIRS="/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1725593577897_0004"
export LOCAL_USER_DIRS="/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/"
export LOG_DIRS="/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001"
export USER="atguigu"
export LOGNAME="atguigu"
export HOME="/home/"
export PWD="/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1725593577897_0004/container_1725593577897_0004_01_000001"
export JVM_PID="$$"
export MALLOC_ARENA_MAX="4"
export NM_AUX_SERVICE_mapreduce_shuffle="AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="
export APPLICATION_WEB_PROXY_BASE="/proxy/application_1725593577897_0004"
export SHELL="/bin/bash"
export HADOOP_MAPRED_HOME="/opt/module/hadoop-3.1.3"
export CLASSPATH="$PWD:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/share/hadoop/common/*:$HADOOP_COMMON_HOME/share/hadoop/common/lib/*:$HADOOP_HDFS_HOME/share/hadoop/hdfs/*:$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*:$HADOOP_YARN_HOME/share/hadoop/yarn/*:$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*:job.jar/*:job.jar/classes/:job.jar/lib/*:$PWD/*"
export APP_SUBMIT_TIME_ENV="1725593981345"
export LD_LIBRARY_PATH="$PWD:$HADOOP_COMMON_HOME/lib/native"
echo "Setting up job resources"
ln -sf -- "/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1725593577897_0004/filecache/11/job.jar" "job.jar"
mkdir -p jobSubmitDir
ln -sf -- "/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1725593577897_0004/filecache/12/job.split" "jobSubmitDir/job.split"
mkdir -p jobSubmitDir
ln -sf -- "/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1725593577897_0004/filecache/10/job.splitmetainfo" "jobSubmitDir/job.splitmetainfo"
ln -sf -- "/opt/module/hadoop-3.1.3/data/nm-local-dir/usercache/atguigu/appcache/application_1725593577897_0004/filecache/13/job.xml" "job.xml"
echo "Copying debugging information"
# Creating copy of launch script
cp "launch_container.sh" "/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001/launch_container.sh"
chmod 640 "/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001/directory.info"
ls -l 1>>"/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001/directory.info"
find -L . -maxdepth 5 -ls 1>>"/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001/directory.info"
echo "Launching container"
exec /bin/bash -c "$JAVA_HOME/bin/java -Djava.io.tmpdir=$PWD/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog  -Xmx1024m org.apache.hadoop.mapreduce.v2.app.MRAppMaster 1>/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001/stdout 2>/opt/module/hadoop-3.1.3/logs/userlogs/application_1725593577897_0004/container_1725593577897_0004_01_000001/stderr "

Log Type: prelaunch.err

Log Upload Time: Fri Sep 06 11:40:21 +0800 2024

Log Length: 0


Log Type: prelaunch.out

Log Upload Time: Fri Sep 06 11:40:21 +0800 2024

Log Length: 100

Setting up env variables
Setting up job resources
Copying debugging information
Launching container

Log Type: stderr

Log Upload Time: Fri Sep 06 11:40:21 +0800 2024

Log Length: 1729

Sep 06, 2024 11:39:46 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver as a provider class
Sep 06, 2024 11:39:46 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Sep 06, 2024 11:39:46 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices as a root resource class
Sep 06, 2024 11:39:46 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.19 02/11/2015 03:25 AM'
Sep 06, 2024 11:39:46 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Sep 06, 2024 11:39:46 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Sep 06, 2024 11:39:46 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices to GuiceManagedComponentProvider with the scope "PerRequest"
log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

Log Type: stdout

Log Upload Time: Fri Sep 06 11:40:21 +0800 2024

Log Length: 0


Log Type: syslog

Log Upload Time: Fri Sep 06 11:40:21 +0800 2024

Log Length: 75504

Showing 4096 bytes of 75504 total.

hdfs://hadoop102:8020/tmp/hadoop-yarn/staging/history/done_intermediate/atguigu/job_1725593577897_0004.summary_tmp to hdfs://hadoop102:8020/tmp/hadoop-yarn/staging/history/done_intermediate/atguigu/job_1725593577897_0004.summary
2024-09-06 11:40:14,958 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved tmp to done: hdfs://hadoop102:8020/tmp/hadoop-yarn/staging/history/done_intermediate/atguigu/job_1725593577897_0004_conf.xml_tmp to hdfs://hadoop102:8020/tmp/hadoop-yarn/staging/history/done_intermediate/atguigu/job_1725593577897_0004_conf.xml
2024-09-06 11:40:14,961 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved tmp to done: hdfs://hadoop102:8020/tmp/hadoop-yarn/staging/history/done_intermediate/atguigu/job_1725593577897_0004-1725593981345-atguigu-word+count-1725594014600-1-1-SUCCEEDED-default-1725593987028.jhist_tmp to hdfs://hadoop102:8020/tmp/hadoop-yarn/staging/history/done_intermediate/atguigu/job_1725593577897_0004-1725593981345-atguigu-word+count-1725594014600-1-1-SUCCEEDED-default-1725593987028.jhist
2024-09-06 11:40:14,964 INFO [Thread-76] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Stopped JobHistoryEventHandler. super.stop()
2024-09-06 11:40:14,966 INFO [Thread-76] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: KILLING attempt_1725593577897_0004_r_000000_2
2024-09-06 11:40:14,991 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1725593577897_0004_r_000000_2 TaskAttempt Transitioned from SUCCESS_FINISHING_CONTAINER to SUCCEEDED
2024-09-06 11:40:14,997 INFO [Thread-76] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: Setting job diagnostics to 
2024-09-06 11:40:14,997 INFO [Thread-76] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: History url is http://hadoop102:19888/jobhistory/job/job_1725593577897_0004
2024-09-06 11:40:15,009 INFO [Thread-76] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: Waiting for application to be successfully unregistered.
2024-09-06 11:40:16,016 INFO [Thread-76] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Final Stats: PendingReds:0 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:0 AssignedReds:1 CompletedMaps:1 CompletedReds:0 ContAlloc:4 ContRel:0 HostLocal:1 RackLocal:0
2024-09-06 11:40:16,017 INFO [Thread-76] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Deleting staging directory hdfs://hadoop102:8020 /tmp/hadoop-yarn/staging/atguigu/.staging/job_1725593577897_0004
2024-09-06 11:40:16,022 INFO [Thread-76] org.apache.hadoop.ipc.Server: Stopping server on 37471
2024-09-06 11:40:16,026 INFO [IPC Server listener on 0] org.apache.hadoop.ipc.Server: Stopping IPC Server listener on 0
2024-09-06 11:40:16,027 INFO [IPC Server Responder] org.apache.hadoop.ipc.Server: Stopping IPC Server Responder
2024-09-06 11:40:16,033 INFO [TaskHeartbeatHandler PingChecker] org.apache.hadoop.mapreduce.v2.app.TaskHeartbeatHandler: TaskHeartbeatHandler thread interrupted
2024-09-06 11:40:16,034 INFO [Ping Checker] org.apache.hadoop.yarn.util.AbstractLivelinessMonitor: TaskAttemptFinishingMonitor thread interrupted
2024-09-06 11:40:21,042 INFO [Thread-76] org.apache.hadoop.ipc.Server: Stopping server on 42523
2024-09-06 11:40:21,043 INFO [IPC Server listener on 0] org.apache.hadoop.ipc.Server: Stopping IPC Server listener on 0
2024-09-06 11:40:21,044 INFO [IPC Server Responder] org.apache.hadoop.ipc.Server: Stopping IPC Server Responder
2024-09-06 11:40:21,047 INFO [Thread-76] org.eclipse.jetty.server.handler.ContextHandler: Stopped o.e.j.w.WebAppContext@7f9e8421{/,null,UNAVAILABLE}{/mapreduce}
2024-09-06 11:40:21,051 INFO [Thread-76] org.eclipse.jetty.server.AbstractConnector: Stopped ServerConnector@2703d91{HTTP/1.1,[http/1.1]}{0.0.0.0:0}
2024-09-06 11:40:21,052 INFO [Thread-76] org.eclipse.jetty.server.handler.ContextHandler: Stopped o.e.j.s.ServletContextHandler@2e34384c{/static,jar:file:/opt/module/hadoop-3.1.3/share/hadoop/yarn/hadoop-yarn-common-3.1.3.jar!/webapps/static,UNAVAILABLE}
