Viewing logs for Spark on YARN in cluster mode



[hadoop@hadoop001 shell]$ yarn logs -applicationId application_1420997455428_0005
15/01/12 04:34:51 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
/tmp/logs/hadoop/logs/application_1420997455428_0005 does not exist.
Log aggregation has not completed or is not enabled.
The cause is that log aggregation has not been enabled.

The fix is as follows:

[hadoop@hadoop001 hadoop]$ vi yarn-site.xml
    <!-- Enable log aggregation -->
    <property>
        <name>yarn.log-aggregation-enable</name>
        <value>true</value>
    </property>
    <!-- Where the aggregated logs are stored; /tmp/logs is a directory on HDFS, so create your own custom directory there if needed -->
    <property>
        <name>yarn.nodemanager.remote-app-log-dir</name>
        <value>/tmp/logs</value>
    </property>
    <!-- How long aggregated logs are retained, in seconds; older logs are removed automatically -->
    <property>
        <name>yarn.log-aggregation.retain-seconds</name>
        <value>3600</value>
    </property>
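Since yarn.nodemanager.remote-app-log-dir points to a path on HDFS, make sure that directory exists before relying on aggregation. A minimal sketch, assuming it has to be created by hand (the 1777 permission is a common recommendation and an assumption here, not something from the original run):

[hadoop@hadoop001 hadoop]$ hadoop fs -mkdir -p /tmp/logs
[hadoop@hadoop001 hadoop]$ hadoop fs -chmod 1777 /tmp/logs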

After configuring yarn-site.xml, restart the YARN nodes, then check the logs again:
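One way to restart YARN, assuming the Hadoop installation path that appears in the executor logs below (/home/hadoop/app/hadoop-2.6.0-cdh5.7.0); other setups may restart the ResourceManager and NodeManagers individually:

[hadoop@hadoop001 hadoop]$ /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/sbin/stop-yarn.sh
[hadoop@hadoop001 hadoop]$ /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/sbin/start-yarn.sh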

[hadoop@hadoop001 shell]$ yarn logs -applicationId application_1420997455428_0005
15/01/12 09:53:43 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
Container: container_1421027534751_0001_01_000003 on hadoop001_40205
======================================================================
LogType:stderr
Log Upload Time:Mon Jan 12 09:53:04 +0800 2015
LogLength:7912
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/hadoop-hadoop/nm-local-dir/usercache/hadoop/filecache/10/__spark_libs__7968601071120350016.zip/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/01/12 09:52:58 INFO executor.CoarseGrainedExecutorBackend: Started daemon with process name: 31008@hadoop001
15/01/12 09:52:58 INFO util.SignalUtils: Registered signal handler for TERM
15/01/12 09:52:58 INFO util.SignalUtils: Registered signal handler for HUP
15/01/12 09:52:58 INFO util.SignalUtils: Registered signal handler for INT
15/01/12 09:52:59 INFO spark.SecurityManager: Changing view acls to: hadoop
15/01/12 09:52:59 INFO spark.SecurityManager: Changing modify acls to: hadoop
15/01/12 09:52:59 INFO spark.SecurityManager: Changing view acls groups to: 
15/01/12 09:52:59 INFO spark.SecurityManager: Changing modify acls groups to: 
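The aggregated files can also be inspected directly on HDFS. Based on the path in the earlier error message, a listing like the one below should show one log file per container (the exact layout is inferred from that path, not output captured in the original post):

[hadoop@hadoop001 shell]$ hadoop fs -ls /tmp/logs/hadoop/logs/application_1420997455428_0005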

Notes: 1. View the logs of a specific application

yarn logs -applicationId application_1420997455428_0005

2. Check the status of a specific application

yarn application -status application_1420997455428_0005

3. Kill a specific application
(Killing the job directly from the web UI or from the terminal is not the right way; the job may keep running, so the command below is needed to fully stop the application.)

yarn application -kill application_1420997455428_0005
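If the application ID is not known in advance, it can be looked up with the standard YARN CLI before running any of the commands above (this lookup step is an addition, not part of the original walkthrough):

yarn application -list -appStates RUNNING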