Spark: task-related classes

@(spark)[Task]

TaskState

```scala
private[spark] object TaskState extends Enumeration {

  val LAUNCHING, RUNNING, FINISHED, FAILED, KILLED, LOST = Value

  val FINISHED_STATES = Set(FINISHED, FAILED, KILLED, LOST)

  // ...
}
```

The possible states of a task. This file also contains the conversions to and from the corresponding Mesos task states.
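To make the enumeration concrete, here is a self-contained sketch that mirrors the shape of Spark's `TaskState` (the object and helper name `TaskStateSketch`/`isFinished` are illustrative; Spark's real object is `private[spark]` and exposes a similar `isFinished` check):

```scala
// Sketch mirroring Spark's TaskState enumeration.
object TaskStateSketch extends Enumeration {
  type TaskState = Value

  val LAUNCHING, RUNNING, FINISHED, FAILED, KILLED, LOST = Value

  // Terminal states: once a task reaches one of these, it will not transition again.
  val FINISHED_STATES = Set(FINISHED, FAILED, KILLED, LOST)

  // A task is "finished" in the scheduler's eyes if it is in any terminal state,
  // regardless of whether it succeeded or failed.
  def isFinished(state: TaskState): Boolean = FINISHED_STATES.contains(state)
}
```

Note that `FAILED`, `KILLED`, and `LOST` all count as finished: the scheduler treats "will never produce a result" the same as "produced a result" for bookkeeping purposes.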

TaskEndReason

Essentially an enumeration: a sealed trait whose subclasses identify every reason a task can end.

```scala
/**
 * :: DeveloperApi ::
 * Various possible reasons why a task ended. The low-level TaskScheduler is supposed to retry
 * tasks several times for "ephemeral" failures, and only report back failures that require some
 * old stages to be resubmitted, such as shuffle map fetch failures.
 */
@DeveloperApi
sealed trait TaskEndReason
```

TaskContext

```scala
/**
 * Contextual information about a task which can be read or mutated during
 * execution. To access the TaskContext for a running task, use:
 * {{{
 *   org.apache.spark.TaskContext.get()
 * }}}
 */
abstract class TaskContext extends Serializable {
  // ...
}
```

This is the interface definition: the abstract methods (stage id, partition id, completion status, listener registration, and so on) that task code can call.
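`TaskContext.get()` works because the running context is stashed in a thread-local before the task body executes. A self-contained sketch of that mechanism (all names here are illustrative; Spark hides the setter behind `TaskContextHelper` rather than exposing it publicly):

```scala
// Sketch of how TaskContext.get() can be backed by a ThreadLocal.
abstract class TaskContextSketch extends Serializable {
  def stageId(): Int
  def partitionId(): Int
  def isCompleted(): Boolean
}

object TaskContextSketch {
  private val taskContext = new ThreadLocal[TaskContextSketch]

  // Returns the context of the task running on the current thread,
  // or null if no task is running here.
  def get(): TaskContextSketch = taskContext.get()

  // In Spark this setter is restricted (see TaskContextHelper below);
  // it is public here only for the sketch.
  def set(tc: TaskContextSketch): Unit = taskContext.set(tc)
}
```

The thread-local trick is why `TaskContext.get()` needs no arguments: each executor thread runs one task at a time, so "the current thread's context" is unambiguous.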

TaskContextHelper

```scala
/**
 * This class exists to restrict the visibility of TaskContext setters.
 */
private [spark] object TaskContextHelper {
  // ...
}
```

TaskContextImpl

The concrete implementation class; it mainly provides metrics plumbing and hooks for task-completion listeners.
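The listener hook is the interesting part: callbacks registered during the task body are invoked when the task completes, in reverse registration order (so later-registered cleanup runs first, like nested try/finally blocks). A self-contained sketch of that behavior (class and method names mirror the real ones, but this is a simplified illustration, not Spark's implementation):

```scala
import scala.collection.mutable.ArrayBuffer

// Sketch of TaskContextImpl's completion-listener mechanism.
class TaskContextImplSketch {
  private val onCompleteCallbacks = new ArrayBuffer[() => Unit]
  @volatile private var completed = false

  // Register a callback to run when the task finishes (success or failure).
  def addTaskCompletionListener(f: () => Unit): this.type = {
    onCompleteCallbacks += f
    this
  }

  // Called by the executor once the task body returns or throws.
  def markTaskCompleted(): Unit = {
    completed = true
    // Listeners run in reverse order of registration.
    onCompleteCallbacks.reverse.foreach(f => f())
  }

  def isCompleted: Boolean = completed
}
```

This is what makes `TaskContext.get().addTaskCompletionListener(...)` a reliable place to close per-partition resources (connections, file handles) inside `mapPartitions`-style code.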
