Hadoop RPC Analysis

Hadoop RPC Overview

Hadoop's RPC layer uses two RPC engines: WritableRpcEngine and ProtobufRpcEngine.
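For orientation: which engine serves a given protocol is configured in code. A minimal sketch, assuming the org.apache.hadoop.ipc.RPC API (the HDFS protocol class here is only an illustration):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB;
    import org.apache.hadoop.ipc.ProtobufRpcEngine;
    import org.apache.hadoop.ipc.RPC;

    // Bind a protocol interface to an RPC engine; ProtobufRpcEngine is the
    // modern path, WritableRpcEngine the legacy one.
    Configuration conf = new Configuration();
    RPC.setProtocolEngine(conf, ClientNamenodeProtocolPB.class, ProtobufRpcEngine.class);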

An RPC packet has the format shown below; it consists of two parts: the RpcRequestHeader and the RpcRequest.

  • When the ProtobufRpcEngine is used, the format is as follows (a frame-reader sketch covering both layouts appears after the second list):

4-byte total length

RpcRequestHeader length (variable-length varint)

RpcRequestHeader

RequestHeader length (variable-length varint)

RequestHeader

request length (variable-length varint)

request

  • When the WritableRpcEngine is used, the format is as follows:

4-byte total length

RpcRequestHeader length (variable-length varint)

RpcRequestHeader

RpcRequest
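Both layouts share the same outer framing: a 4-byte big-endian total length, then one or more sections whose lengths are encoded as protobuf base-128 varints. A minimal frame-reader sketch (class and method names are mine, not Hadoop's):

    import java.io.DataInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class RpcFrameReader {

        // Protobuf base-128 varint: low 7 bits first; a set high bit means "more bytes follow".
        static int readVarint(InputStream in) throws IOException {
            int value = 0, shift = 0, b;
            do {
                b = in.read();
                if (b < 0) throw new IOException("EOF inside varint");
                value |= (b & 0x7f) << shift;
                shift += 7;
            } while ((b & 0x80) != 0);
            return value;
        }

        // One varint-length-prefixed section (e.g., the RpcRequestHeader bytes).
        static byte[] readSection(DataInputStream in) throws IOException {
            byte[] buf = new byte[readVarint(in)];
            in.readFully(buf);
            return buf;
        }

        // One call frame: 4-byte big-endian total length, then the RpcRequestHeader.
        static byte[] readCallHeader(DataInputStream in) throws IOException {
            int totalLength = in.readInt();            // e.g. 0x000006d5
            byte[] rpcRequestHeader = readSection(in); // always protobuf-encoded
            // ProtobufRpcEngine: two more varint-prefixed sections follow
            // (RequestHeader, then request). WritableRpcEngine: the rest of the
            // frame is the Writable-serialized RpcRequest, with no length prefix.
            return rpcRequestHeader;
        }
    }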

RpcRequestHeader Format

The RpcRequestHeader is always deserialized with RpcRequestHeaderProto (that is, the header itself is protobuf-encoded regardless of the engine), and the RpcKind can be read from it:

  1. If it is "RPC_WRITABLE", the WritableRpcEngine is in use.
  2. If it is "RPC_PROTOCOL_BUFFER", the ProtobufRpcEngine is in use.
  3. "RPC_BUILTIN" is for testing only and is not considered here.

Example (my guess was that if the first 2 bytes are "0802" the kind is "RPC_PROTOCOL_BUFFER", and if "0801" it is "RPC_WRITABLE"; this is verified byte by byte after the dump):

0802100018052210c898014f84fe480586462367efc1d2da28013a330a316d725f6170706d61737465725f617070617474656d70745f313538323137363836383034345f303031355f303030303031

Corresponding ASCII:

......"....O..H..F#g....(.:3.1mr_appmaster_appattempt_1582176868044_0015_000001
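The guess checks out against the protobuf wire format: every field starts with a tag byte, (field_number << 3) | wire_type. In Hadoop's RpcHeader.proto, field 1 of RpcRequestHeaderProto is rpcKind, and the RpcKindProto enum numbers RPC_BUILTIN = 0, RPC_WRITABLE = 1, RPC_PROTOCOL_BUFFER = 2; so "08 02" means rpcKind = RPC_PROTOCOL_BUFFER and "08 01" means rpcKind = RPC_WRITABLE. Walking the example above:

    08 02 //field 1, rpcKind = 2 (RPC_PROTOCOL_BUFFER)
    10 00 //field 2, rpcOp = 0 (RPC_FINAL_PACKET)
    18 05 //field 3, callId = 5
    22 10 //field 4, clientId: 16 bytes follow
    28 01 //field 5, retryCount: zigzag-encoded sint32, so 1 decodes to -1
    3a 33 //field 7, callerContext: 51-byte nested message wrapping the string
          //"mr_appmaster_appattempt_1582176868044_0015_000001"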

 

RpcRequest Format

  • ProtobufRpcEngine type

The RpcRequest consists of a RequestHeader part and a Request part.

1. RequestHeaderProto

The RequestHeader uses RequestHeaderProto, from which we can obtain the protocol name the Request targets and the method being called.

Example: 0a0f7374617274436f6e7461696e65727312386f72672e6170616368652e6861646f6f702e7961726e2e6170692e436f6e7461696e65724d616e6167656d656e7450726f746f636f6c50421801

ASCII:

..startContainers.8org.apache.hadoop.yarn.api.ContainerManagementProtocolPB..

Here, protocol name = "org.apache.hadoop.yarn.api.ContainerManagementProtocolPB"

method = “startContainers”
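Decoding this header against the RequestHeaderProto definition used by ProtobufRpcEngine (field 1 methodName, field 2 declaringClassProtocolName, field 3 clientProtocolVersion):

    0a 0f //field 1, methodName: 15 bytes, "startContainers"
    12 38 //field 2, declaringClassProtocolName: 56 bytes,
          //"org.apache.hadoop.yarn.api.ContainerManagementProtocolPB"
    18 01 //field 3, clientProtocolVersion = 1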

 

ContainerManagementProtocolService defines:

rpc startContainers(StartContainersRequestProto) returns (StartContainersResponseProto);

From the RequestHeader we know the caller wants to invoke startContainers, so the Request that follows is a StartContainersRequestProto (the Request carries the call's parameters). A sketch of that lookup follows.
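On the server side this lookup is mechanical: the method name from the RequestHeader selects a MethodDescriptor on the protocol's BlockingService, and that method's request prototype parses the bytes that follow. A minimal sketch of the idea (mirroring what ProtobufRpcEngine's server does, not its exact code):

    import com.google.protobuf.BlockingService;
    import com.google.protobuf.Descriptors.MethodDescriptor;
    import com.google.protobuf.Message;

    class RequestDispatch {
        // 'service' is the BlockingService registered for the protocol named in the
        // RequestHeader; 'methodName' and 'requestBytes' come from the decoded sections.
        static Message parseRequest(BlockingService service, String methodName,
                                    byte[] requestBytes) throws Exception {
            MethodDescriptor method =
                service.getDescriptorForType().findMethodByName(methodName); // "startContainers"
            Message prototype = service.getRequestPrototype(method); // StartContainersRequestProto
            return prototype.newBuilderForType().mergeFrom(requestBytes).build();
        }
    }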

2. SubmitApplicationRequestProto

Number of map tasks: SubmitApplicationRequestProto carries the location of job.xml, from which the number of map tasks can be obtained (see the sketch below).
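A sketch of that lookup, assuming the staging path recovered from the capture below; mapreduce.job.maps is the standard property holding the map count:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Load the submitted job.xml from the staging directory and read the map count.
    Configuration clusterConf = new Configuration();
    Path jobXml = new Path(
        "/tmp/hadoop-yarn/staging/tian/.staging/job_1582176868044_0035/job.xml");
    Configuration jobConf = new Configuration(false);              // skip cluster defaults
    jobConf.addResource(FileSystem.get(clusterConf).open(jobXml)); // stream from HDFS
    int numMaps = jobConf.getInt("mapreduce.job.maps", -1);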

 

Example: the RPC layout is as follows

4-byte total length | header length (varint) | header body | RequestHeader length | RequestHeader body | request length | request body

000006d5 //total length
21 //header length
080210001856221066f2577aebab4c34922b4d885d799fa128003a050a03434c49 //header body
Corresponding ASCII:
.....V".f.Wz..L4.+M.]y..(.:...CLI
4d //RequestHeader length
0a117375626d69744170706c69636174696f6e12366f72672e6170616368652e6861646f6f702e7961726e2e6170692e4170706c69636174696f6e436c69656e7450726f746f636f6c50421801 //RequestHeader body
Corresponding ASCII:
..submitApplication.6org.apache.hadoop.yarn.api.ApplicationClientProtocolPB..
e30c //request length
0ae00c0a09082310ccaddc88862e120f51756173694d6f6e74654361726c6f1a0764656661756c742aee0b0a93010a1e6a6f625375626d69744469722f6a6f622e73706c69746d657461696e666f12710a620a046864667312066d617374657218a846224f2f746d702f6861646f6f702d7961726e2f73746167696e672f7469616e2f2e73746167696e672f6a6f625f313538323137363836383034345f303033352f6a6f622e73706c69746d657461696e666f101418d4d0aebf8c2e200228030a89010a076a6f622e6a6172127e0a580a046864667312066d617374657218a84622452f746d702f6861646f6f702d7961726e2f73746167696e672f7469616e2f2e73746167696e672f6a6f625f313538323137363836383034345f303033352f6a6f622e6a617210fca81318e7ceaebf8c2e200328033213283f3a636c61737365732f7c6c69622f292e2a0a84010a166a6f625375626d69744469722f6a6f622e73706c6974126a0a5a0a046864667312066d617374657218a84622472f746d702f6861646f6f702d7961726e2f73746167696e672f7469616e2f2e73746167696e672f6a6f625f313538323137363836383034345f303033352f6a6f622e73706c69741096011892d0aebf8c2e200228030a740a076a6f622e786d6c12690a580a046864667312066d617374657218a84622452f746d702f6861646f6f702d7961726e2f73746167696e672f7469616e2f2e73746167696e672f6a6f625f313538323137363836383034345f303033352f6a6f622e786d6c10a5a30b18d6d1aebf8c2e20022803122648445453000001154d617052656475636553687566666c65546f6b656e0807b9d611ca38477c22120a055348454c4c12092f62696e2f6261736822390a0f4c445f4c4942524152595f504154481226245057443a7b7b4841444f4f505f434f4d4d4f4e5f484f4d457d7d2f6c69622f6e617469766522540a124841444f4f505f4d41505245445f484f4d45123e2f686f6d652f7469616e2f486f2f6861646f6f702d332e312e312d7372632f6861646f6f702d646973742f7461726765742f6861646f6f702d332e312e3122a9030a09434c41535350415448129b03245057443a244841444f4f505f434f4e465f4449523a244841444f4f505f434f4d4d4f4e5f484f4d452f73686172652f6861646f6f702f636f6d6d6f6e2f2a3a244841444f4f505f434f4d4d4f4e5f484f4d452f73686172652f6861646f6f702f636f6d6d6f6e2f6c69622f2a3a244841444f4f505f484446535f484f4d452f73686172652f6861646f6f702f686466732f2a3a244841444f4f505f484446535f484f4d452f73686172652f6861646f6f702f686466732f6c69622f2a3a244841444f4f505f5941524e5f484f4d452f73686172652f6861646f6f702f7961726e2f2a3a244841444f4f505f5941524e5f484f4d452f73686172652f6861646f6f702f7961726e2f6c69622f2a3a244841444f4f505f4d41505245445f484f4d452f73686172652f6861646f6f702f6d61707265647563652f2a3a244841444f4f505f4d41505245445f484f4d452f73686172652f6861646f6f702f6d61707265647563652f6c69622f2a3a6a6f622e6a61722f2a3a6a6f622e6a61722f636c61737365732f3a6a6f622e6a61722f6c69622f2a3a245057442f2a2ac502244a4156415f484f4d452f62696e2f6a617661202d446a6176612e696f2e746d706469723d245057442f746d70202d446c6f67346a2e636f6e66696775726174696f6e3d636f6e7461696e65722d6c6f67346a2e70726f70657274696573202d447961726e2e6170702e636f6e7461696e65722e6c6f672e6469723d3c4c4f475f4449523e202d447961726e2e6170702e636f6e7461696e65722e6c6f672e66696c6573697a653d30202d446861646f6f702e726f6f742e6c6f676765723d494e464f2c434c41202d446861646f6f702e726f6f742e6c6f6766696c653d7379736c6f6720202d586d78313032346d206f72672e6170616368652e6861646f6f702e6d61707265647563652e76322e6170702e4d524170704d617374657220313e3c4c4f475f4449523e2f7374646f757420323e3c4c4f475f4449523e2f7374646572722032050802120120320508011201203001400252094d41505245445543458a01380a02080012012a1a2b08800c10011a140a096d656d6f72792d6d6210800c1a024d6920001a0e0a0676636f72657310011a00200020012801 //request body
Corresponding ASCII:
......#.........QuasiMonteCarlo..default*.......jobSubmitDir/job.splitmetainfo.q.b..hdfs..master..F"O/tmp/hadoop-yarn/staging/tian/.staging/job_1582176868044_0035/job.splitmetainfo......... .(......job.jar.~.X..hdfs..master..F"E/tmp/hadoop-yarn/staging/tian/.staging/job_1582176868044_0035/job.jar........... .(.2.(?:classes/|lib/).*.....jobSubmitDir/job.split.j.Z..hdfs..master..F"G/tmp/hadoop-yarn/staging/tian/.staging/job_1582176868044_0035/job.split.......... .(..t..job.xml.i.X..hdfs..master..F"E/tmp/hadoop-yarn/staging/tian/.staging/job_1582176868044_0035/job.xml........... .(..&HDTS....MapReduceShuffleToken......8G|"...SHELL../bin/bash"9..LD_LIBRARY_PATH.&$PWD:{{HADOOP_COMMON_HOME}}/lib/native"T..HADOOP_MAPRED_HOME.>/home/tian/Ho/hadoop-3.1.1-src/hadoop-dist/target/hadoop-3.1.1"....CLASSPATH...$PWD:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/share/hadoop/common/*:$HADOOP_COMMON_HOME/share/hadoop/common/lib/*:$HADOOP_HDFS_HOME/share/hadoop/hdfs/*:$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*:$HADOOP_YARN_HOME/share/hadoop/yarn/*:$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*:job.jar/*:job.jar/classes/:job.jar/lib/*:$PWD/**..$JAVA_HOME/bin/java -Djava.io.tmpdir=$PWD/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=<LOG_DIR> -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog  -Xmx1024m org.apache.hadoop.mapreduce.v2.app.MRAppMaster 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr 2..... 2..... 0.@.R.MAPREDUCE..8......*.+.........memory-mb.....Mi .....vcores.... . .(.
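As a consistency check, the lengths add up. The request-length prefix e3 0c is a base-128 varint (low 7 bits first): 0x63 + (0x0c << 7) = 0x663 = 1635 bytes. The 4-byte total length 0x6d5 = 1749 counts everything after itself: 1 (header length byte) + 33 (header, 0x21) + 1 (RequestHeader length byte) + 77 (RequestHeader, 0x4d) + 2 (request length varint) + 1635 (request) = 1749.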
  • WritableRpcEngine type

The format here is:

4-byte total length | header length | header body | request body

00000106 //total length
46 //header length
08011000182822106b6c4489ba94444faf2a2a3639f1c88028003a2a0a286d725f617474656d70745f313538323137363836383034345f303031355f725f3030303030305f30
Corresponding ASCII:
.....(".klD...DO.**69...(.:*.(mr_attempt_1582176868044_0015_r_000000_0

//next comes the RPC request
0000000000000002 //rpc version = 0x2 (8 bytes, a long)
002e //length of the protocol name (at least 2 bytes, variable-length)
6f72672e6170616368652e6861646f6f702e6d61707265642e5461736b556d62696c6963616c50726f746f636f6c //protocol name body
Corresponding ASCII:
org.apache.hadoop.mapred.TaskUmbilicalProtocol
0004 //length of the method name (at least 2 bytes, variable-length)
646f6e65 //method body
Corresponding ASCII:
done
0000000000000015 //client version = 0x15 (8 bytes, a long)
e780bc03 //clientMethodsHash (4 bytes, an int)
00000001 //number of parameters, i.e. the number of objects (4 bytes, an int)

//next comes the object format (it must match the concrete object class's write method); the object is the parameter being passed, here a task attempt
0026 //class name length (at least 2 bytes, variable-length)
6f72672e6170616368652e6861646f6f702e6d61707265642e5461736b417474656d70744944
Corresponding ASCII:
org.apache.hadoop.mapred.TaskAttemptID //the declared class; its write() produced the bytes that follow
0026 //class name length again (at least 2 bytes, variable-length)
6f72672e6170616368652e6861646f6f702e6d61707265642e5461736b417474656d70744944
Corresponding ASCII:
org.apache.hadoop.mapred.TaskAttemptID //ObjectWritable writes the class name a second time (the instance's actual class)

00000000 //attempt number 0 (4-byte int)
00000000 //task number 0 (4-byte int)
0000000f //job id 15 (4-byte int)
0d //length (variable)
31353832313736383638303434 //the JobTracker start time
Corresponding ASCII:
1582176868044
06 //length (variable)
524544554345 //task type: MAP, REDUCE, JOB_SETUP, JOB_CLEANUP, or TASK_CLEANUP

Putting the pieces together, the object deserializes to: mr_attempt_1582176868044_0015_r_000000_0

The task attempt format is:

attempt_<JobTracker start time>_<jobid>_<m/r>_<taskid>_<attemptid>
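A sketch that walks this same Writable-engine request body end to end. The field order follows the walk-through above (authoritatively, WritableRpcEngine.Invocation.readFields and TaskAttemptID.readFields); readUTF handles the 2-byte-length strings, which are plain ASCII here, and Text.readString handles the vint-length strings inside the ID:

    import java.io.DataInputStream;
    import java.io.IOException;
    import org.apache.hadoop.io.Text;

    class WritableRequestDump {
        static void dump(DataInputStream in) throws IOException {
            long rpcVersion = in.readLong();         // 0x2
            String protocol = in.readUTF();          // org.apache.hadoop.mapred.TaskUmbilicalProtocol
            String method = in.readUTF();            // "done"
            long clientVersion = in.readLong();      // 0x15
            int clientMethodsHash = in.readInt();    // 0xe780bc03
            int numParams = in.readInt();            // 1
            for (int i = 0; i < numParams; i++) {
                String declaredClass = in.readUTF(); // org.apache.hadoop.mapred.TaskAttemptID
                String actualClass = in.readUTF();   // written a second time by ObjectWritable
                int attemptId = in.readInt();        // 0
                int taskId = in.readInt();           // 0
                int jobId = in.readInt();            // 15
                String jtStartTime = Text.readString(in); // "1582176868044"
                String taskType = Text.readString(in);    // "REDUCE"
                // Prints: attempt_1582176868044_0015_r_000000_0
                System.out.printf("attempt_%s_%04d_%s_%06d_%d%n", jtStartTime, jobId,
                    taskType.equals("REDUCE") ? "r" : "m", taskId, attemptId);
            }
        }
    }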
