Table of Contents
- 1. Installing the software
- 2. Querying the REST API
1. Installing the software
Install the curl tool, which we will use to send HTTP requests:
yum install -y curl
2. Querying the REST API
Use curl against the Spark REST API to inspect applications, jobs, and stages.
On the command line, run:
curl http://192.168.2.201:18080/api/v1/applications
The result is returned as JSON; on my cluster the query returned the following:
[ {
"id" : "application_1563363497227_0002",
"name" : "Spark shell",
"attempts" : [ {
"startTime" : "2019-07-23T10:52:49.818GMT",
"endTime" : "2019-07-23T10:56:01.634GMT",
"sparkUser" : "hadoop",
"completed" : true
} ]
}, {
"id" : "app-20190723184852-0000",
"name" : "Spark shell",
"attempts" : [ {
"startTime" : "2019-07-23T10:48:43.722GMT",
"endTime" : "2019-07-23T10:51:27.473GMT",
"sparkUser" : "hadoop",
"completed" : true
} ]
}, {
"id" : "app-20190723180009-0000",
"name" : "Spark shell",
"attempts" : [ {
"startTime" : "2019-07-23T10:00:00.379GMT",
"endTime" : "2019-07-23T10:07:52.489GMT",
"sparkUser" : "hadoop",
"completed" : true
} ]
}, {
"id" : "app-20190723202726-0001",
"name" : "Spark shell",
"attempts" : [ {
"startTime" : "2019-07-23T12:27:18.239GMT",
"endTime" : "1969-12-31T23:59:59.999GMT",
"sparkUser" : "hadoop",
"completed" : false
} ]
} ]
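Because the response is plain JSON, it can be post-processed directly in the shell. Below is a minimal sketch that pulls the application ids out of a saved copy of the response using only grep and sed; the file path and the shortened sample contents are stand-ins for the real output (if jq is installed, `curl -s .../applications | jq -r '.[].id'` is the cleaner way to do this):

```shell
# Save a (shortened) copy of the response above; in practice you would
# redirect the curl output here instead.
cat <<'EOF' > /tmp/apps.json
[ { "id" : "application_1563363497227_0002", "completed" : true },
  { "id" : "app-20190723202726-0001", "completed" : false } ]
EOF

# Pull out every "id" value, one per line:
grep -o '"id" : "[^"]*"' /tmp/apps.json | sed 's/.*"\(.*\)"/\1/'
```

This prints one application id per line, ready to feed into the per-application endpoints described next.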
The available endpoints are listed below (all paths are relative to /api/v1):
/applications list of applications
/applications/[app-id]/jobs job list for the given application
/applications/[app-id]/jobs/[job-id] details of the given job
/applications/[app-id]/stages stage list for the given application
/applications/[app-id]/stages/[stage-id] list of all attempts for the given stage
/applications/[app-id]/stages/[stage-id]/[stage-attempt-id] details of the given stage attempt
/applications/[app-id]/stages/[stage-id]/[stage-attempt-id]/taskSummary summary metrics of all tasks in the given stage attempt
/applications/[app-id]/stages/[stage-id]/[stage-attempt-id]/taskList task list for the given stage attempt
/applications/[app-id]/executors executor list for the given application
/applications/[app-id]/storage/rdd list of persisted RDDs for the given application
/applications/[app-id]/storage/rdd/[rdd-id] details of the given persisted RDD
/applications/[app-id]/logs download all logs for the application as a zip file
/applications/[app-id]/[attempt-id]/logs download all logs for the given attempt as a zip file
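To put the table above to use, the sketch below only builds and prints full request URLs for a few of the endpoints; the base address and app-id come from the query earlier in this post, while the stage and attempt ids are made-up placeholders. Paste any printed URL into curl to run the actual query:

```shell
# Build request URLs for some of the endpoints listed above.
# BASE matches the history server queried earlier; APP_ID is one of
# the ids it returned; STAGE_ID and ATTEMPT_ID are example values.
BASE="http://192.168.2.201:18080/api/v1"
APP_ID="app-20190723180009-0000"
STAGE_ID=1
ATTEMPT_ID=0

echo "${BASE}/applications/${APP_ID}/jobs"
echo "${BASE}/applications/${APP_ID}/stages"
echo "${BASE}/applications/${APP_ID}/stages/${STAGE_ID}/${ATTEMPT_ID}/taskSummary"
```

For example, `curl http://192.168.2.201:18080/api/v1/applications/app-20190723180009-0000/jobs` returns the job list for that application in the same JSON style as the output shown earlier.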