Data file: "JsonTest02.json"
Either Spark Core or Spark SQL may be used. Requirements:
- Compute each user's total top-up amount and sort in descending order (10 points)
- Count the total number of logins per system type and sort in descending order (10 points)
- For each province, find the Top 3 users by login count (20 points)
Selected fields:
phoneNum: phone number (user account)
terminal: system type
province: province
money: top-up amount
status: top-up status ('1' = success, '0' = failure)
Sample records:
{"openid":"opEu45VAwuzCsDr6iGIf4qhnUZUI","phoneNum":"18334832972","money":"30","date":"2018-09-13T02:15:16.054Z","lat":39.688011,"log":116.066689,"province":"四川省","city":"成都市","district":"房山区","terminal":"ios","status":"1"}
{"openid":"opEu45VAwuzCsDr6iGIf4qhnUZUI","phoneNum":"15101592939","money":"50","date":"2018-09-13T02:15:16.054Z","lat":39.688011,"log":116.066689,"province":"山西省","city":"大同市","district":"房山区","terminal":"Android","status":"0"}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("phone1").master("local[*]").getOrCreate()
val df1 = spark.read.json("data/JsonTest02.json")
df1.createOrReplaceTempView("t")

// 1. Total top-up amount per user, descending.
//    money and status are JSON strings, so cast money explicitly and compare status as a string;
//    only successful top-ups (status = '1') should count toward the total.
spark.sql("select phoneNum, sum(cast(money as double)) as sm from t where status = '1' group by phoneNum order by sm desc").show()

// 2. Login count per system type, descending.
spark.sql("select terminal, count(1) as c from t group by terminal order by c desc").show()

// 3. Top 3 users by login count within each province
//    (partition by / order by is the standard window spec; it is equivalent to
//    the Hive-style distribute by / sort by).
spark.sql(
  """select * from (
    |  select *, row_number() over (partition by province order by totalCo desc) rn
    |  from (select phoneNum, province, count(1) totalCo from t group by province, phoneNum) a
    |) b where rn <= 3""".stripMargin).show()

spark.stop()
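Since the assignment allows a Spark Core solution as well, here is a hedged RDD sketch of the third task (Top 3 users per province). It reuses Spark's JSON reader only for parsing and then drops to the RDD API; the names `pairs` and `top3` are illustrative, and the sketch assumes every line of the file is one JSON object with non-null `province` and `phoneNum` fields.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("phone1-rdd").master("local[*]").getOrCreate()

// Parse the JSON with the DataFrame reader, then work with an RDD of
// ((province, phoneNum), 1) pairs so each record counts as one login.
val pairs = spark.read.json("data/JsonTest02.json")
  .select("province", "phoneNum")
  .rdd
  .map(r => ((r.getString(0), r.getString(1)), 1))

// Count logins per (province, user), regroup by province,
// and keep the 3 users with the highest counts in each province.
val top3 = pairs
  .reduceByKey(_ + _)
  .map { case ((province, phone), cnt) => (province, (phone, cnt)) }
  .groupByKey()
  .mapValues(_.toList.sortBy(-_._2).take(3))

top3.collect().foreach(println)
spark.stop()
```

`reduceByKey` aggregates on the map side before shuffling, so only the per-user counts (not every raw record) are grouped by province; the `take(3)` after a per-province sort then mirrors the `row_number() ... rn <= 3` filter in the SQL version.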