1 Spark SQL GROUP BY throws: org.apache.spark.sql.AnalysisException: expression 'getbyid.userId' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get
Fix: wrap the column named in the error in first() (or first_value), or add it to the GROUP BY clause.
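A minimal sketch of the fix. The table and column names `getbyid` and `userId` come from the error message above; the grouping column `name` is an assumption for illustration:

```sql
-- Fails: userId is neither in the GROUP BY nor inside an aggregate
-- SELECT name, userId FROM getbyid GROUP BY name;

-- Works: wrap the non-grouped column in first()
SELECT name, first(userId) AS userId
FROM getbyid
GROUP BY name;
```

Note that first() returns an arbitrary value from each group, so use it only when any value is acceptable; otherwise add the column to the GROUP BY.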
2 To convert a field value to a formatted date string, use the date_format function.
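A hedged example of date_format, assuming a hypothetical table `events` with a timestamp column `created_at`:

```sql
-- date_format(column, pattern) renders a timestamp/date as a string
SELECT date_format(created_at, 'yyyy-MM-dd') AS created_date
FROM events;
```

The pattern uses Spark's datetime pattern letters (e.g. yyyy for year, MM for month, dd for day, HH:mm:ss for time of day).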