- When using `$` to reference a column in Spark SQL, you must first `import spark.implicits._`; otherwise the code fails to compile.
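  A minimal sketch of the point above. The DataFrame, its column names, and the app name are illustrative, not from the original notes:

  ```scala
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder()
    .appName("dollar-syntax-demo") // hypothetical app name
    .master("local[*]")
    .getOrCreate()

  // Without this import, $"name" does not compile
  // ($ is provided by the implicit StringContext extension).
  import spark.implicits._

  // A small demo DataFrame; column names are illustrative.
  val df = Seq(("alice", 30), ("bob", 25)).toDF("name", "age")

  // $"col" builds a Column, equivalent to df("name") or col("name").
  df.select($"name").filter($"age" > 26).show()
  ```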
- When the session reads a JSON file, each line is parsed as one complete JSON record by default; if the actual JSON record spans multiple lines, the read fails with:
  ```
  ERROR FileFormatWriter: Aborting job null.
  org.apache.spark.sql.AnalysisException: Since Spark 2.3, the queries from raw JSON/CSV files are disallowed when the
  referenced columns only include the internal corrupt record column
  (name
  ```
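  A sketch of the usual workaround: Spark's `multiLine` read option tells the JSON source to parse a whole file as one (or an array of) JSON document(s) instead of one record per line. The SparkSession `spark` and the file path are assumed for illustration:

  ```scala
  // Default behavior: one JSON object per line (JSON Lines format).
  // val df = spark.read.json("people.jsonl")

  // For a pretty-printed JSON file whose records span multiple lines,
  // enable multiLine; otherwise every record lands in the internal
  // _corrupt_record column and later queries raise the
  // AnalysisException shown above.
  val df = spark.read
    .option("multiLine", true)
    .json("people.json") // hypothetical path

  df.show()
  ```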