PySpark: check whether an HDFS directory exists and, if it does, delete it first, so the subsequent save does not fail because the path already exists.
Reference: https://stackoverflow.com/questions/30405728/apache-spark-check-if-file-exists?lq=1
# Get a Hadoop FileSystem handle from the active SparkSession's configuration
fs = spark.sparkContext._jvm.org.apache.hadoop.fs.FileSystem.get(
    spark.sparkContext._jsc.hadoopConfiguration())
path = spark.sparkContext._jvm.org.apache.hadoop.fs.Path(
    "hdfs://mishibigdata01:8020/mishi_sales_forecast/models/")
# Delete the directory if it exists; True means delete recursively
if fs.exists(path):
    fs.delete(path, True)
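The snippet above needs a live Spark/HDFS cluster, but the "check, then delete recursively" idiom itself is easy to see in isolation. Below is a minimal local-filesystem sketch of the same pattern using only the Python standard library; `delete_if_exists` is a hypothetical helper name, not part of the original code.

```python
import shutil
from pathlib import Path

def delete_if_exists(path: str) -> bool:
    """Remove a directory tree if it is present.

    Returns True if something was deleted, False if the path
    did not exist (mirrors fs.exists + fs.delete(path, True)).
    """
    p = Path(path)
    if p.exists():
        shutil.rmtree(p)  # recursive delete, like the True flag in fs.delete
        return True
    return False

# Example: create a directory, then delete it twice
tmp = Path("demo_models")
(tmp / "sub").mkdir(parents=True, exist_ok=True)
first = delete_if_exists(str(tmp))   # directory existed, so it is removed
second = delete_if_exists(str(tmp))  # already gone, nothing to do
```

Note that on a real cluster this exists-then-delete sequence is not atomic: another job could recreate the path between the check and the delete, so for production pipelines it is often simpler to write with `mode("overwrite")` where the data source supports it.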