一. Files deleted locally are not deleted on the remote server
二. Adding a database connection driver (JDBC jar) to a PySpark environment
1. Spark environment: place the driver jar in
/home/xxx/kdh/spark/jars
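For example, to make a MySQL JDBC driver visible to jobs launched from this Spark installation, the jar can simply be copied into that directory. The jar file name below is a placeholder; use the driver and version that match your database:

```shell
# Copy the JDBC driver jar onto Spark's classpath.
# mysql-connector-j-8.0.33.jar is a placeholder name, not from the original notes.
cp mysql-connector-j-8.0.33.jar /home/xxx/kdh/spark/jars/
```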
2. PySpark environment: place the driver jar in the installed package's jars directory
cd /software/anaconda3/envs/pyspark_env/lib/python3.8/site-packages/pyspark/jars
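If you are unsure where a pip/conda-installed PySpark keeps its jars, the directory can be located from Python itself before copying; the driver jar name is again a placeholder:

```shell
# Print the jars directory of the active PySpark installation
python -c "import os, pyspark; print(os.path.join(os.path.dirname(pyspark.__file__), 'jars'))"
# Copy the JDBC driver jar there (placeholder jar name)
cp mysql-connector-j-8.0.33.jar /software/anaconda3/envs/pyspark_env/lib/python3.8/site-packages/pyspark/jars/
```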
3. Running on YARN: the jars must be on HDFS
hdfs:///tmp/zzdb/hadoop27/spark/jars/
The spark.yarn.jars setting can be checked in /home/xxx/kdh/spark/conf/spark-defaults.conf.
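A minimal sketch of the YARN setup, assuming the HDFS path above is where spark.yarn.jars points: confirm the setting, then upload the driver jar (placeholder name) so that executors on every node can load it:

```shell
# Confirm which HDFS directory spark.yarn.jars points at
grep spark.yarn.jars /home/xxx/kdh/spark/conf/spark-defaults.conf
# Upload the JDBC driver jar (placeholder name) to that directory
hdfs dfs -put mysql-connector-j-8.0.33.jar hdfs:///tmp/zzdb/hadoop27/spark/jars/
```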