(/root/.conda/envs/untitled) [root@master untitled]# spark-submit /root/IdeaProjects/untitled/test.py
25/06/21 03:39:23 INFO spark.SparkContext: Running Spark version 3.2.4
25/06/21 03:39:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
25/06/21 03:39:23 INFO resource.ResourceUtils: ==============================================================
25/06/21 03:39:23 INFO resource.ResourceUtils: No custom resources configured for spark.driver.
25/06/21 03:39:23 INFO resource.ResourceUtils: ==============================================================
25/06/21 03:39:23 INFO spark.SparkContext: Submitted application: NBAPlayerStatsAnalysis
25/06/21 03:39:23 INFO resource.ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
25/06/21 03:39:23 INFO resource.ResourceProfile: Limiting resource is cpu
25/06/21 03:39:23 INFO resource.ResourceProfileManager: Added ResourceProfile id: 0
25/06/21 03:39:23 INFO spark.SecurityManager: Changing view acls to: root
25/06/21 03:39:23 INFO spark.SecurityManager: Changing modify acls to: root
25/06/21 03:39:23 INFO spark.SecurityManager: Changing view acls groups to: 
25/06/21 03:39:23 INFO spark.SecurityManager: Changing modify acls groups to: 
25/06/21 03:39:23 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
25/06/21 03:39:23 INFO util.Utils: Successfully started service 'sparkDriver' on port 38477.
25/06/21 03:39:23 INFO spark.SparkEnv: Registering MapOutputTracker
25/06/21 03:39:23 INFO spark.SparkEnv: Registering BlockManagerMaster
25/06/21 03:39:23 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/06/21 03:39:23 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
25/06/21 03:39:23 INFO spark.SparkEnv: Registering BlockManagerMasterHeartbeat
25/06/21 03:39:23 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-87b7f430-0829-4b2f-a6db-66a1918672c4
25/06/21 03:39:23 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MiB
25/06/21 03:39:23 INFO spark.SparkEnv: Registering OutputCommitCoordinator
25/06/21 03:39:23 INFO util.log: Logging initialized @3782ms to org.sparkproject.jetty.util.log.Slf4jLog
25/06/21 03:39:24 INFO server.Server: jetty-9.4.44.v20210927; built: 2021-09-27T23:02:44.612Z; git: 8da83308eeca865e495e53ef315a249d63ba9332; jvm 1.8.0_241-b07
25/06/21 03:39:24 INFO server.Server: Started @3903ms
25/06/21 03:39:24 INFO server.AbstractConnector: Started ServerConnector@2f4a9795{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
25/06/21 03:39:24 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@78b4468f{/jobs,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a05bffa{/jobs/json,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@27327a28{/jobs/job,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@28681368{/jobs/job/json,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1128b2de{/stages,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5f721d99{/stages/json,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@69bede41{/stages/stage,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5f9681b9{/stages/stage/json,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1acbd51a{/stages/pool,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@198ff579{/stages/pool/json,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c276600{/storage,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@415fc67c{/storage/json,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3e5f2770{/storage/rdd,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@480634c3{/storage/rdd/json,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5c301060{/environment,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79e3b995{/environment/json,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@209ce76a{/executors,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@15978f66{/executors/json,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@34bec70d{/executors/threadDump,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@94dd3ec{/executors/threadDump/json,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4e1f7856{/static,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5e864ef9{/,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7cfc3ab1{/api,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@57a41479{/jobs/job/kill,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2e673e9c{/stages/stage/kill,null,AVAILABLE,@Spark}
25/06/21 03:39:24 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://master:4040
25/06/21 03:39:24 INFO executor.Executor: Starting executor ID driver on host master
25/06/21 03:39:24 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39469.
25/06/21 03:39:24 INFO netty.NettyBlockTransferService: Server created on master:39469
25/06/21 03:39:24 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
25/06/21 03:39:24 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, master, 39469, None)
25/06/21 03:39:24 INFO storage.BlockManagerMasterEndpoint: Registering block manager master:39469 with 366.3 MiB RAM, BlockManagerId(driver, master, 39469, None)
25/06/21 03:39:24 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, master, 39469, None)
25/06/21 03:39:24 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, master, 39469, None)
25/06/21 03:39:24 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@72f89ab{/metrics/json,null,AVAILABLE,@Spark}
25/06/21 03:39:25 INFO internal.SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir.
25/06/21 03:39:25 INFO internal.SharedState: Warehouse path is 'file:/root/IdeaProjects/untitled/spark-warehouse'.
25/06/21 03:39:25 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6035fe3d{/SQL,null,AVAILABLE,@Spark}
25/06/21 03:39:25 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4e7501f{/SQL/json,null,AVAILABLE,@Spark}
25/06/21 03:39:25 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3725262b{/SQL/execution,null,AVAILABLE,@Spark}
25/06/21 03:39:25 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@47ac637{/SQL/execution/json,null,AVAILABLE,@Spark}
25/06/21 03:39:25 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c6e4ad8{/static/sql,null,AVAILABLE,@Spark}
Traceback (most recent call last):
  File "/root/IdeaProjects/untitled/test.py", line 8, in <module>
    df = spark.read.option("header", True).option("inferSchema", True).csv("hdfs://master:9000/usr/local/hadoop/clean_data_final.csv")
  File "/opt/spark3.2/python/lib/pyspark.zip/pyspark/sql/readwriter.py", line 410, in csv
  File "/opt/spark3.2/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py", line 1322, in __call__
  File "/opt/spark3.2/python/lib/pyspark.zip/pyspark/sql/utils.py", line 117, in deco
pyspark.sql.utils.AnalysisException: Path does not exist: hdfs://master:9000/usr/local/hadoop/clean_data_final.csv
25/06/21 03:39:26 INFO spark.SparkContext: Invoking stop() from shutdown hook
25/06/21 03:39:26 INFO server.AbstractConnector: Stopped Spark@2f4a9795{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
25/06/21 03:39:26 INFO ui.SparkUI: Stopped Spark web UI at http://master:4040
25/06/21 03:39:26 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
25/06/21 03:39:26 INFO memory.MemoryStore: MemoryStore cleared
25/06/21 03:39:26 INFO storage.BlockManager: BlockManager stopped
25/06/21 03:39:26 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
25/06/21 03:39:26 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
25/06/21 03:39:26 INFO spark.SparkContext: Successfully stopped SparkContext
25/06/21 03:39:26 INFO util.ShutdownHookManager: Shutdown hook called
25/06/21 03:39:26 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-3541346c-fa51-49e7-8bc4-702377c3536f
25/06/21 03:39:26 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-3541346c-fa51-49e7-8bc4-702377c3536f/pyspark-b60f643a-25ae-49bd-a26e-159c23c5ebb4
25/06/21 03:39:26 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-bf55404c-8e84-4f31-bd07-d3f8b28a2704
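For reference, the call that fails (test.py, line 8 in the traceback) boils down to the read below. The options and the HDFS path are copied from the traceback; the SparkSession setup around it is my reconstruction of the rest of the script:

from pyspark.sql import SparkSession

# app name taken from the "Submitted application" log line above
spark = SparkSession.builder.appName("NBAPlayerStatsAnalysis").getOrCreate()

# test.py line 8, as shown in the traceback
df = spark.read.option("header", True) \
    .option("inferSchema", True) \
    .csv("hdfs://master:9000/usr/local/hadoop/clean_data_final.csv")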
How do I fix this? The job fails with pyspark.sql.utils.AnalysisException: Path does not exist: hdfs://master:9000/usr/local/hadoop/clean_data_final.csv.
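My guess from the error message is that clean_data_final.csv only exists on the local disk under /usr/local/hadoop/ and was never uploaded to HDFS, so the path does not exist from the NameNode's point of view. Below is a minimal sketch of what I plan to try, assuming that guess is right; the hdfs dfs commands in the comments are the standard upload steps, and df.show(5) is just a placeholder for the real analysis:

from pyspark.sql import SparkSession
from pyspark.sql.utils import AnalysisException

spark = SparkSession.builder.appName("NBAPlayerStatsAnalysis").getOrCreate()
path = "hdfs://master:9000/usr/local/hadoop/clean_data_final.csv"

try:
    # same read as in test.py, with the path pulled out into a variable
    df = spark.read.option("header", True).option("inferSchema", True).csv(path)
except AnalysisException as e:
    # Path missing in HDFS: check and upload it first, e.g.
    #   hdfs dfs -ls /usr/local/hadoop/
    #   hdfs dfs -mkdir -p /usr/local/hadoop
    #   hdfs dfs -put /usr/local/hadoop/clean_data_final.csv /usr/local/hadoop/
    spark.stop()
    raise SystemExit(f"CSV not found in HDFS: {path}\n{e}")

df.show(5)   # placeholder action; the real analysis goes here
spark.stop()

Is uploading the file with hdfs dfs -put the right fix here, or is there something else I should check first (for example whether the NameNode address in core-site.xml really is master:9000)?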