from pyspark.context import SparkContext
from pyspark.sql.session import SparkSession

sc = SparkContext('local')
spark = SparkSession(sc)
df = spark.read.csv('aaa.csv')
Calling spark.read.csv raises NameError: name 'spark' is not defined. Note that in the snippet above, `spark = SparkSession(sc)` and `df = ...` were run on the same line with no separator, so the assignment to `spark` never actually executed as a separate statement.