Fixing "value toDF is not a member of org.apache.spark.rdd.RDD[names]" when an RDD cannot call toDF

The implicit imports must be in scope before toDF can be called:

  val spark = SparkSession.builder().appName("ch2homework1").master("local[4]").getOrCreate()
  import spark.implicits._ // without this implicit import, toDF is unavailable

The code as originally compiled:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

object homework01 {

  def main(args: Array[String]): Unit = {

    val conf = new SparkConf().setAppName("ch2homework01").setMaster("local[4]")
    val sc = new SparkContext(conf)
    val spark = SparkSession.builder().appName("ch2homework1").master("local[4]").getOrCreate()
    case class names(name: String, count: Int) // defined inside main -- this is the problem
    import spark.implicits._ // without this implicit import, toDF is unavailable
    val data = sc.textFile("file:///E://FTP//spark//2-sparkCore1//kddcup.data.gz")
      .flatMap(_.split("\n"))
      .map(line => names(line.split(",").reverse(0), 1))
      .toDF()
    data.show()
  }
}

This fails with:

Error:(31, 46) value toDF is not a member of org.apache.spark.rdd.RDD[names]

Solution:
Move case class names(name: String, count: Int) out of the method body that uses it. toDF needs an implicit Encoder for names, and Spark can only derive one (via a TypeTag) for a case class defined at a stable scope such as an object; a case class declared inside a method is a local class, so implicit resolution fails and the conversion that provides toDF never applies.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

object homework01 {

  case class names(name: String, count: Int) // now at object scope

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ch2homework01").setMaster("local[4]")
    val sc = new SparkContext(conf)
    val spark = SparkSession.builder().appName("ch2homework1").master("local[4]").getOrCreate()
    import spark.implicits._
    val data = sc.textFile("file:///E://FTP//spark//2-sparkCore1//kddcup.data.gz")
      .flatMap(_.split("\n"))
      .map(line => names(line.split(",").reverse(0), 1))
      .toDF()

    data.show()
  }
}
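As a variant, the case class can be skipped entirely: with spark.implicits._ in scope, an RDD of tuples also gains toDF, and the column names are passed explicitly. A minimal sketch under the same assumed setup (TupleToDF and the lastField helper are hypothetical names; lastField mirrors line.split(",").reverse(0) from the code above):

```scala
import org.apache.spark.sql.SparkSession

object TupleToDF {

  // Extracts the last comma-separated field of a line,
  // equivalent to line.split(",").reverse(0) in the original code.
  def lastField(line: String): String = line.split(",").last

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("tupleToDF").master("local[4]").getOrCreate()
    import spark.implicits._ // provides toDF on RDDs of tuples as well

    val df = spark.sparkContext
      .textFile("file:///E://FTP//spark//2-sparkCore1//kddcup.data.gz")
      .map(line => (lastField(line), 1))
      .toDF("name", "count") // column names supplied here instead of via a case class

    df.show()
    spark.stop()
  }
}
```

Because the tuple's Encoder is derived from built-in types, there is no case class whose scope can break implicit resolution.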