A Spark Streaming job does not receive any data from the corresponding HDFS directory at runtime.

Example code:
import org.apache.spark.SparkConf
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.{Seconds, StreamingContext}
object HDFSInputDStreamDemo extends App {
  val conf: SparkConf = new SparkConf().setMaster("local[*]").setAppName("hdfsDemo")
  val ssc = new StreamingContext(conf, Seconds(5))
  // Create an input stream that reads new files from the file system
  val lines: DStream[String] = ssc.textFileStream("hdfs://192.168.222.115:9000/data")
  val wordcounts: DStream[(String, Int)] = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
  wordcounts.print()
  ssc.start()
  ssc.awaitTermination()
}
After uploading data to the corresponding HDFS directory, the application running in IDEA still receives nothing.
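A likely cause is how `textFileStream` detects files: it only picks up files whose modification time falls after the stream has started, and it expects each file to appear in the monitored directory atomically. Files that already existed before `ssc.start()`, files still being written, or data appended to an existing file are ignored. A common workaround is to write the file to a staging directory first and then move (rename) it into the monitored directory, since a rename within HDFS is atomic. A minimal sketch, assuming the `/data` path from the code above and a hypothetical staging directory `/staging`:

```shell
# Upload the file to a staging directory outside the monitored path first
hdfs dfs -put words.txt /staging/words.txt

# Then rename it into the monitored directory; the rename is atomic,
# so the stream sees a complete file with a fresh modification time
hdfs dfs -mv /staging/words.txt /data/words.txt
```

If the file is moved in while the job is running, the word counts should appear in the next 5-second batch. Also verify that the NameNode address and port (`hdfs://192.168.222.115:9000`) match the cluster's `fs.defaultFS` setting, since a mismatch fails silently from the stream's point of view.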