Recently I was using Scala with Spark and wanted to print the contents of an RDD to track down a bug.
This led to a problem: in local mode the output appeared as expected, but in cluster mode the same code printed nothing.
I tried several approaches, including debugging in IDEA and plain println output, and none of them worked.
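For reference, this is roughly the pattern that printed nothing on the driver for me; the names sc and data here are just illustrative, not from any particular project:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setAppName("PrintRddDemo")
    val sc = new SparkContext(conf)

    val data = sc.parallelize(Seq(1, 2, 3, 4, 5))
    // foreach runs on the executors, so println writes to each
    // executor's stdout, not to the driver's console:
    data.foreach(println)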
It turns out the official documentation already explains this behavior:
Printing elements of an RDD
Another common idiom is attempting to print out the elements of an RDD using rdd.foreach(println) or rdd.map(println). On a single machine, this will generate the expected output and print all the RDD’s elements. However, in cluster mode, the output to stdout being called by the executors is now writing to the executor’s stdout instead, not the one on the driver, so stdout on the driver won’t show these! To print all elements on the driver, one can use the collect() method to first bring the RDD to the driver node thus: rdd.collect().foreach(println). This can cause the driver to run out of memory, though, because collect() fetches the entire RDD to a single machine; if you only need to print a few elements of the RDD, a safer approach is to use the take(): rdd.take(100).foreach(println).
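In other words, bring the data back to the driver before printing. A minimal sketch of the two fixes the docs suggest, continuing from the data RDD in the snippet above:

    // Fetches the entire RDD to the driver before printing.
    // Fine for small data, but can exhaust driver memory on large RDDs:
    data.collect().foreach(println)

    // Safer when a sample is enough; 100 is an arbitrary element count:
    data.take(100).foreach(println)

Both collect() and take() return a local Array on the driver, so the subsequent foreach(println) runs on the driver and its output shows up in the driver's console as expected.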