spark
xlj3
静以修身，俭以养德 ("Tranquility cultivates the self; frugality nurtures virtue")
Spark development, day one
package com.dt.spark
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object WorkCount {
  def main(args: Array[String]) {
    /**
     * Step 1: create Spark's configuration object, SparkConf, and set the Spark run…

Original · 2016-10-31 11:00:03 · 332 views · 0 comments
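The excerpt above is cut off mid-comment, but it follows the standard Spark word-count pattern. A minimal runnable sketch of where it is likely headed, assuming local mode and an illustrative input path (the original post's actual path and remaining steps are not shown here):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object WordCountSketch {
  def main(args: Array[String]): Unit = {
    // Step 1: create the SparkConf configuration object and set the run mode.
    // local[*] is an assumption for demonstration; the post targets a cluster.
    val conf = new SparkConf().setAppName("WordCountSketch").setMaster("local[*]")

    // Step 2: create the SparkContext, the entry point to a Spark application.
    val sc = new SparkContext(conf)

    // Step 3: read a text file, split lines into words, count occurrences.
    // "input.txt" is a placeholder path, not from the original post.
    val counts = sc.textFile("input.txt")
      .flatMap(_.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}
```

The flatMap/map/reduceByKey chain is the idiomatic RDD formulation of word count on Spark 1.x, matching the SparkConf/SparkContext imports in the excerpt.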
Launching a jar on a Linux Spark cluster
Relative path:
./spark-submit --class com.dt.spark.WordCont_Cluster --master spark://7077 /root/Document/SparkApps/WordCont.jar

Absolute path:
/usr/local/spark/spark-1.6.0/bin/spark-submit --class com.dt.spark.WordCon…

Original · 2016-10-31 13:11:18 · 590 views · 0 comments
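Note that the excerpt's master URL, spark://7077, is missing its host part; a Spark standalone master URL takes the form spark://HOST:PORT. A sketch of a complete submission command, where the hostname "Master" is an assumption (substitute your own master host):

```shell
# Submit the packaged application jar to a standalone Spark cluster.
# SPARK_HOME matches the absolute path used in the post; the "Master"
# hostname is illustrative, not taken from the original.
SPARK_HOME=/usr/local/spark/spark-1.6.0

"$SPARK_HOME/bin/spark-submit" \
  --class com.dt.spark.WordCont_Cluster \
  --master spark://Master:7077 \
  /root/Documents/SparkApps/WordCont.jar
```

Running the relative-path form (./spark-submit) only works from inside $SPARK_HOME/bin; the absolute-path form works from any directory, which is why both appear in the post.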
Ascending and descending sort
package com.dt.spark
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD.rddToOrderedRDDFunctions
import org.apache.spark.rdd.RDD.rddToPairRDDFunct…

Original · 2016-10-31 15:17:12 · 455 views · 0 comments
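The imports above (rddToOrderedRDDFunctions, rddToPairRDDFunctions) are the implicit conversions that enable key-based sorting on pair RDDs in Spark 1.x. A minimal sketch of ascending and descending sort with sortByKey, using illustrative sample data not taken from the post:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SortDemo {
  def main(args: Array[String]): Unit = {
    // Local mode and the sample pairs below are assumptions for demonstration.
    val conf = new SparkConf().setAppName("SortDemo").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val pairs = sc.parallelize(Seq(("banana", 2), ("apple", 1), ("cherry", 3)))

    // Ascending sort by key (the default): apple, banana, cherry
    pairs.sortByKey(true).collect().foreach(println)

    // Descending sort by key: cherry, banana, apple
    pairs.sortByKey(false).collect().foreach(println)

    sc.stop()
  }
}
```

sortByKey takes a boolean ascending flag; on Spark 1.x it is made available on RDD[(K, V)] through exactly the implicit conversions imported in the excerpt (on later versions they are applied automatically).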