spark
PeppaKing
[Spark][spark_streaming]#5_spark_streaming&spark_sql
```scala
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext, Time}

/**
 * Spark Streaming整...
```
Original · 2019-12-18 16:31:05 · 73 views · 0 comments
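The preview above is truncated. As a minimal sketch of the pattern this post covers — registering each Spark Streaming micro-batch as a temporary view and querying it with Spark SQL — assuming a socket source on localhost:9999 and a view named `words` (both illustrative, not taken from the original post):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingSqlWordCount {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setMaster("local[2]").setAppName("StreamingSqlWordCount")
    val ssc = new StreamingContext(sparkConf, Seconds(5))

    val lines = ssc.socketTextStream("localhost", 9999)
    val words = lines.flatMap(_.split(" "))

    words.foreachRDD { rdd =>
      // Build (or reuse) a SparkSession from the RDD's configuration
      val spark = SparkSession.builder().config(rdd.sparkContext.getConf).getOrCreate()
      import spark.implicits._

      // Turn the micro-batch into a DataFrame and query it with SQL
      val wordsDF = rdd.toDF("word")
      wordsDF.createOrReplaceTempView("words")
      spark.sql("SELECT word, COUNT(*) AS total FROM words GROUP BY word").show()
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Because `getOrCreate()` is called inside `foreachRDD`, the same SparkSession is reused across batches instead of being rebuilt every interval.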
[Spark][spark_streaming]#4_Transform
```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

/**
 * Blacklist filtering
 */
object TransformApp {
  def main(args: Array[String]): Unit = {
    val sparkConf...
```
Original · 2019-12-18 16:19:29 · 55 views · 0 comments
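A sketch of the `transform`-based blacklist filter the title refers to: join each batch against a static blacklist RDD and drop the matches. The input format ("time,user"), the blacklist contents, and the socket address are assumptions for illustration.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object TransformApp {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setMaster("local[2]").setAppName("TransformApp")
    val ssc = new StreamingContext(sparkConf, Seconds(5))

    // Blacklisted user names, paired with true so a join can flag them
    val blacklist = ssc.sparkContext.parallelize(Seq("tom", "jerry")).map((_, true))

    // Input lines are assumed to look like "20191218,tom"
    val logs = ssc.socketTextStream("localhost", 9999)
    val filtered = logs
      .map(line => (line.split(",")(1), line))   // key each log line by user name
      .transform { rdd =>
        rdd.leftOuterJoin(blacklist)             // (user, (line, Option[flag]))
          .filter(_._2._2.getOrElse(false) == false)
          .map(_._2._1)                          // keep only non-blacklisted lines
      }

    filtered.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```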
[Spark][spark_streaming]#3_Writing results to MySQL
```scala
import java.sql.DriverManager
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

/**
 * Use Spark Streaming for a word-frequency count and write the results into a MySQL database
 */
object ForeachRDDA...
```
Original · 2019-12-18 16:18:14 · 69 views · 0 comments
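A sketch of how the truncated `ForeachRDDA...` object plausibly continues: compute word counts, then write them out with `foreachRDD`/`foreachPartition` so one JDBC connection is opened per partition. The JDBC URL, credentials, and `wordcount(word, cnt)` table are assumptions, and the MySQL driver must be on the classpath.

```scala
import java.sql.DriverManager
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ForeachRDDApp {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setMaster("local[2]").setAppName("ForeachRDDApp")
    val ssc = new StreamingContext(sparkConf, Seconds(5))

    val counts = ssc.socketTextStream("localhost", 9999)
      .flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)

    counts.foreachRDD { rdd =>
      rdd.foreachPartition { partition =>
        // One connection per partition: a JDBC connection cannot be serialized to executors
        val conn = DriverManager.getConnection(
          "jdbc:mysql://localhost:3306/spark", "root", "root")
        partition.foreach { case (word, count) =>
          val stmt = conn.prepareStatement("INSERT INTO wordcount(word, cnt) VALUES (?, ?)")
          stmt.setString(1, word)
          stmt.setInt(2, count)
          stmt.executeUpdate()
          stmt.close()
        }
        conn.close()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```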
[Spark][spark_streaming]#2_Stateful
```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

/**
 * Use Spark Streaming for stateful counting (count every word seen so far, keeping the previous state)
 */
object StatefulWordCount {
  def main(...
```
Original · 2019-12-18 16:17:05 · 72 views · 0 comments
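A minimal sketch of stateful word counting with `updateStateByKey`, which is what this post's `StatefulWordCount` is about; the checkpoint directory and socket address are assumptions.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StatefulWordCount {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setMaster("local[2]").setAppName("StatefulWordCount")
    val ssc = new StreamingContext(sparkConf, Seconds(5))
    ssc.checkpoint("/tmp/stateful-wordcount")   // stateful operators need a checkpoint directory

    // Merge this batch's counts with the running total kept for each word
    def updateState(current: Seq[Int], previous: Option[Int]): Option[Int] =
      Some(current.sum + previous.getOrElse(0))

    val counts = ssc.socketTextStream("localhost", 9999)
      .flatMap(_.split(" ")).map((_, 1))
      .updateStateByKey[Int](updateState _)

    counts.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```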
[Spark][spark_streaming]#1_QuickStart
spark-submit
```bash
bin/spark-submit --master local[2] \
  --class org.apache.spark.examples.streaming.NetworkWordCount \
  --name NetworkWordCount \
  /root/spark-2.4.3-bin-2.6.0-cdh5.15.1/examples/jars/spark-examples_2...
```
Original · 2019-12-17 18:36:28 · 71 views · 0 comments
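For reference, a self-contained version of the NetworkWordCount program being submitted above; hard-coding `localhost:9999` instead of reading host and port from `args` is an assumption made here for brevity. Feed it with `nc -lk 9999`.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}

object NetworkWordCount {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
    val ssc = new StreamingContext(sparkConf, Seconds(1))

    // Text received over the socket, one line per record
    val lines = ssc.socketTextStream("localhost", 9999, StorageLevel.MEMORY_AND_DISK_SER)
    val wordCounts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    wordCounts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```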
[Spark][spark_sql]#4_DataSource API
```scala
import java.util.Properties
import com.typesafe.config.ConfigFactory
import org.apache.spark.sql.{DataFrame, Dataset, SaveMode, SparkSession}

object DataSourceApp {
  def main(args: Array[String])...
```
Original · 2019-10-29 16:48:52 · 67 views · 0 comments
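A sketch of typical DataSource API usage along the lines of this post: read one format, write another, and push a table to MySQL over JDBC. The file paths, JDBC URL, and credentials are placeholders rather than the post's actual config, and the Typesafe `ConfigFactory` loading seen in the imports is omitted here.

```scala
import java.util.Properties
import org.apache.spark.sql.{SaveMode, SparkSession}

object DataSourceApp {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[2]").appName("DataSourceApp").getOrCreate()

    // Read a JSON source into a DataFrame
    val people = spark.read.format("json").load("file:///tmp/people.json")

    // Write it back out as Parquet, overwriting any previous output
    people.write.mode(SaveMode.Overwrite).parquet("file:///tmp/people_parquet")

    // Write the same data to MySQL through the JDBC data source
    val props = new Properties()
    props.put("user", "root")
    props.put("password", "root")
    people.write.mode(SaveMode.Append)
      .jdbc("jdbc:mysql://localhost:3306/spark", "people", props)

    spark.stop()
  }
}
```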
[Spark][spark_sql]#3_SparkSQL API
SparkSession
```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object SparkSessionApp {
  def main(args: Array[String]): Unit = {
    // Entry point for DataFrame/Dataset programming
    val spark: SparkSession = SparkSession.b...
```
Original · 2019-10-29 15:30:11 · 170 views · 0 comments
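A runnable sketch of how the truncated `SparkSessionApp` plausibly continues: build the session, read a JSON file, and inspect it. The `people.json` path is an assumption.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object SparkSessionApp {
  def main(args: Array[String]): Unit = {
    // SparkSession is the single entry point for DataFrame/Dataset programming
    val spark: SparkSession = SparkSession.builder()
      .master("local[2]")
      .appName("SparkSessionApp")
      .getOrCreate()

    val people: DataFrame = spark.read.json("file:///tmp/people.json")
    people.printSchema()
    people.show()

    spark.stop()
  }
}
```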
[Spark][spark_ml]#5_projects
Text sentiment analysis
```scala
object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("SA")
    val spark = SparkSession.builder().config(conf).getOrCreate()
    sp...
```
Original · 2019-06-11 21:53:42 · 116 views · 0 comments
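A hedged sketch of a text sentiment classifier with the spark.ml Pipeline API (Tokenizer → HashingTF → LogisticRegression); the inline toy training data stands in for whatever corpus the original project used.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.{HashingTF, Tokenizer}
import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("SA")
    val spark = SparkSession.builder().config(conf).getOrCreate()

    // Toy labelled sentences: 1.0 = positive, 0.0 = negative
    val training = spark.createDataFrame(Seq(
      ("good movie, really liked it", 1.0),
      ("terrible plot and bad acting", 0.0),
      ("wonderful and touching", 1.0),
      ("boring, fell asleep", 0.0)
    )).toDF("text", "label")

    val tokenizer = new Tokenizer().setInputCol("text").setOutputCol("words")
    val hashingTF = new HashingTF().setInputCol("words").setOutputCol("features")
    val lr = new LogisticRegression().setMaxIter(10)

    val model = new Pipeline().setStages(Array(tokenizer, hashingTF, lr)).fit(training)
    model.transform(training).select("text", "prediction").show(false)

    spark.stop()
  }
}
```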
[Spark][spark_ml]#4_PCA dimensionality reduction
```scala
object PCA {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("iris")
    val spark = SparkSession.builder().config(conf).getOrCreate()
    spark...
```
Original · 2019-06-11 18:48:22 · 260 views · 0 comments
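A short PCA sketch in the same spirit, using a few hard-coded iris-like rows instead of the dataset loading that the preview cuts off; the vectors and `k = 2` are illustrative assumptions.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.ml.feature.PCA
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.sql.SparkSession

object PCAApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("iris")
    val spark = SparkSession.builder().config(conf).getOrCreate()

    // A handful of 4-dimensional feature vectors standing in for the iris data
    val data = Seq(
      Vectors.dense(5.1, 3.5, 1.4, 0.2),
      Vectors.dense(4.9, 3.0, 1.4, 0.2),
      Vectors.dense(6.3, 3.3, 6.0, 2.5)
    )
    val df = spark.createDataFrame(data.map(Tuple1.apply)).toDF("features")

    // Project the 4-dimensional features down to 2 principal components
    val pca = new PCA().setInputCol("features").setOutputCol("pcaFeatures").setK(2).fit(df)
    pca.transform(df).select("pcaFeatures").show(false)

    spark.stop()
  }
}
```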
[Spark][spark_ml]#3_Clustering algorithms
```scala
object KMeans {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("iris")
    val spark = SparkSession.builder().config(conf).getOrCreate()
    va...
```
Original · 2019-06-11 18:32:32 · 196 views · 0 comments
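A minimal k-means sketch with spark.ml; the toy 2-D vectors and `k = 2` are assumptions standing in for the iris data loading that the preview truncates.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.ml.clustering.KMeans
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.sql.SparkSession

object KMeansApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("iris")
    val spark = SparkSession.builder().config(conf).getOrCreate()

    // Two obvious clusters around (0,0) and (9,9)
    val data = Seq(
      Vectors.dense(0.0, 0.0), Vectors.dense(0.1, 0.1),
      Vectors.dense(9.0, 9.0), Vectors.dense(9.2, 9.1)
    )
    val df = spark.createDataFrame(data.map(Tuple1.apply)).toDF("features")

    // Cluster into k = 2 groups and print the learned centers
    val model = new KMeans().setK(2).setSeed(1L).fit(df)
    model.clusterCenters.foreach(println)
    model.transform(df).show(false)

    spark.stop()
  }
}
```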
[Spark][spark_ml]#2_Classification algorithms
```scala
object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("iris")
    val spark = SparkSession.builder().config(conf).getOrCreate()
    spar...
```
Original · 2019-06-11 18:30:16 · 250 views · 0 comments
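A sketch of a spark.ml classifier on iris-like rows (logistic regression here; the original post may have used a different algorithm). The inline sample data is an assumption.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.sql.SparkSession

object ClassificationApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("iris")
    val spark = SparkSession.builder().config(conf).getOrCreate()

    // Label 0.0 vs 1.0, each with a 4-dimensional feature vector
    val data = spark.createDataFrame(Seq(
      (0.0, Vectors.dense(5.1, 3.5, 1.4, 0.2)),
      (0.0, Vectors.dense(4.9, 3.0, 1.4, 0.2)),
      (1.0, Vectors.dense(6.3, 3.3, 6.0, 2.5)),
      (1.0, Vectors.dense(5.8, 2.7, 5.1, 1.9))
    )).toDF("label", "features")

    val model = new LogisticRegression().setMaxIter(10).fit(data)
    model.transform(data).select("label", "prediction").show()

    spark.stop()
  }
}
```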
[Spark][spark_ml]#1_Regression algorithms
```scala
object LinearRegression {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("linear").setMaster("local")
    val sc = new SparkContext(conf)
    val spark = SparkSes...
```
Original · 2019-06-11 17:35:29 · 103 views · 0 comments
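A sketch of spark.ml `LinearRegression` on a few hand-made points (roughly y = 2x); the data and hyperparameters are illustrative assumptions.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.ml.regression.LinearRegression
import org.apache.spark.sql.SparkSession

object LinearRegressionApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("linear").setMaster("local")
    val spark = SparkSession.builder().config(conf).getOrCreate()

    // (label, features) pairs that roughly follow y = 2x
    val data = spark.createDataFrame(Seq(
      (2.0, Vectors.dense(1.0)),
      (4.1, Vectors.dense(2.0)),
      (6.0, Vectors.dense(3.0)),
      (7.9, Vectors.dense(4.0))
    )).toDF("label", "features")

    val model = new LinearRegression().setMaxIter(50).setRegParam(0.01).fit(data)
    println(s"coefficients=${model.coefficients} intercept=${model.intercept}")

    spark.stop()
  }
}
```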
[Spark][spark_sql]#2_DataFrame&Dataset
```scala
/**
 * Basic DataFrame API operations
 */
object DataFrameApp {
  def main(args: Array[String]) {
    val spark = SparkSession.builder().appName("DataFrameApp").master("local[2]").getOrCreate()
    // Load the JSON file...
```
Original · 2019-06-09 16:53:26 · 109 views · 0 comments
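A sketch of the basic DataFrame operations this post walks through (schema, select, filter, groupBy) plus a typed Dataset conversion; the `people.json` path and the `Person` fields are assumptions.

```scala
import org.apache.spark.sql.SparkSession

object DataFrameApp {
  case class Person(name: String, age: Long)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("DataFrameApp").master("local[2]").getOrCreate()
    import spark.implicits._

    // Load a JSON file as a DataFrame
    val df = spark.read.json("file:///tmp/people.json")

    df.printSchema()
    df.select($"name", $"age" + 1).show()
    df.filter($"age" > 21).show()
    df.groupBy("age").count().show()

    // Convert to a strongly typed Dataset
    val ds = df.as[Person]
    ds.map(_.name.toUpperCase).show()

    spark.stop()
  }
}
```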
[Spark][spark_sql]#1_Getting started with SparkSQL
Using SQLContext
```scala
object SQLContextApp {
  def main(args: Array[String]): Unit = {
    val path = "../resources/people.json"
    // 1. Create the corresponding Context
    val sparkConf = new SparkConf()
    sparkConf.setApp...
```
Original · 2019-06-09 16:00:38 · 77 views · 0 comments
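A sketch of the pre-2.0 `SQLContext` entry point this post introduces (it still works in Spark 2.x, though `SparkSession` is now preferred); the relative `people.json` path follows the preview.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SQLContextApp {
  def main(args: Array[String]): Unit = {
    val path = "../resources/people.json"

    // 1. Create the contexts
    val sparkConf = new SparkConf().setAppName("SQLContextApp").setMaster("local[2]")
    val sc = new SparkContext(sparkConf)
    val sqlContext = new SQLContext(sc)

    // 2. Load JSON and run DataFrame operations
    val people = sqlContext.read.format("json").load(path)
    people.printSchema()
    people.show()

    // 3. Stop the context
    sc.stop()
  }
}
```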
[Spark][spark_core]#1_Getting started with Spark
```bash
[root@node00 sbin]# spark-shell --master local[2]
```
```scala
val file = spark.sparkContext.textFile("file:///usr/local/wc.txt")
val wordCounts = file.flatMap(line => line.split(",")).map(word => (word, 1))...
```
Original · 2019-06-08 20:02:02 · 75 views · 0 comments
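Continuing the truncated spark-shell session, a complete word count might look like the following; the output path is an assumption.

```scala
// Run inside spark-shell, where `spark` is the pre-built SparkSession
val file = spark.sparkContext.textFile("file:///usr/local/wc.txt")
val wordCounts = file
  .flatMap(line => line.split(","))   // split each comma-separated line into words
  .map(word => (word, 1))             // pair each word with a count of 1
  .reduceByKey(_ + _)                 // sum the counts per word

wordCounts.collect().foreach(println)
// Optionally persist the result to disk
wordCounts.saveAsTextFile("file:///usr/local/wc_out")
```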
[Spark][spark_core]#0_Installing Spark
Build and install

Local mode:
```bash
[root@node00 java]# spark-shell --master local[2]
```
http://192.168.106.100:4040/jobs/

Standalone mode:
```bash
cp conf/spark-env.sh.template conf/spark-env.sh
vi conf/spark-env.sh
SPARK_MASTER_HOST=localhost
S...
```
Original · 2019-06-08 19:53:18 · 83 views · 0 comments
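Once the standalone master and a worker are running, a quick Scala smoke test against `spark://localhost:7077` (matching `SPARK_MASTER_HOST=localhost` above; 7077 is the standalone default port, assumed here) confirms the cluster executes jobs.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object StandaloneSmokeTest {
  def main(args: Array[String]): Unit = {
    // Point the application at the standalone master configured in spark-env.sh
    val conf = new SparkConf()
      .setAppName("StandaloneSmokeTest")
      .setMaster("spark://localhost:7077")
    val sc = new SparkContext(conf)

    // A trivial job: if this prints 5050.0, the cluster is scheduling work
    println(sc.parallelize(1 to 100).sum())

    sc.stop()
  }
}
```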