Integrating Spark with MySQL

Add the MySQL dependency (along with the Spark modules used here) to the Maven pom.xml:
      <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
       <dependency>
           <groupId>org.apache.spark</groupId>
           <artifactId>spark-core_2.11</artifactId>
           <version>2.4.3</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
       <dependency>
           <groupId>org.apache.spark</groupId>
           <artifactId>spark-sql_2.11</artifactId>
           <version>2.4.3</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
       <dependency>
           <groupId>org.apache.spark</groupId>
           <artifactId>spark-streaming_2.11</artifactId>
           <version>2.4.3</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-mllib -->
       <dependency>
           <groupId>org.apache.spark</groupId>
           <artifactId>spark-mllib_2.11</artifactId>
           <version>2.4.3</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/com.alibaba/fastjson -->
       <dependency>
           <groupId>com.alibaba</groupId>
           <artifactId>fastjson</artifactId>
           <version>1.2.58</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/mysql/mysql-connector-java -->
       <dependency>
           <groupId>mysql</groupId>
           <artifactId>mysql-connector-java</artifactId>
           <version>5.1.47</version>
       </dependency>

Writing data from Spark to MySQL

Reference: the Spark official documentation (JDBC to other databases).

1. Create the target table in MySQL (a minimal sketch of doing this from Scala over plain JDBC follows this list).
2. Write the Spark job:
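Step 1 is usually done directly in a MySQL client; the sketch below does the same thing from Scala over plain JDBC. The host, database, table name, and columns are placeholders and should be adjusted to match the fields of the log records.

    // Sketch of step 1: create the target table over plain JDBC (placeholder names throughout)
    import java.sql.DriverManager

    val conn = DriverManager.getConnection("jdbc:mysql://ip:3306/dbname", "us", "pswd")
    val stmt = conn.createStatement()
    stmt.executeUpdate(
      """CREATE TABLE IF NOT EXISTS dbtable (
        |  id  BIGINT,
        |  msg VARCHAR(255)
        |)""".stripMargin)
    stmt.close()
    conn.close()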
    import org.apache.spark.sql.{SaveMode, SparkSession}

    // case class XX(...)  -- a case class whose fields match the MySQL table columns
    val spark = SparkSession.builder().master("local").appName("log").getOrCreate()
    val sc = spark.sparkContext

    // Read the source file and parse each line into an XX instance:
    // val fileRDD = sc.textFile(path)
    // val lineRDD = fileRDD.<transformations needed for parsing>
    val log = lineRDD.map(x => XX(/* fields parsed from x */))

    // Spark builds a DataFrame from an RDD of case-class instances by reflection
    val rlog = spark.createDataFrame(log)

    rlog.write
      .mode(SaveMode.Append)
      .format("jdbc")
      .option("url", "jdbc:mysql://ip:3306/dbname")   // the JDBC URL should include the database name
      .option("driver", "com.mysql.jdbc.Driver")      // driver class for mysql-connector-java 5.1.x
      .option("dbtable", "dbtable")                   // target table
      .option("user", "us")
      .option("password", "pswd")
      .save()

    spark.stop()   // also stops the underlying SparkContext
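To confirm the rows landed, the table can be read back through the same JDBC connector before the session is stopped. A minimal sketch, assuming the same placeholder connection details as above:

    // Read the table back to verify the write (run before spark.stop())
    val readBack = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://ip:3306/dbname")
      .option("driver", "com.mysql.jdbc.Driver")
      .option("dbtable", "dbtable")
      .option("user", "us")
      .option("password", "pswd")
      .load()

    readBack.show(10)            // sample of the written rows
    println(readBack.count())    // should match rlog.count()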