Testing SparkSQL Connectivity to MySQL 8

A quick experiment connecting SparkSQL to MySQL 8.

1. Dependencies

<!-- no version was specified in the original; MySQL 8 requires an 8.x Connector/J -->
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>2.4.2</version>
</dependency>

2. MySQL Table Schema
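The schema screenshot from the original post is missing. Based on the query in the test code below (which selects `userName` and `password` from `test.t_user`), a minimal compatible table might look like this; the column types, lengths, and primary key are assumptions:

```sql
-- Hypothetical reconstruction of test.t_user; only the userName and
-- password columns are confirmed by the query in section 3.
CREATE TABLE test.t_user (
    id       BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
    userName VARCHAR(64)     NOT NULL,
    password VARCHAR(128)    NOT NULL,
    PRIMARY KEY (id)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;
```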

3. Test Code

// required imports
import java.util.Properties;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public static void test() throws Exception {
    // `val` is Scala syntax; in Java this must be a typed declaration
    String url = "jdbc:mysql://127.0.0.1:3306/test?serverTimezone=GMT%2B8&useUnicode=true&characterEncoding=UTF-8&allowMultiQueries=true";

    SparkSession spark = SparkSession.builder()
            .appName("Spark")
            .master("local[2]") // if you have a Spark cluster configured, set its master here
            .getOrCreate();

    Properties props = new Properties();
    props.put("user", "your MySQL username");
    props.put("password", "your MySQL password");
    props.put("customSchema", "userName STRING, password STRING");

    // wrap the query in a subselect so Spark can treat it as a table
    Dataset<Row> dataset2 = spark.read()
            .jdbc(url, "(select userName, password from test.t_user) t", props);
    dataset2.show();
}
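As a side note not in the original post: the `%2B` in the JDBC URL is just the percent-encoded `+` of `GMT+8` — if it is passed unencoded, the driver misparses the timezone parameter. A small stdlib-only sketch of assembling such a URL (the `JdbcUrlBuilder` class and `build` method are hypothetical helpers, not part of any library):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.StringJoiner;

public class JdbcUrlBuilder {
    // Builds a MySQL JDBC URL, percent-encoding each parameter value
    // so that e.g. "GMT+8" becomes "GMT%2B8".
    static String build(String host, int port, String db, Map<String, String> params)
            throws UnsupportedEncodingException {
        StringJoiner query = new StringJoiner("&");
        for (Map.Entry<String, String> e : params.entrySet()) {
            query.add(e.getKey() + "=" + URLEncoder.encode(e.getValue(), "UTF-8"));
        }
        return "jdbc:mysql://" + host + ":" + port + "/" + db + "?" + query;
    }

    public static void main(String[] args) throws Exception {
        Map<String, String> params = new LinkedHashMap<>(); // preserves insertion order
        params.put("serverTimezone", "GMT+8");
        params.put("useUnicode", "true");
        params.put("characterEncoding", "UTF-8");
        params.put("allowMultiQueries", "true");
        // → jdbc:mysql://127.0.0.1:3306/test?serverTimezone=GMT%2B8&useUnicode=true&characterEncoding=UTF-8&allowMultiQueries=true
        System.out.println(build("127.0.0.1", 3306, "test", params));
    }
}
```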

4. Since I do not have Hadoop installed, I expect an error like the one below. (In a local run this is usually a non-fatal warning; pointing the `HADOOP_HOME` environment variable at an existing Hadoop installation makes it go away.)

java.io.IOException: Hadoop home directory /usr/local/Cellar/hadoop/3.1.0 does not exist, is not a directory, or is not an absolute path.
    at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:335)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:350)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
    at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:116)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:93)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:73)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:293)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
    at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2422)
    at scala.Option.getOrElse(Option.scala:138)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2422)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:293)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:935)
    at scala.Option.getOrElse(Option.scala:138)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at demo.spark.sparksql.mysql.ConnectMySQL.test(ConnectMySQL.java:34)
    at demo.spark.sparksql.mysql.ConnectMySQL.main(ConnectMySQL.java:19)

5. Output: `dataset2.show()` prints the `userName` and `password` columns of `t_user` as a console table (the original screenshot is not reproduced here).
