spark-submit job fails: MySQL connection refused, plus ClassNotFoundException for the main class

Scala + MySQL: "The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server."

The connection failure when submitting the job

I built the jar locally in IDEA and copied it to the virtual machine. Submitting the job there produced the following error (excerpt):

Exception in thread "main" java.sql.SQLNonTransientConnectionException: Could not create connection to database server. Attempted reconnect 3 times. Giving up.
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:110)
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:89)
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:63)
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:73)
        at com.mysql.cj.jdbc.ConnectionImpl.connectWithRetries(ConnectionImpl.java:906)
        at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:831)
        at com.mysql.cj.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:456)
        at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:246)
        at com.mysql.cj.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:199)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$createConnectionFactory$1(JdbcUtils.scala:64)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:56)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:226)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:35)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:339)
        at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:279)
        at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:268)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:268)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:203)
        at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:294)
        at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:336)
        at summ.ReqGenerateProdSrc_7$.main(ReqGenerateProdSrc_7.scala:50)
        at summ.ReqGenerateProdSrc_7.main(ReqGenerateProdSrc_7.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: com.mysql.cj.exceptions.CJCommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:61)
        at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:105)
        at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:151)
        at com.mysql.cj.exceptions.ExceptionFactory.createCommunicationsException(ExceptionFactory.java:167)
        at com.mysql.cj.protocol.a.NativeSocketConnection.connect(NativeSocketConnection.java:91)
        at com.mysql.cj.NativeSession.connect(NativeSession.java:144)
        at com.mysql.cj.jdbc.ConnectionImpl.connectWithRetries(ConnectionImpl.java:850)
        ... 30 more
Caused by: java.net.ConnectException: Connection refused
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:589)
        at com.mysql.cj.protocol.StandardSocketFactory.connect(StandardSocketFactory.java:155)
        at com.mysql.cj.protocol.a.NativeSocketConnection.connect(NativeSocketConnection.java:65)
        ... 32 more

Solution

I searched Baidu for this problem, and many bloggers offered solutions, but none of them worked for me. Below is the fix that did, along with the places where things can go wrong. The root cause of the final "Connection refused" is that nothing was listening at the host and port in the JDBC URL: once the job runs on the virtual machine, localhost/127.0.0.1 points at the VM itself, not at the machine where MySQL actually runs.

  1. In the JDBC URL of the data-source connection code written in IDEA, enable automatic reconnection by appending two parameters: autoReconnect=true and failOverReadOnly=false. For example:

val url = "jdbc:mysql://172.1.3.111:3306/alm_result_0?useUnicode=true&characterEncoding=UTF-8&autoReconnect=true&failOverReadOnly=false&serverTimezone=UTC"

     In my setup the virtual machine connects back to my local machine, where MySQL runs, so the IP in the URL also had to be changed to my host machine's IP.
  2. Comment out setMaster("local[*]") and .master("local[*]"). A master hard-coded in the application takes precedence over the --master flag passed to spark-submit, so leaving it in keeps the job in local mode:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

val conf: SparkConf = new SparkConf()
//  .setMaster("local[*]")
    .setAppName("ReqGenerateProdSrc_7")
val sc = new SparkContext(conf)
val sparkSession: SparkSession = SparkSession.builder()
//  .master("local[*]")
    .config(conf)
    .getOrCreate()
  3. If the submission keeps failing with a ClassNotFoundException even after you have verified that the database connection is not the problem, suspect the jar itself. When configuring the jar Artifact in IDEA, remove the extra dependency jars and keep only the module output that the main class belongs to; if a dependency is later reported missing, add it back then. (See the sketch after this list for a quick way to confirm the MySQL driver class is still on the classpath.)
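
For reference, here is a minimal sketch that puts the fixes above together: no hard-coded master, the autoReconnect URL, and a fail-fast check that the MySQL driver class made it into the jar. The table name and credentials are placeholder assumptions, not values from the original job:

import java.util.Properties
import org.apache.spark.sql.{DataFrame, SparkSession}

object JdbcReadSketch {
  def main(args: Array[String]): Unit = {
    // No .master(...) here: the master comes from spark-submit (--master yarn).
    val spark: SparkSession = SparkSession.builder()
      .appName("ReqGenerateProdSrc_7")
      .getOrCreate()

    // Fails fast with ClassNotFoundException if the driver jar was
    // stripped from the artifact, instead of deep inside the JDBC read.
    Class.forName("com.mysql.cj.jdbc.Driver")

    val url = "jdbc:mysql://172.1.3.111:3306/alm_result_0?useUnicode=true&characterEncoding=UTF-8&autoReconnect=true&failOverReadOnly=false&serverTimezone=UTC"

    val props = new Properties()
    props.setProperty("user", "root")       // assumed credentials
    props.setProperty("password", "******") // replace with real ones
    props.setProperty("driver", "com.mysql.cj.jdbc.Driver")

    // "some_table" is a placeholder table name.
    val df: DataFrame = spark.read.jdbc(url, "some_table", props)
    df.show(10)

    spark.stop()
  }
}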

The spark-submit command

bin/spark-submit \
--class summ.ReqGenerateProdSrc_7 \
--master yarn \
--deploy-mode client \
--driver-memory 2g \
--executor-memory 2g \
--num-executors 2 \
--executor-cores 24 \
/opt/module/jar/batch.jar
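
One detail worth stressing: spark-submit treats everything after the application jar as arguments to the application itself, so the resource flags must come before /opt/module/jar/batch.jar, as shown above. If you would rather keep the artifact slim instead of trimming it by hand, an alternative is to ship the driver separately with --jars, e.g. --jars /opt/module/jar/mysql-connector-java-8.0.28.jar (path and version here are illustrative).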

Have you slacked off at work today~ Comments welcome.
