Spark jar not found: fixing the missing MySQL JDBC driver

Today, while using a DataFrame to insert an RDD into MySQL, the job kept failing with the following exception:
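For context, the job did something along the following lines. This is a minimal sketch, not the original code: the table name, URL, and data are made up, and `createJDBCTable` is the Spark 1.3-era `DataFrame` API that the stack trace below points at (later replaced by `df.write.jdbc`):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SparkToJDBC {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SparkToJDBC"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Build a DataFrame from an RDD and write it out over JDBC.
    // createJDBCTable opens the connection via java.sql.DriverManager
    // *on the driver*, which is why the MySQL jar must be on the
    // driver's classpath and not only in --jars.
    val df = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "name")
    df.createJDBCTable(
      "jdbc:mysql://localhost:3306/spark?user=root&password=123",  // illustrative URL
      "test_table",
      allowExisting = false)

    sc.stop()
  }
}
```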

[[email protected] ~]$ bin/spark-submit --master local[2] \
  --jars lib/mysql-connector-java-5.1.35.jar \
  --class spark.SparkToJDBC ./spark-test_2.10-1.0.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Exception in thread "main" java.sql.SQLException: No suitable driver found for
jdbc:mysql://www.iteblog.com:3306/spark?user=root&password=123&useUnicode=true&characterEncoding=utf8&autoReconnect=true
	at java.sql.DriverManager.getConnection(DriverManager.java:602)
	at java.sql.DriverManager.getConnection(DriverManager.java:207)
	at org.apache.spark.sql.DataFrame.createJDBCTable(DataFrame.scala:1189)
	at spark.SparkToJDBC$.toMysqlFromJavaBean(SparkToJDBC.scala:20)
	at spark.SparkToJDBC$.main(SparkToJDBC.scala:47)
	at spark.SparkToJDBC.main(SparkToJDBC.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
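The `SQLException` itself is easy to reproduce with nothing but the JDK: when no jar on the classpath registers a `java.sql.Driver` for the `jdbc:mysql:` scheme, `DriverManager` gives up before even attempting a network connection (the URL below is illustrative):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class NoSuitableDriverDemo {
    public static void main(String[] args) {
        try {
            // No MySQL driver jar on the classpath, so DriverManager
            // cannot match the jdbc:mysql: URL to any registered driver
            // and throws without opening a socket.
            DriverManager.getConnection("jdbc:mysql://localhost:3306/spark");
        } catch (SQLException e) {
            // Prints: No suitable driver found for jdbc:mysql://localhost:3306/spark
            System.out.println(e.getMessage());
        }
    }
}
```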

This seemed very odd: I had added the MySQL driver when launching the job, so why the exception? After some digging it turned out that putting the MySQL driver in --jars does not help here: --jars ships the jar to the executors, but the JDBC connection is opened on the driver, and java.sql.DriverManager does not see jars loaded that way. The driver's classpath can instead be set with the --driver-class-path option of spark-submit, and sure enough, with that the error was gone:

[[email protected] ~]$ bin/spark-submit --master local[2] \
  --driver-class-path lib/mysql-connector-java-5.1.35.jar \
  --class spark.SparkToJDBC ./spark-test_2.10-1.0.jar

Alternatively, the driver's classpath can be set through the SPARK_CLASSPATH environment variable in conf/spark-env.sh under the Spark installation, like so:

export SPARK_CLASSPATH=$SPARK_CLASSPATH:/iteblog/com/mysql-connector-java-5.1.35.jar

This also resolves the exception above. However, you cannot configure SPARK_CLASSPATH in conf/spark-env.sh and pass --driver-class-path at submit time at the same time, or the job fails with the following exception:

[[email protected] ~]$ bin/spark-submit --master local[2] \
  --driver-class-path lib/mysql-connector-java-5.1.35.jar \
  --class spark.SparkToJDBC ./spark-test_2.10-1.0.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Exception in thread "main" org.apache.spark.SparkException:
Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$7.apply(SparkConf.scala:339)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$7.apply(SparkConf.scala:337)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:337)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:325)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:325)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:197)
	at spark.SparkToJDBC$.main(SparkToJDBC.scala:41)
	at spark.SparkToJDBC.main(SparkToJDBC.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
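As the error message hints, --driver-class-path is shorthand for the spark.driver.extraClassPath configuration property, and Spark refuses to combine that property with the deprecated SPARK_CLASSPATH variable. If you skip the spark-env.sh route, the property can also be passed explicitly via --conf (same illustrative jar path as in the commands above):

```shell
bin/spark-submit --master local[2] \
  --conf spark.driver.extraClassPath=lib/mysql-connector-java-5.1.35.jar \
  --class spark.SparkToJDBC ./spark-test_2.10-1.0.jar
```

The same line can go into conf/spark-defaults.conf (`spark.driver.extraClassPath lib/mysql-connector-java-5.1.35.jar`) so every submitted job picks it up.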

Original post: http://www.iteblog.com/archives/1300

Source: http://blog.csdn.net/liuhui_306/article/details/45247991
