spark + mysql + driver: querying MySQL from Spark in Java fails with a missing-driver error


import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrameReader;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

System.setProperty("hadoop.home.dir", "D:\\spark-1.6.1-bin-hadoop2.6\\spark-1.6.1-bin-hadoop2.6");

SparkConf conf = new SparkConf()
        .setAppName("spark test1")
        .setMaster("local[2]")
        .set("spark.testing.memory", "2147480000");
JavaSparkContext context = new JavaSparkContext(conf);

String sql = " (select * from tb_manager) as user_organ";

SQLContext sqlContext = SQLContext.getOrCreate(JavaSparkContext.toSparkContext(context));
DataFrameReader reader = sqlContext.read().format("jdbc");
reader.option("url", "jdbc:mysql://120.27.163.3:3306/sinokorPro"); // database URL
reader.option("dbtable", sql);                                     // table name (here a subquery)
reader.option("driver", "com.mysql.jdbc.Driver");
reader.option("user", "root");
reader.option("password", "huahanAPP2015");

Dataset<Row> projectDataSourceDFFromMySQL = reader.load();
projectDataSourceDFFromMySQL.show();

Error message:

Exception in thread "main" java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:38)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:78)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:78)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:78)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:34)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:34)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:307)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
	at com.spark.Test.main(Test.java:29)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
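The stack trace shows where the failure happens: Spark's `DriverRegistry.register` calls `Class.forName` on the value of the `"driver"` option, so the exception simply means the MySQL Connector/J jar (`mysql-connector-java`) is not on the application's classpath. A minimal sketch that reproduces the same check outside Spark (the class and method names here are made up for illustration):

```java
// DriverCheck.java - reproduces what Spark's DriverRegistry does internally:
// a plain Class.forName on the configured JDBC driver class name.
public class DriverCheck {

    /** Returns true if the named class can be loaded on the current classpath. */
    static boolean driverOnClasspath(String driverClassName) {
        try {
            Class.forName(driverClassName);
            return true;
        } catch (ClassNotFoundException e) {
            // Same exception Spark surfaces from DriverRegistry.register
            return false;
        }
    }

    public static void main(String[] args) {
        String driver = "com.mysql.jdbc.Driver";
        System.out.println(driver + " on classpath: " + driverOnClasspath(driver));
    }
}
```

If this prints `false`, the fix is to put the driver jar on the classpath: add the `mysql:mysql-connector-java` dependency to the project's Maven/Gradle build, or, when submitting with spark-submit, pass the jar via `--jars` (and `--driver-class-path` when the driver runs in client mode).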
