The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH.
The error literally means the MySQL connector driver is missing. Hive stores its metadata in MySQL, so the MySQL driver is required. Hive starts fine on its own, but starting Spark and connecting to Hive throws this error. Following advice found online, I added the following to spark-defaults.conf (this did not work):
spark.executor.extraClassPath /home/hadoop/jars/mysql-connector-java-5.1.46-bin.jar
spark.driver.extraClassPath /home/hadoop/jars/mysql-connector-java-5.1.46-bin.jar
----------------------------------------------------Solution--------------------------------
The problem was solved by copying lib/mysql-connector-java-5.1.27-bin.jar from the Hive directory into Spark's jars directory.
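The copy step above can be sketched as follows; the `HIVE_HOME` and `SPARK_HOME` paths are assumptions for illustration and depend on your installation:

```shell
# Assumed install locations -- adjust to your environment
HIVE_HOME=/home/hadoop/hive
SPARK_HOME=/home/hadoop/spark

# Copy the MySQL JDBC driver shipped with Hive into Spark's jars directory,
# which Spark scans and puts on the classpath at startup
cp "$HIVE_HOME/lib/mysql-connector-java-5.1.27-bin.jar" "$SPARK_HOME/jars/"
```

After copying, restart the Spark shell or application so the new jar is picked up.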
----------------------------------------------------Summary----------------------------------------
At startup, Spark loads every jar in its jars directory. Another approach mentioned online is to pass the jar explicitly with the --jars option. I didn't adopt it, since the jar would have to be specified on every launch, which is tedious.
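For reference, the --jars approach would look something like this; the jar path is an assumption based on the Hive lib directory mentioned above:

```shell
# Pass the MySQL driver explicitly on each launch instead of copying it
# into $SPARK_HOME/jars (path below is an assumed example location)
spark-shell --jars /home/hadoop/hive/lib/mysql-connector-java-5.1.27-bin.jar
```

The same --jars option works with spark-submit; the trade-off is typing it every time versus the one-time copy into the jars directory.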