Problem
Starting spark-shell fails with a driver error: com.mysql.jdbc.Driver was not found.
Solution
1. Add mysql-connector-java-5.1.35-bin.jar to the /apache-hive-1.2.1-bin/lib/ directory.
2. Go to the Spark directory and start spark-shell:
bin/spark-shell --master spark://hbase1:7077 --executor-memory 1g --total-executor-cores 2 --driver-class-path /home/hadoop/apache-hive-1.2.1-bin/lib/mysql-connector-java-5.1.35-bin.jar
Note: --driver-class-path /home/hadoop/apache-hive-1.2.1-bin/lib/mysql-connector-java-5.1.35-bin.jar puts the MySQL JDBC driver jar on the driver's classpath, which is what resolves the missing-driver error.
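To avoid passing the jar on every launch, the same classpath can be set once in Spark's conf/spark-defaults.conf. A minimal sketch, assuming the same jar path as above; spark.driver.extraClassPath is the config-file equivalent of --driver-class-path:

```
# conf/spark-defaults.conf
# Equivalent of --driver-class-path: put the MySQL JDBC jar on the driver's classpath
spark.driver.extraClassPath   /home/hadoop/apache-hive-1.2.1-bin/lib/mysql-connector-java-5.1.35-bin.jar

# Only needed if executors also load the driver (e.g. reading MySQL tables via JDBC)
spark.executor.extraClassPath /home/hadoop/apache-hive-1.2.1-bin/lib/mysql-connector-java-5.1.35-bin.jar
```

With this in place, spark-shell can be started without the --driver-class-path flag.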