1. Since Spark 2.0.0, releases no longer ship a single assembly jar; instead, the jars/ directory contains many small jars.
Update the directory that Hive searches for Spark jars accordingly.
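A commonly cited fix (a sketch; the exact line and path vary by Hive version, so treat the old pattern below as an assumption about your bin/hive script) is to change the jar lookup in $HIVE_HOME/bin/hive from the old single-assembly path to the jars/ directory:

```shell
# In $HIVE_HOME/bin/hive, the old lookup assumed one assembly jar:
#   sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`
# Change it to collect the individual jars under jars/ instead:
sparkAssemblyPath=`ls ${SPARK_HOME}/jars/*.jar`
```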
2. Exception: HiveConf of name hive.enable.spark.execution.engine does not exist
In hive-site.xml, the property hive.enable.spark.execution.engine is obsolete; simply delete it from the configuration.
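This is the property block to remove from hive-site.xml (the value shown is illustrative; whatever value it carries, the whole property should go):

```xml
<!-- Obsolete: HiveConf no longer recognizes this key; delete this block -->
<property>
  <name>hive.enable.spark.execution.engine</name>
  <value>true</value>
</property>
```

The supported way to select the engine is hive.execution.engine=spark.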
3. Exception:
Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)' FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
The Spark and Hive versions are incompatible; use the Spark version that your Hive release was built against.
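To diagnose a version mismatch, a first step is to compare the installed versions (each Hive release records the Spark version it was built against as spark.version in Hive's pom.xml, so that is the version to match):

```shell
# Print the installed Hive and Spark versions to compare against
# the spark.version recorded in your Hive release's pom.xml
hive --version
spark-submit --version
```

Note that for Hive on Spark, the Spark build should not bundle Hive classes (e.g. a build made with -Phive can conflict with Hive's own jars).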