For details, see:
https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started
https://blog.csdn.net/gulugulu_gulu/article/details/105706090
https://blog.csdn.net/yoshubom/article/details/113845190
https://blog.csdn.net/weixin_52918377/article/details/117123969
The Hive version and the Spark version must match.
Specifically, you must use the same Spark version that your Hive release was compiled against; you can determine which one that is by checking spark.version in Hive's pom.xml.
As the Hive on Spark wiki page linked above puts it: "Hive root pom.xml's <spark.version> defines what version of Spark it was built/tested with."
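For example, assuming a source tarball such as apache-hive-3.1.2-src (the version here is only illustrative; use the Hive release you actually run), a quick grep shows the expected Spark version:

grep '<spark.version>' apache-hive-3.1.2-src/pom.xml
# for Hive 3.1.2 this should print: <spark.version>2.3.0</spark.version>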
The Spark jars you deploy must not have Hive built in.
That is, Spark itself must be compiled without Hive support (no Hive jars bundled in), so that Spark's own Hive classes do not conflict with the Hive version you are running.
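For reference, the Hive on Spark wiki page linked above builds such a "without Hive" Spark distribution roughly as follows (shown here in its Spark 2.x form; the exact profiles depend on your Spark and Hadoop versions):

./dev/make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.7,parquet-provided,orc-provided"

The resulting spark-<version>-bin-hadoop2-without-hive.tgz then contains Spark jars with no Hive classes, which is what Hive on Spark expects.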