Hadoop and HBase version compatibility
Reference: https://www.tqwba.com/x_d/jishu/73706.html
The compatibility matrix image comes from the official documentation:
http://hbase.apache.org/book.html#hadoop
Version compatibility between Hive and Hadoop, and between Hive and Spark
The version information below comes from the pom.xml of the Hive source packages:
hive-3.1.2
<hadoop.version>3.1.0</hadoop.version>
<hbase.version>2.0.0-alpha4</hbase.version>
<spark.version>2.3.0</spark.version>
<scala.binary.version>2.11</scala.binary.version>
<scala.version>2.11.8</scala.version>
<zookeeper.version>3.4.6</zookeeper.version>
hive-2.3.6
<hadoop.version>2.7.2</hadoop.version>
<hbase.version>1.1.1</hbase.version>
<spark.version>2.0.0</spark.version>
<scala.binary.version>2.11</scala.binary.version>
<scala.version>2.11.8</scala.version>
<zookeeper.version>3.4.6</zookeeper.version>
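For reference, if your own project builds against Hive 3.1.2, one way to stay aligned with the versions above is to mirror them as Maven properties in your pom.xml. The sketch below only illustrates that idea; the artifact choices (hive-exec, hadoop-client, spark-core) are assumptions about what a typical downstream project pulls in, not something taken from the Hive pom.
<!-- Sketch: a downstream pom.xml pinned to the hive-3.1.2 versions listed above; the artifact list is illustrative -->
<properties>
    <hive.version>3.1.2</hive.version>
    <hadoop.version>3.1.0</hadoop.version>
    <spark.version>2.3.0</spark.version>
    <scala.binary.version>2.11</scala.binary.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-exec</artifactId>
        <version>${hive.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>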
Other blogs report the following Hive-to-Spark mapping:
Hive Version    Spark Version
3.0.x           2.3.0
2.3.x           2.0.0
2.2.x           1.6.0
2.1.x           1.6.0
2.0.x           1.5.0
1.2.x           1.3.1
1.1.x           1.2.0
Reference blog link:
https://blog.csdn.net/weixin_44033089/article/details/86588595
apache-hive-1.2.2-src <spark.version>1.3.1</spark.version>
apache-hive-2.1.1-src <spark.version>1.6.0</spark.version>
apache-hive-2.3.3-src <spark.version>2.0.0</spark.version>
apache-hive-3.0.0-src <spark.version>2.3.0</spark.version>
A combination confirmed to work on Stack Overflow:
Spark 2.0.2 with Hadoop 2.7.3 and Hive 2.1
Reference link:
https://stackoverflow.com/questions/42281174/hive-2-1-1-on-spark-which-version-of-spark-should-i-use
A combination reported by a user in a QQ group:
Hive 2.6 with Spark 2.2.0
The following combination has shown no compatibility issues so far:
apache-hive-3.0.0-bin
hadoop-3.0.3
spark-2.3.1-bin-hadoop2.7
Reference blog link:
https://blog.csdn.net/appleyuchi/article/details/81171785
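If this combination is used for Hive on Spark, the switch is made in hive-site.xml. A minimal sketch follows, assuming a standalone Spark master; the master URL and the executor memory value are placeholders, not values from this post.
<!-- Sketch: minimal hive-site.xml entries for Hive on Spark; spark://master:7077 and 2g are placeholders -->
<property>
    <name>hive.execution.engine</name>
    <value>spark</value>
</property>
<property>
    <name>spark.master</name>
    <value>spark://master:7077</value>
</property>
<property>
    <name>spark.executor.memory</name>
    <value>2g</value>
</property>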
Flink
The version information below comes from the pom.xml of the Flink source package:
flink-1.9.1
<hadoop.version>2.4.1</hadoop.version>
<scala.version>2.11.12</scala.version>
<scala.binary.version>2.11</scala.binary.version>
<zookeeper.version>3.4.10</zookeeper.version>
<hive.version>2.3.4</hive.version>
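To call Hive 2.3.4 from a Flink 1.9.1 job through the Hive connector, the dependencies would normally follow the same Scala 2.11 binary version shown above. This is only a sketch; the provided scope and the exact artifact set are assumptions about a Table API job, not taken from the Flink pom.
<!-- Sketch: dependencies for using Hive 2.3.4 from Flink 1.9.1 (Scala 2.11 artifacts); scopes are illustrative -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-hive_2.11</artifactId>
    <version>1.9.1</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>2.3.4</version>
    <scope>provided</scope>
</dependency>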