Hadoop, Hive, and Spark were already deployed on the cluster and working normally. When I tried to start spark-sql to access Hive data, it failed with:
WARN metadata.Hive: Failed to access metastore. This class should not accessed in runtime.
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
...
Caused by: java.lang.reflect.InvocationTargetException
Caused by: MetaException(message:Version information not found in metastore. )
Judging from the version message, spark-sql could not access Hive's metastore. Reviewing the environment and the configuration already in place: Hive's hive-site.xml had been copied into spark/conf, and $HIVE_HOME had been set in spark-env.sh.
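For reference, those two steps look roughly like this (the install paths are illustrative, not the actual ones used here):

    # Copy Hive's config into Spark so spark-sql can locate the metastore settings
    cp $HIVE_HOME/conf/hive-site.xml $SPARK_HOME/conf/

    # In spark-env.sh -- the path below is an assumed example
    export HIVE_HOME=/opt/apache-hive-2.1.1-bin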
Hive had earlier been upgraded from 1.2.2 to 2.1.1 and still worked normally afterwards, so the problem should not lie in Hive itself; it can only be a compatibility issue between Spark and Hive.
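As a quick sanity check on that reasoning, Hive's schematool can report the schema version actually recorded in the metastore (this assumes a MySQL-backed metastore; the -dbType value must match the real database):

    # Prints the metastore connection info and its recorded schema version, if any
    schematool -dbType mysql -info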
After searching on Baidu and verifying, the problem was narrowed down to one configuration item in hive-site.xml. It defaults to true, which means schema-version verification is performed against the Hive metastore.
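The item is not named until later, but given the "Version information not found in metastore" error it is presumably hive.metastore.schema.verification. A minimal sketch of switching it off in hive-site.xml:

    <!-- hive-site.xml: skip strict schema-version checking against the metastore -->
    <property>
      <name>hive.metastore.schema.verification</name>
      <value>false</value>
    </property>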