Versions
- spark version: spark-2.1.0-bin-hadoop2.6
Error snippet
Exception in thread "main" org.apache.spark.sql.catalyst.analysis.NoSuchTableException: Table or view 'rating_table' not found in database 'default';
at org.apache.spark.sql.hive.client.HiveClient$$anonfun$getTable$1.apply(HiveClient.scala:76)
at org.apache.spark.sql.hive.client.HiveClient$$anonfun$getTable$1.apply(HiveClient.scala:76)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.hive.client.HiveClient$class.getTable(HiveClient.scala:76)
at org.apache.spark.sql.hive.client.HiveClientImpl.getTable(HiveClientImpl.scala:78)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$org$apache$spark$sql$hive$HiveExternalCatalog$$getRawTable$1.apply(HiveExternalCatalog.scala:110)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$org$apache$spark$sql$hive$HiveExternalCatalog$$getRawTable$1.apply(HiveExternalCatalog.scala:110)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:95)
at org.apache.spark.sql.hive.HiveExternalCatalog.org$apache$spark$sql$hive$HiveExternalCatalog$$getRawTable(HiveExternalCatalog.scala:109)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getTable$1.apply(HiveExternalCatalog.scala:601)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getTable$1.apply(HiveExternalCatalog.scala:601)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:95)
at org.apache.spark.sql.hive.HiveExternalCatalog.getTable(HiveExternalCatalog.scala:600)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:106)
at org.apache.spark.sql.hive.HiveSessionCatalog.lookupRelation(HiveSessionCatalog.scala:69)
at org.apache.spark.sql.SparkSession.table(SparkSession.scala:578)
at org.apache.spark.sql.SparkSession.table(SparkSession.scala:574)
at org.apache.spark.sql.SQLContext.table(SQLContext.scala:708)
at org.vincent.sql.sqlUdaf$.main(sqlUdaf.scala:16)
at org.vincent.sql.sqlUdaf.main(sqlUdaf.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
19/12/03 15:23:22 INFO spark.SparkContext: Invoking stop() from shutdown hook
Solution
ref:
- https://stackoverflow.com/questions/43619137/hive-tables-not-found-in-spark-sql-spark-sql-analysisexception-in-cloudera-vm
- https://stackoverflow.com/questions/34585953/spark-job-did-not-find-table-in-hive-database
- Use SparkSession.enableHiveSupport() instead of the deprecated SQLContext or HiveContext.
- Copy hive-site.xml into the Spark conf directory (e.g. /usr/lib/spark/conf) // this is what fixed my problem
cp /data/server/hive/conf/hive-site.xml /data/server/spark-2.1.0-bin-hadoop2.6/conf/
- Add the same directory to the classpath when executing the jar (thanks to Paul and Samson above).
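The enableHiveSupport() fix above can be sketched as follows. This is a minimal illustration, not the original sqlUdaf code: the object name and the call to spark.table("rating_table") are taken from the stack trace, and it assumes hive-site.xml has already been copied into Spark's conf directory so the Hive metastore is reachable.

```scala
import org.apache.spark.sql.SparkSession

object SqlUdafExample {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport() makes the session resolve tables against the
    // Hive metastore (configured via hive-site.xml on the classpath)
    // instead of Spark's default in-memory catalog, so Hive tables
    // such as 'rating_table' in database 'default' become visible.
    val spark = SparkSession.builder()
      .appName("sqlUdaf")
      .enableHiveSupport()
      .getOrCreate()

    // This is the lookup that threw NoSuchTableException before;
    // with Hive support enabled it resolves against Hive's catalog.
    val ratings = spark.table("rating_table")
    ratings.show()

    spark.stop()
  }
}
```

Note that enableHiveSupport() only tells Spark to use the Hive catalog; it still needs hive-site.xml on the classpath (step above) to find the right metastore, otherwise Spark falls back to a local Derby-backed metastore where the table does not exist.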