org.apache.spark.sql.AnalysisException: Table or view not found:

18/09/12 09:07:20 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
18/09/12 09:07:20 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/09/12 09:07:20 INFO memory.MemoryStore: MemoryStore cleared
18/09/12 09:07:20 INFO storage.BlockManager: BlockManager stopped
18/09/12 09:07:20 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
18/09/12 09:07:20 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/09/12 09:07:20 INFO spark.SparkContext: Successfully stopped SparkContext
18/09/12 09:07:21 ERROR yarn.ApplicationMaster: User class threw exception: org.apache.spark.sql.AnalysisException: Table or view not found: `CRMUSER`.`XXXXXXXXX`; line 1 pos 14;
'Project [*]
+- 'UnresolvedRelation `CRMUSER`.`XXXXXXXXX`

org.apache.spark.sql.AnalysisException: Table or view not found: `CRMUSER`.`XXXXXXXXX`; line 1 pos 14;
'Project [*]
+- 'UnresolvedRelation `CRMUSER`.`XXXXXXXXX`

    at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:82)
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:78)
    at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:127)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126)
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:78)
    at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:91)
    at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:52)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:66)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:623)
    at com.jlpay.action.check.account.file.ReadFileAbstractHbaseAction.saveToHbase(ReadFileAbstractHbaseAction.java:133)
    at com.jlpay.action.check.account.file.ReadFileAbstractHbaseAction.startSaveToHbase(ReadFileAbstractHbaseAction.java:87)
    at com.jlpay.action.check.account.file.ReadFileAbstractHbaseAction.action(ReadFileAbstractHbaseAction.java:58)
    at com.jlpay.action.check.account.file.agentpay.PinganAgentPayAction.action(PinganAgentPayAction.java:67)
    at com.jlpay.action.AbstractStatAction.executeDays(AbstractStatAction.java:130)
    at com.jlpay.action.check.account.file.ReadFileAbstractHbaseAction.execute(ReadFileAbstractHbaseAction.java:285)
    at com.jlpay.job.read.file.agent.PinganAgentPayJob.handle(PinganAgentPayJob.java:22)
    at com.jlpay.job.read.file.agent.PinganAgentPayJob.main(PinganAgentPayJob.java:16)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:686)
18/09/12 09:07:21 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: org.apache.spark.sql.AnalysisException: Table or view not found: `CRMUSER`.`XXXXXXXXX`; line 1 pos 14;
'Project [*]
+- 'UnresolvedRelation `CRMUSER`.`XXXXXXXXX`
)
18/09/12 09:07:21 INFO util.ShutdownHookManager: Shutdown hook called
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data5/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-7cb7b711-beff-422f-961f-030f18990853
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data3/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-c5c5ec43-c232-426c-b042-99040c534197
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data8/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-3fe1a4c2-794a-460f-b757-fb4a500215fd
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data11/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-853057ed-25f2-4e00-ae9c-f0d1861b899d
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data4/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-8dfff226-708a-4695-a094-ac1210b292c1
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data10/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-1b276dee-72b7-45d9-837d-575917ee72af
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data9/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-dbc2bb5a-c249-42d5-8e1c-6f676f32848b
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data7/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-749a9f1e-b16c-401b-b022-97158d7f08b9
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data1/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-25b169b4-f0bd-40ae-bcbd-26c575f24958
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data2/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-ec6292b8-369a-48f6-a4ff-14827bfa560b
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data12/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-b17ad723-3932-4177-b522-99618819a7e2
18/09/12 09:07:21 INFO util.ShutdownHookManager: Deleting directory /data6/yarn/nm/usercache/hdfs/appcache/application_1524727076954_10960/spark-a6945acb-7736-43be-98af-e8aee1712b92

This error hit in production. The root cause is that the configuration directory Spark was started with did not contain the Hive configuration file (hive-site.xml), so the table could not be resolved; it is not that the table does not exist. Without hive-site.xml, Spark SQL silently falls back to its built-in local catalog instead of the shared Hive metastore, and every table registered in the metastore then resolves as "Table or view not found".
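The stack trace shows the failure comes out of a plain SparkSession.sql(...) call at ReadFileAbstractHbaseAction.java:133, and from "line 1 pos 14" the statement is almost certainly of the form select * from CRMUSER.XXXXXXXXX. A minimal sketch of that pattern (the class name and app name below are made up; only the query shape is taken from the log):

```java
import org.apache.spark.sql.SparkSession;

public class HiveTableLookup {
    public static void main(String[] args) {
        // enableHiveSupport() only reaches the shared Hive metastore when a
        // hive-site.xml is visible to the driver (e.g. in $SPARK_HOME/conf or
        // $HADOOP_CONF_DIR). Without it, Spark quietly creates a local
        // Derby-backed catalog, and every metastore table resolves as
        // "Table or view not found" even though it exists in Hive.
        SparkSession spark = SparkSession.builder()
                .appName("hive-table-lookup")
                .enableHiveSupport()
                .getOrCreate();

        // The shape of the failing call (table name masked as in the log).
        spark.sql("select * from CRMUSER.XXXXXXXXX").show();

        spark.stop();
    }
}
```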

The hadoop directory of my Spark configuration contains the cluster's client configuration files; the one that matters for this error is hive-site.xml, which tells Spark where the Hive metastore lives.
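A quick way to tell which catalog Spark is actually talking to is to list the databases it can see before running the query. A hedged sketch against the Spark 2.x Java API (class and app names are again illustrative):

```java
import org.apache.spark.sql.SparkSession;

public class MetastoreCheck {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("metastore-check")
                .enableHiveSupport()
                .getOrCreate();

        // With a working hive-site.xml this lists the metastore databases,
        // CRMUSER among them; with a missing or unreadable one you typically
        // see only "default" from the freshly created local catalog.
        spark.catalog().listDatabases().show(false);

        // Throws AnalysisException right here if the database is invisible,
        // which is exactly the symptom in the job above.
        spark.catalog().listTables("CRMUSER").show(false);

        spark.stop();
    }
}
```

If only default shows up, the fix is the one described above: make sure hive-site.xml is present in the configuration directory the job is launched with, or ship it alongside the job with spark-submit --files.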
