A Spark SQL error: org.apache.spark.sql.AnalysisException: character ' ' not supported here;

I ran into the error below. The very same statement runs without any problem when executed directly in Hive, and also in spark-shell; it fails only when submitted through Spark SQL, and the query contains no character ' ' of the kind the message complains about.
What's more, this temporary table had previously been populated correctly with INSERT OVERWRITE.
So the problem most likely lies in how the statement is parsed.
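One way to test that suspicion is to dump the code points of the statement around the position the exception reports (line 1 pos 34 in the log below). A common trigger for exactly this message in Spark 1.x is an invisible full-width space (U+3000) or non-breaking space (U+00A0) pasted into the SQL string; if the statement was retyped rather than pasted when testing in Hive or spark-shell, the invisible character would not have come along, which would explain why only the submitted job fails. A minimal sketch in Scala; the query string here is a made-up placeholder, not my actual SQL:

// Hypothetical stand-in for the real query: a full-width space (U+3000)
// hides after the comma and renders exactly like an ordinary blank.
val sql = "SELECT col_a,\u3000col_b FROM my_tmp_table"

// Flag every whitespace character that is not a plain ASCII space.
sql.zipWithIndex.foreach { case (c, i) =>
  if ((Character.isWhitespace(c) || Character.isSpaceChar(c)) && c != ' ')
    println(f"pos $i%d: U+${c.toInt}%04X")
}
// prints: pos 13: U+3000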

Error log:
Exception in thread "main" org.apache.spark.sql.AnalysisException: character ' ' not supported here; line 1 pos 34
at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:318)
at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:41)
at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:40)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:34)
at org.apache.spark.sql.hive.HiveQl$.parseSql(HiveQl.scala:295)
at org.apache.spark.sql.hive.HiveQLDialect$$anonfun$parse$1.apply(HiveContext.scala:66)
at org.apache.spark.sql.hive.HiveQLDialect$$anonfun$parse$1.apply(HiveContext.scala:66)
at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:281)
at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:228)
at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:227)
at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:270)
at org.apache.spark.sql.hive.HiveQLDialect.parse(HiveContext.scala:65)
at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
at org.apache.spark.sql.execution.SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:114)
at org.apache.spark.sql.execution.SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:113)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:34)
at org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:208)
at org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:208)
at org.apache.spark.sql.execution.datasources.DDLParser.parse(DDLParser.scala:43)
at org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:231)
at org.apache.spark.sql.hive.HiveContext.parseSql(HiveContext.scala:333)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
at telematic.spark.hbase.TestConvertPythonToHiveUseUDF$.main(TestConvertPythonToHiveUseUDF.scala:111)
at telematic.spark.hbase.TestConvertPythonToHiveUseUDF.main(TestConvertPythonToHiveUseUDF.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/09/21 16:31:48 INFO spark.SparkContext: Invoking stop() from shutdown hook
16/09/21 16:31:48 INFO zookeeper.ZooKeeper: Session: 0x35701e8ba03a482 closed
16/09/21 16:31:48 INFO CuratorFrameworkSingleton: Closing ZooKeeper client.
16/09/21 16:31:48 INFO zookeeper.ClientCnxn: EventThread shut down
16/09/21 16:31:48 INFO ui.SparkUI: Stopped Spark web UI at http://10.172.10.167:4041
16/09/21 16:31:48 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
16/09/21 16:31:48 INFO cluster.YarnClientSchedulerBackend: Asking each executor to shut down
16/09/21 16:31:48 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
16/09/21 16:31:48 INFO cluster.YarnClientSchedulerBackend: Stopped
16/09/21 16:31:48 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/09/21 16:31:48 INFO storage.MemoryStore: MemoryStore cleared
16/09/21 16:31:48 INFO storage.BlockManager: BlockManager stopped
16/09/21 16:31:48 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
16/09/21 16:31:48 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/09/21 16:31:48 INFO spark.SparkContext: Successfully stopped SparkContext
16/09/21 16:31:48 INFO util.ShutdownHookManager: Shutdown hook called
16/09/21 16:31:48 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-a6c81865-4411-4be7-bdc0-93c82f4e75d2
16/09/21 16:31:48 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-95d498f0-416f-4d6d-a559-aa0835568775
16/09/21 16:31:48 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/09/21 16:31:48 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
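If an invisible character is indeed the culprit, normalizing the statement before handing it to sqlContext.sql works around the lexer's strictness. A hedged sketch, again in Scala; normalizeSql and rawSql are names I made up for illustration, not part of the original job:

// Replace full-width, non-breaking, and other exotic whitespace with
// plain ASCII spaces before the string reaches Spark's HiveQl parser.
def normalizeSql(sql: String): String =
  sql.map(c => if (Character.isWhitespace(c) || Character.isSpaceChar(c)) ' ' else c)

// Usage in the driver (sqlContext being the job's HiveContext):
// sqlContext.sql(normalizeSql(rawSql))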
