A Collection of Big Data Component Errors and Fixes: Spark

1. Spark version incompatible with Hive

Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create spark client.

Fix:
Replace Spark with the version that Hive requires. Each Hive release is built against a specific Spark version, and Hive on Spark fails to create a Spark client when the installed Spark does not match it.

2. Spark cannot connect to Hive


  
  
java.lang.NoClassDefFoundError: org/apache/spark/sql/hive/HiveContext
	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:158)
	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:635)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.hive.HiveContext
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Fix:
Spark is missing the Hive integration jars. Copy them from Kylin's spark/jars directory into the standalone Spark installation.
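The copy step can be sketched as below; the `KYLIN_HOME` / `SPARK_HOME` defaults and the `*hive*.jar` glob are assumptions, so adjust them to your own installation:

```shell
# Sketch of the jar fix -- KYLIN_HOME and SPARK_HOME defaults here are
# placeholders; point them at your real installs.
KYLIN_HOME=${KYLIN_HOME:-./kylin}
SPARK_HOME=${SPARK_HOME:-./spark}
mkdir -p "$SPARK_HOME/jars"
# Copy the Hive-support jars bundled under Kylin's spark/jars into the
# standalone Spark install, then resubmit the failed cube build.
cp "$KYLIN_HOME"/spark/jars/*hive*.jar "$SPARK_HOME/jars/" 2>/dev/null \
  || echo "no hive jars found under $KYLIN_HOME/spark/jars"
```

After copying, rerun the job; the `HiveContext` class should now resolve on the executor classpath.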

3. Spark executor memory set too high


  
  
Exception in thread "main" java.lang.IllegalArgumentException: Required executor memory (8192+819 MB) is above the max threshold (8192 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
	at org.apache.spark.deploy.yarn.Client.verifyClusterResources(Client.scala:319)
	at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:167)
	at org.apache.spark.deploy.yarn.Client.run(Client.scala:1109)
	at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1168)
	at org.apache.spark.deploy.yarn.Client.main(Client.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
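The "8192+819 MB" figure comes from Spark's YARN memory-overhead rule: on top of the executor heap, YARN must also allocate an off-heap overhead that defaults to max(384 MB, 10% of executor memory). A quick check:

```shell
# Reproduce the "8192+819 MB" figure from the error. Spark on YARN asks for
# the executor heap plus a default overhead of max(384, 10% of heap) MB.
EXEC_MB=8192
OVERHEAD_MB=$(( EXEC_MB / 10 ))            # 819 MB (integer division), above the 384 MB floor
REQUIRED_MB=$(( EXEC_MB + OVERHEAD_MB ))
echo "required=${REQUIRED_MB} MB"          # 9011 MB > 8192 MB cluster maximum, so YARN rejects it
```

So an 8192 MB executor cannot fit in an 8192 MB container; either shrink the heap or raise the YARN limit.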
Fix:
Lower Spark's executor memory in Kylin's configuration so that the executor heap plus its overhead fits within YARN's maximum container size, or raise yarn.scheduler.maximum-allocation-mb / yarn.nodemanager.resource.memory-mb on the cluster.
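In Kylin this is done through the `kylin.engine.spark-conf.*` passthrough in kylin.properties; the values below are illustrative, chosen so heap plus overhead stays well under the 8192 MB cap:

```properties
# kylin.properties -- illustrative values, not a recommendation;
# keep executor memory + overhead <= yarn.scheduler.maximum-allocation-mb
kylin.engine.spark-conf.spark.executor.memory=4G
kylin.engine.spark-conf.spark.yarn.executor.memoryOverhead=1024
```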

4. Spark requests more CPU cores than YARN allows


  
  
Exception: java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer
java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer
	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:635)
Caused by: java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at org.apache.spark.scheduler.TaskSchedulerImpl.waitBackendReady(TaskSchedulerImpl.scala:676)
	at org.apache.spark.scheduler.TaskSchedulerImpl.postStartHook(TaskSchedulerImpl.scala:186)
	at org.apache.spark.scheduler.cluster.YarnClusterScheduler.postStartHook(YarnClusterScheduler.scala:33)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:567)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:151)
	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
	... 6 more
18/01/11 19:47:48 INFO yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: Uncaught exception: org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException: Invalid resource request, requested virtual cores < 0, or requested virtual cores > max configured, requestedVirtualCores=12, maxVirtualCores=4
	at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.validateResourceRequest(SchedulerUtils.java:288)
	at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.normalizeAndValidateRequest(SchedulerUtils.java:248)
	at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.normalizeAndvalidateRequest(SchedulerUtils.java:264)
	at org.apache.hadoop.yarn.server.resourcemanager.RMServerUtils.normalizeAndValidateRequests(RMServerUtils.java:206)
	at org.apache.hadoop.yarn.server.resourcemanager.ApplicationMasterService.allocate(ApplicationMasterService.java:464)
	at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationMasterProtocolPBServiceImpl.allocate(ApplicationMasterProtocolPBServiceImpl.java:60)
	at org.apache.hadoop.yarn.proto.ApplicationMasterProtocol$ApplicationMasterProtocolService$2.callBlockingMethod(ApplicationMasterProtocol.java:99)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455) )
Fix:
Reduce the number of CPU cores Spark requests in Kylin's configuration. The request must stay within YARN's per-container limit: here 12 virtual cores were requested while the cluster allows at most 4.
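As with memory, the core count is passed through kylin.properties; the value below is illustrative and must not exceed yarn.scheduler.maximum-allocation-vcores (4 on this cluster):

```properties
# kylin.properties -- illustrative; cores per executor must not exceed
# yarn.scheduler.maximum-allocation-vcores (4 on this cluster)
kylin.engine.spark-conf.spark.executor.cores=2
```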



