Pitfalls encountered with Zeppelin + Spark

### 1. Error: java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
I spent a whole day on this before discovering that the Spark client had not been installed properly; reinstalling it fixed the problem.
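This kind of `NoSuchFieldError` typically means the Hive classes Spark finds on its classpath do not match the version Spark was built against, which fits a broken or mismatched client install. A minimal sanity check, assuming `SPARK_HOME` in conf/zeppelin-env.sh points at the installation Zeppelin actually launches (paths are illustrative, not prescriptive):

```bash
# Check that the Spark client Zeppelin uses can start on its own.
# SPARK_HOME is assumed to be the value exported in conf/zeppelin-env.sh.
echo "SPARK_HOME = $SPARK_HOME"
"$SPARK_HOME"/bin/spark-submit --version        # should print the expected Spark/Scala versions
ls "$SPARK_HOME"/jars | grep -i hive            # the Hive jars bundled with this client
```

If the client cannot even report its version, or the Hive jars are from a different distribution than expected, reinstalling the client (as above) is the straightforward fix.
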
### 2. Error

org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
  at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:543)
  at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:541)
  at scala.collection.immutable.List.foreach(List.scala:381)
  at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:541)
  at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:529)
  at scala.Option.foreach(Option.scala:257)
  at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:529)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:368)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
  at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
  at org.apache.zeppelin.spark.SparkInterpreter.createSparkSession(SparkInterpreter.java:368)
  at org.apache.zeppelin.spark.SparkInterpreter.getSparkSession(SparkInterpreter.java:233)
  at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:841)
  at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
  at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:491)
  at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
  at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
  at java.util.concurrent.FutureTask.run(FutureTask.java:262)
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
  at java.lang.Thread.run(Thread.java:745)
 INFO [2017-11-27 10:36:49,880] ({pool-2-thread-4} Logging.scala[logInfo]:54) - Successfully stopped SparkContext

Fix: edit bin/interpreter.sh and remove the `--driver-class-path "${ZEPPELIN_CLASSPATH_OVERRIDES}:${CLASSPATH}"` argument from the spark-submit invocation.
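A rough sketch of the change, assuming a Zeppelin build where bin/interpreter.sh assembles the spark-submit command itself; the exact line differs between versions, so locate the argument rather than copying this verbatim:

```bash
# Find the offending argument first; the surrounding command varies by Zeppelin version.
grep -n -- '--driver-class-path' bin/interpreter.sh

# Before (roughly): the interpreter is launched with something like
#   ${SPARK_SUBMIT} ... --driver-class-path "${ZEPPELIN_CLASSPATH_OVERRIDES}:${CLASSPATH}" ...
# which sets spark.driver.extraClassPath while SPARK_CLASSPATH is also exported.
#
# After: drop only the --driver-class-path argument and leave the rest of the
# command untouched.
```

The error message itself points at the alternative: since only one of the two mechanisms may be used, the conflict can also be resolved from the other side by no longer exporting SPARK_CLASSPATH and keeping the `--driver-class-path` flag instead.
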
### 3. Error

  at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:223)
  at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:121)
  at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
  at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
  at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
  at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) 
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
  at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
  at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
  at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
  at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
  at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
  at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
  at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
  at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
  at java.lang.Thread.run(Thread.java:745)

Fix: edit conf/zeppelin-env.sh and add:
export SPARK_SUBMIT_OPTIONS="--jars /home/hadoop/spark-2.0.0-bin-hadoop2.6/jars/mysql-connector-java-5.1.11-bin.jar"
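For context, a sketch of how this looks in conf/zeppelin-env.sh; the connector path is the one from this setup, so adjust it to wherever your MySQL connector jar actually lives, and note that the Spark interpreter has to be relaunched before the extra jar is picked up:

```bash
# conf/zeppelin-env.sh -- pass extra jars to the Spark interpreter via spark-submit.
# The connector path below is from this setup; point it at your own copy.
export SPARK_SUBMIT_OPTIONS="--jars /home/hadoop/spark-2.0.0-bin-hadoop2.6/jars/mysql-connector-java-5.1.11-bin.jar"

# Restart Zeppelin (or restart the Spark interpreter from the notebook UI)
# so the interpreter process is relaunched with the new options.
bin/zeppelin-daemon.sh restart
```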
