Spark shell startup error - waiting online for an expert to help, urgent!!!!

Environment: Spark 1.6.3 standalone mode, Scala 2.10.5, launched from the hadoop01 node. Full console output is below:

[root@hadoop01 bin]# ./spark-shell --master spark://hadoop
log4j:WARN No appenders could be found for logger (org.apa
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.htm
Using Spark's repl log4j profile: org/apache/spark/log4j-d
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.3
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server
Type in expressions to have them evaluated.
Type :help for more information.
18/03/01 09:49:58 ERROR SparkDeploySchedulerBackend: Appli
18/03/01 09:49:58 WARN SparkDeploySchedulerBackend: Applic
18/03/01 09:49:58 WARN AppClient$ClientEndpoint: Drop Unre
18/03/01 09:49:58 ERROR MapOutputTrackerMaster: Error comm
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchr
at scala.concurrent.impl.Promise$DefaultPromise.tr
at scala.concurrent.impl.Promise$DefaultPromise.re
at scala.concurrent.impl.Promise$DefaultPromise.re
at scala.concurrent.Await$$anonfun$result$1.apply(
at scala.concurrent.BlockContext$DefaultBlockConte
at scala.concurrent.Await$.result(package.scala:10
at org.apache.spark.rpc.RpcTimeout.awaitResult(Rpc
at org.apache.spark.rpc.RpcEndpointRef.askWithRetr
at org.apache.spark.rpc.RpcEndpointRef.askWithRetr
at org.apache.spark.MapOutputTracker.askTracker(Ma
at org.apache.spark.MapOutputTracker.sendTracker(M
at org.apache.spark.MapOutputTrackerMaster.stop(Ma
at org.apache.spark.SparkEnv.stop(SparkEnv.scala:9
at org.apache.spark.SparkContext$$anonfun$stop$12.
at org.apache.spark.util.Utils$.tryLogNonFatalErro
at org.apache.spark.SparkContext.stop(SparkContext
at org.apache.spark.scheduler.cluster.SparkDeployS
at org.apache.spark.deploy.client.AppClient$Client
at org.apache.spark.deploy.client.AppClient$Client
at java.util.concurrent.Executors$RunnableAdapter.
at java.util.concurrent.FutureTask.runAndReset(Fut
at java.util.concurrent.ScheduledThreadPoolExecuto
at java.util.concurrent.ScheduledThreadPoolExecuto
at java.util.concurrent.ThreadPoolExecutor.runWork
at java.util.concurrent.ThreadPoolExecutor$Worker.
at java.lang.Thread.run(Thread.java:748)
18/03/01 09:49:59 ERROR Utils: Uncaught exception in threa
org.apache.spark.SparkException: Error communicating with
at org.apache.spark.MapOutputTracker.askTracker(Ma
at org.apache.spark.MapOutputTracker.sendTracker(M
at org.apache.spark.MapOutputTrackerMaster.stop(Ma
at org.apache.spark.SparkEnv.stop(SparkEnv.scala:9
at org.apache.spark.SparkContext$$anonfun$stop$12.
at org.apache.spark.util.Utils$.tryLogNonFatalErro
at org.apache.spark.SparkContext.stop(SparkContext
at org.apache.spark.scheduler.cluster.SparkDeployS
at org.apache.spark.deploy.client.AppClient$Client
at org.apache.spark.deploy.client.AppClient$Client
at java.util.concurrent.Executors$RunnableAdapter.
at java.util.concurrent.FutureTask.runAndReset(Fut
at java.util.concurrent.ScheduledThreadPoolExecuto
at java.util.concurrent.ScheduledThreadPoolExecuto
at java.util.concurrent.ThreadPoolExecutor.runWork
at java.util.concurrent.ThreadPoolExecutor$Worker.
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchr
at scala.concurrent.impl.Promise$DefaultPromise.tr
at scala.concurrent.impl.Promise$DefaultPromise.re
at scala.concurrent.impl.Promise$DefaultPromise.re
at scala.concurrent.Await$$anonfun$result$1.apply(
at scala.concurrent.BlockContext$DefaultBlockConte
at scala.concurrent.Await$.result(package.scala:10
at org.apache.spark.rpc.RpcTimeout.awaitResult(Rpc
at org.apache.spark.rpc.RpcEndpointRef.askWithRetr
at org.apache.spark.rpc.RpcEndpointRef.askWithRetr
at org.apache.spark.MapOutputTracker.askTracker(Ma
... 16 more
18/03/01 09:50:00 ERROR SparkContext: Error initializing S
java.lang.NullPointerException
at org.apache.spark.SparkContext.<init>(SparkConte
at org.apache.spark.repl.SparkILoop.createSparkCon
at $line3.$read$$iwC$$iwC.<init>(<console>:15)
at $line3.$read$$iwC.<init>(<console>:24)
at $line3.$read.<init>(<console>:26)
at $line3.$read$.<init>(<console>:30)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Na
at sun.reflect.NativeMethodAccessorImpl.invoke(Nat
at sun.reflect.DelegatingMethodAccessorImpl.invoke
at java.lang.reflect.Method.invoke(Method.java:498
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.
at org.apache.spark.repl.SparkIMain$Request.loadAn
at org.apache.spark.repl.SparkIMain.loadAndRunReq$
at org.apache.spark.repl.SparkIMain.interpret(Spar
at org.apache.spark.repl.SparkIMain.interpret(Spar
at org.apache.spark.repl.SparkILoop.reallyInterpre
at org.apache.spark.repl.SparkILoop.interpretStart
at org.apache.spark.repl.SparkILoop.command(SparkI
at org.apache.spark.repl.SparkILoopInit$$anonfun$i
at org.apache.spark.repl.SparkILoopInit$$anonfun$i
at org.apache.spark.repl.SparkIMain.beQuietDuring(
at org.apache.spark.repl.SparkILoopInit$class.init
at org.apache.spark.repl.SparkILoop.initializeSpar
at org.apache.spark.repl.SparkILoop$$anonfun$org$a ... (SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runT
at org.apache.spark.repl.SparkILoop.runThunks(Spar
at org.apache.spark.repl.SparkILoopInit$class.post
at org.apache.spark.repl.SparkILoop.postInitializa
at org.apache.spark.repl.SparkILoop$$anonfun$org$a
at org.apache.spark.repl.SparkILoop$$anonfun$org$a
at org.apache.spark.repl.SparkILoop$$anonfun$org$a
at scala.tools.nsc.util.ScalaClassLoader$.savingCo
at org.apache.spark.repl.SparkILoop.org$apache$spa
at org.apache.spark.repl.SparkILoop.process(SparkI
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Na
at sun.reflect.NativeMethodAccessorImpl.invoke(Nat
at sun.reflect.DelegatingMethodAccessorImpl.invoke
at java.lang.reflect.Method.invoke(Method.java:498
at org.apache.spark.deploy.SparkSubmit$.org$apache
at org.apache.spark.deploy.SparkSubmit$.doRunMain$
at org.apache.spark.deploy.SparkSubmit$.submit(Spa
at org.apache.spark.deploy.SparkSubmit$.main(Spark
at org.apache.spark.deploy.SparkSubmit.main(SparkS
java.lang.NullPointerException
at org.apache.spark.SparkContext.<init>(SparkConte
at org.apache.spark.repl.SparkILoop.createSparkCon
at $iwC$$iwC.<init>(<console>:15)
at $iwC.<init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Na
at sun.reflect.NativeMethodAccessorImpl.invoke(Nat
at sun.reflect.DelegatingMethodAccessorImpl.invoke
at java.lang.reflect.Method.invoke(Method.java:498
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.
at org.apache.spark.repl.SparkIMain$Request.loadAn
at org.apache.spark.repl.SparkIMain.loadAndRunReq$
at org.apache.spark.repl.SparkIMain.interpret(Spar
at org.apache.spark.repl.SparkIMain.interpret(Spar
at org.apache.spark.repl.SparkILoop.reallyInterpre
at org.apache.spark.repl.SparkILoop.interpretStart
at org.apache.spark.repl.SparkILoop.command(SparkI
at org.apache.spark.repl.SparkILoopInit$$anonfun$i
at org.apache.spark.repl.SparkILoopInit$$anonfun$i
at org.apache.spark.repl.SparkIMain.beQuietDuring(
at org.apache.spark.repl.SparkILoopInit$class.init
at org.apache.spark.repl.SparkILoop.initializeSpar
at org.apache.spark.repl.SparkILoop$$anonfun$org$a ... (SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runT
at org.apache.spark.repl.SparkILoop.runThunks(Spar
at org.apache.spark.repl.SparkILoopInit$class.post
at org.apache.spark.repl.SparkILoop.postInitializa
at org.apache.spark.repl.SparkILoop$$anonfun$org$a
at org.apache.spark.repl.SparkILoop$$anonfun$org$a
at org.apache.spark.repl.SparkILoop$$anonfun$org$a
at scala.tools.nsc.util.ScalaClassLoader$.savingCo
at org.apache.spark.repl.SparkILoop.org$apache$spa
at org.apache.spark.repl.SparkILoop.process(SparkI
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Na
at sun.reflect.NativeMethodAccessorImpl.invoke(Nat
at sun.reflect.DelegatingMethodAccessorImpl.invoke
at java.lang.reflect.Method.invoke(Method.java:498
at org.apache.spark.deploy.SparkSubmit$.org$apache
at org.apache.spark.deploy.SparkSubmit$.doRunMain$
at org.apache.spark.deploy.SparkSubmit$.submit(Spa
at org.apache.spark.deploy.SparkSubmit$.main(Spark
at org.apache.spark.deploy.SparkSubmit.main(SparkS

java.lang.NullPointerException
at org.apache.spark.sql.SQLContext$.createListener
at org.apache.spark.sql.hive.HiveContext.<init>(Hi
at sun.reflect.NativeConstructorAccessorImpl.newIn
at sun.reflect.NativeConstructorAccessorImpl.newIn
at sun.reflect.DelegatingConstructorAccessorImpl.n
at java.lang.reflect.Constructor.newInstance(Const
at org.apache.spark.repl.SparkILoop.createSQLConte
at $iwC$$iwC.<init>(<console>:15)
at $iwC.<init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Na
at sun.reflect.NativeMethodAccessorImpl.invoke(Nat
at sun.reflect.DelegatingMethodAccessorImpl.invoke
at java.lang.reflect.Method.invoke(Method.java:498
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.
at org.apache.spark.repl.SparkIMain$Request.loadAn
at org.apache.spark.repl.SparkIMain.loadAndRunReq$
at org.apache.spark.repl.SparkIMain.interpret(Spar
at org.apache.spark.repl.SparkIMain.interpret(Spar
at org.apache.spark.repl.SparkILoop.reallyInterpre
at org.apache.spark.repl.SparkILoop.interpretStart
at org.apache.spark.repl.SparkILoop.command(SparkI
at org.apache.spark.repl.SparkILoopInit$$anonfun$i
at org.apache.spark.repl.SparkILoopInit$$anonfun$i
at org.apache.spark.repl.SparkIMain.beQuietDuring(
at org.apache.spark.repl.SparkILoopInit$class.init
at org.apache.spark.repl.SparkILoop.initializeSpar
at org.apache.spark.repl.SparkILoop$$anonfun$org$a ... (SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runT
at org.apache.spark.repl.SparkILoop.runThunks(Spar
at org.apache.spark.repl.SparkILoopInit$class.post
at org.apache.spark.repl.SparkILoop.postInitializa
at org.apache.spark.repl.SparkILoop$$anonfun$org$a
at org.apache.spark.repl.SparkILoop$$anonfun$org$a
at org.apache.spark.repl.SparkILoop$$anonfun$org$a
at scala.tools.nsc.util.ScalaClassLoader$.savingCo
at org.apache.spark.repl.SparkILoop.org$apache$spa
at org.apache.spark.repl.SparkILoop.process(SparkI
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Na
at sun.reflect.NativeMethodAccessorImpl.invoke(Nat
at sun.reflect.DelegatingMethodAccessorImpl.invoke
at java.lang.reflect.Method.invoke(Method.java:498
at org.apache.spark.deploy.SparkSubmit$.org$apache
at org.apache.spark.deploy.SparkSubmit$.doRunMain$
at org.apache.spark.deploy.SparkSubmit$.submit(Spa
at org.apache.spark.deploy.SparkSubmit$.main(Spark
at org.apache.spark.deploy.SparkSubmit.main(SparkS

<console>:16: error: not found: value sqlContext
         import sqlContext.implicits._
                ^
<console>:16: error: not found: value sqlContext
         import sqlContext.sql
                ^
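
My guess is that the first ERROR means the application never managed to register with the standalone master, and everything after it (the NullPointerException during SparkContext initialization, then the missing sqlContext) is just fallout from the SparkContext failing to come up. Is the problem simply that my --master URL is incomplete? For reference, this is the shape I believe the invocation and sanity checks should take - hadoop01 and the ports below are placeholders for my actual master host and the Spark defaults, not verified against my cluster:

./spark-shell --master spark://hadoop01:7077

jps                          # check that the standalone Master JVM is actually running
curl http://hadoop01:8080    # standalone master web UI (8080 is the default port)

Any pointers appreciated - still stuck on this.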