Error when running spark-shell on a single machine

Copyright notice: This is an original article by the blogger, released under the CC 4.0 BY-SA license. Please include the original source link and this notice when reposting.
本文链接:https://blog.csdn.net/dickysun1987/article/details/78829830

After installing Spark, launching spark-shell fails with the following error:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/12/18 08:40:12 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/12/18 08:40:13 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
(the same warning is logged 16 times in total, once per bind attempt)
17/12/18 08:40:13 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
...
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.1
      /_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_151)
Type in expressions to have them evaluated.
Type :help for more information.
scala>

Cause:
The hosts file maps the machine's hostname to the wrong address. When spark-shell starts, the driver tries to bind to the address the hostname resolves to, so a bad mapping makes every bind attempt fail.
Fix:
Edit the /etc/hosts file and add (or correct) the entry for the local hostname.
Original hosts file:

127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6

Check the machine's hostname and IP address:
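For example, on a typical Linux machine the hostname and its addresses can be checked with the following commands (the output shown is for the machine in this post):

# print the local hostname
$ hostname
ls-dj-test-3

# print the machine's IP addresses
$ hostname -I
10.64.42.108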
Then modify the hosts file, adding a line that maps the IP to the hostname:

127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
10.64.42.108 ls-dj-test-3

Rerun spark-shell; it now starts successfully.
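Alternatively, as the error message itself suggests, the bind address can be set explicitly instead of (or in addition to) fixing the hosts file. A minimal sketch, assuming the same address as above; spark.driver.bindAddress is the property named in the error, and SPARK_LOCAL_IP is Spark's standard environment variable for the same purpose:

# Option 1: set the driver bind address on the command line
$ spark-shell --conf spark.driver.bindAddress=10.64.42.108

# Option 2: export SPARK_LOCAL_IP before launching
$ export SPARK_LOCAL_IP=10.64.42.108
$ spark-shell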
