CentOS 7: issuing the spark-shell command fails with java.io.IOException: Could not create FileClient

I am running into a problem when executing the spark-shell command:

[mapr@node1 ~]$ /opt/mapr/spark/spark-2.1.0/bin/spark-shell --master local[2]

Error:

20/05/02 14:21:34 ERROR SparkContext: Error initializing SparkContext.

java.io.IOException: Could not create FileClient

at com.mapr.fs.MapRFileSystem.lookupClient(MapRFileSystem.java:643)

at com.mapr.fs.MapRFileSystem.lookupClient(MapRFileSystem.java:696)

at com.mapr.fs.MapRFileSystem.getMapRFileStatus(MapRFileSystem.java:1405)

at com.mapr.fs.MapRFileSystem.getFileStatus(MapRFileSystem.java:1080)

at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:93)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:531)

at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)

at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)

at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)

at scala.Option.getOrElse(Option.scala:121)

at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)

at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)

at $line3.$read$$iw$$iw.<init>(<console>:15)

at $line3.$read$$iw.<init>(<console>:42)

at $line3.$read.<init>(<console>:44)

at $line3.$read$.<init>(<console>:48)

at $line3.$read$.<clinit>(<console>)

at $line3.$eval$.$print$lzycompute(<console>:7)

at $line3.$eval$.$print(<console>:6)

at $line3.$eval.$print()

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)

at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)

at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)

at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)

at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)

at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)

at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)

at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)

at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)

at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)

at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)

at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)

at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)

at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)

at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)

at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)

at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)

at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)

at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)

at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)

at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)

at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)

at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)

at org.apache.spark.repl.Main$.doMain(Main.scala:68)

at org.apache.spark.repl.Main$.main(Main.scala:51)

at org.apache.spark.repl.Main.main(Main.scala)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:733)

at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:177)

at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:202)

at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:116)

at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Caused by: java.io.IOException: Could not create FileClient

at com.mapr.fs.MapRClientImpl.<init>(MapRClientImpl.java:136)

at com.mapr.fs.MapRFileSystem.lookupClient(MapRFileSystem.java:637)

... 58 more

java.io.IOException: Could not create FileClient

at com.mapr.fs.MapRFileSystem.lookupClient(MapRFileSystem.java:643)

at com.mapr.fs.MapRFileSystem.lookupClient(MapRFileSystem.java:696)

at com.mapr.fs.MapRFileSystem.getMapRFileStatus(MapRFileSystem.java:1405)

at com.mapr.fs.MapRFileSystem.getFileStatus(MapRFileSystem.java:1080)

at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:93)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:531)

at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)

at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)

at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)

at scala.Option.getOrElse(Option.scala:121)

at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)

at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)

... 47 elided

Caused by: java.io.IOException: Could not create FileClient

at com.mapr.fs.MapRClientImpl.<init>(MapRClientImpl.java:136)

at com.mapr.fs.MapRFileSystem.lookupClient(MapRFileSystem.java:637)

... 58 more

<console>:14: error: not found: value spark

import spark.implicits._

^

<console>:14: error: not found: value spark

import spark.sql

^

Welcome to

      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0-mapr-1710
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_242)

Type in expressions to have them evaluated.

Type :help for more information.

scala> 2020-05-02 14:21:27,8616 ERROR Cidcache fs/client/fileclient/cc/cidcache.cc:2470 Thread: 30539 MoveToNextCldb: No CLDB entries, cannot run, sleeping 5 seconds!

2020-05-02 14:21:32,9268 ERROR Client fs/client/fileclient/cc/client.cc:1329 Thread: 30539 Failed to initialize client for cluster MyCluster, error Connection reset by peer(104)
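The two MapR client log lines above ("No CLDB entries" and "Connection reset by peer") indicate that the client cannot reach a CLDB node, which is why `MapRFileSystem` fails with "Could not create FileClient" before Spark finishes starting. A first thing to check is `/opt/mapr/conf/mapr-clusters.conf`. The snippet below is only a sketch of that file's field layout: the cluster name `MyCluster` comes from the error log, while the hostnames and the `secure` flag are assumptions for illustration.

```shell
# Sketch of one /opt/mapr/conf/mapr-clusters.conf entry.
# "MyCluster" is taken from the error log; hosts and the secure flag are assumed.
line='MyCluster secure=false node1:7222 node2:7222'

# Field 1 is the cluster name; fields 3 onward are CLDB host:port endpoints
# (7222 is the default CLDB port the client must be able to reach).
cluster=$(echo "$line" | awk '{print $1}')
first_cldb=$(echo "$line" | awk '{print $3}')
echo "cluster=$cluster first_cldb=$first_cldb"
# -> cluster=MyCluster first_cldb=node1:7222
```

If that file is empty or the listed CLDB hosts are not reachable on port 7222 from the node running spark-shell, the FileClient cannot be created and the session fails exactly as shown.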

I am using spark-2.1.0 on a 3-node MapR cluster.

I also have the following conf file:

# A Spark Worker will be started on each of the machines listed below.

node2

node3

I have also added the following lines to $SPARK_HOME/conf/spark-env.sh:

export SPARK_MASTER_HOST=node1

export SPARK_MASTER_IP=172.17.0.2

export SPARK_DIST_CLASSPATH=$(hadoop classpath)
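One side note on the spark-env.sh lines above: since Spark 2.0, `SPARK_MASTER_IP` is deprecated in favor of `SPARK_MASTER_HOST`, and `172.17.0.2` lies in the default Docker bridge subnet, so if `node1` resolves to a different address the two settings can disagree. A sketch of the same file with the ambiguity removed (assuming `node1` resolves to an address the workers can reach):

```shell
# $SPARK_HOME/conf/spark-env.sh -- sketch; assumes node1 resolves to an
# address reachable from node2/node3. SPARK_MASTER_IP is a deprecated
# Spark 1.x alias of SPARK_MASTER_HOST, so one of the two is enough.
export SPARK_MASTER_HOST=node1
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```

This does not by itself explain the FileClient error, which comes from the MapR client layer, but it removes one possible source of mismatched addresses in the standalone setup.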

Has anyone run into the same problem, or does anyone know how to fix it?
