Problems encountered while installing Spark

Installed Scala, Spark (pre-built with Hadoop), and Hadoop.
Running Spark keeps producing errors; the cause may be the Spark configuration files.

PS C:\BigData\spark-2.4.3-bin-hadoop2.7\bin> pyspark
Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)] :: Anaconda, Inc. on win32

Warning:
This Python interpreter is in a conda environment, but the environment has
not been activated. Libraries may fail to load. To activate this environment
please see https://conda.io/activation

Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19/08/01 16:14:48 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Invalid Spark URL: spark://[email protected]:51129
at org.apache.spark.rpc.RpcEndpointAddress$.apply(RpcEndpointAddress.scala:66)
at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:134)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:32)
at org.apache.spark.executor.Executor.<init>(Executor.scala:184)
at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:59)
at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:127)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:183)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:238)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
19/08/01 16:14:48 ERROR Utils: Uncaught exception in thread Thread-3
java.lang.NullPointerException
at org.apache.spark.scheduler.local.LocalSchedulerBackend.org$apache$spark$scheduler$local$LocalSchedulerBackend$$stop(LocalSche
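
The "Invalid Spark URL" failure above is commonly reported on Windows when the machine's hostname contains characters that Spark's RPC layer rejects (for example an underscore). A frequently suggested workaround is to make the driver advertise localhost instead, either by setting the SPARK_LOCAL_HOSTNAME environment variable or through the driver configuration. The snippet below is a minimal sketch of the configuration-based option; the app name and master setting are placeholders, not part of the original log.

```python
from pyspark.sql import SparkSession

# Sketch of a possible workaround: force the driver to advertise/bind to
# localhost so the RPC URL (spark://...@<host>:<port>) stays valid even if
# the Windows hostname contains characters Spark rejects.
spark = (
    SparkSession.builder
    .master("local[*]")                              # local mode, as in the pyspark shell
    .appName("url-workaround-check")                 # placeholder name
    .config("spark.driver.host", "localhost")        # host advertised in the RPC URL
    .config("spark.driver.bindAddress", "127.0.0.1") # address the driver actually binds to
    .getOrCreate()
)

print(spark.range(10).count())  # tiny action to confirm the SparkContext started
spark.stop()
```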

To install Apache Spark on Windows, you can follow these steps:

1. Visit the Apache Spark download page and choose a download link [1]. After the download finishes, extract the archive to the directory where you want to install Spark.
2. Open a command prompt and change into the bin folder under the Spark installation directory, for example: cd %SPARK_HOME%/bin
3. Run the spark-shell command to start the Apache Spark shell [2]. You should see some startup output; warnings printed at the end can usually be ignored. (A Python smoke test of the same installation is sketched after the references below.)

If you use IntelliJ IDEA as your development environment, you can also install the Scala plugin to work with Spark more conveniently [3]. Start IntelliJ IDEA, open Configuration on the welcome screen, choose Plugins, then search for and install the Scala plugin. If the plugin cannot be found, you may need to configure a proxy: under Install JetBrains plugin... open HTTP Proxy Settings.

#### References

- [1] [2] 在Windows上的安装 Spark: https://blog.csdn.net/lengyudexin/article/details/128474828
- [3] spark踩坑记——windows环境下spark安装和运行: https://blog.csdn.net/hongxingabc/article/details/81565174
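
To check the unpacked installation from Python as well (rather than only via spark-shell), the following sketch can be used. The path is an assumption based on the directory shown in the log above, and findspark is an optional helper package (pip install findspark) that puts the bundled pyspark on sys.path; it is not part of the steps quoted above.

```python
import os

# Assumed install location -- adjust to wherever the archive was extracted.
os.environ["SPARK_HOME"] = r"C:\BigData\spark-2.4.3-bin-hadoop2.7"
# Optional: avoid the "Invalid Spark URL" problem described earlier.
os.environ.setdefault("SPARK_LOCAL_HOSTNAME", "localhost")

import findspark
findspark.init()  # makes the pyspark bundled with SPARK_HOME importable

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[*]")
    .appName("install-check")    # placeholder name
    .getOrCreate()
)
print(spark.range(100).count())  # should print 100 on a working install
spark.stop()
```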
