How to fix the error raised when calling Spark via pyspark in PyCharm

  1. Problem: PyCharm cannot launch Spark via pyspark. The run log ends with the following error:
    2019-10-16 20:39:09,343 | Dummy-1:22492 | django.db.backends:90 | utils:execute | DEBUG | (0.000) SELECT @@SQL_AUTO_IS_NULL; args=None
    2019-10-16 20:39:09,344 | Dummy-1:22492 | django.db.backends:90 | utils:execute | DEBUG | (0.000) SELECT VERSION(); args=None
    System check identified no issues (0 silenced).
    2019-10-16 20:39:09,923 | Dummy-1:22492 | django.db.backends:90 | utils:execute | DEBUG | (0.003) SHOW FULL TABLES; args=None
    2019-10-16 20:39:09,927 | Dummy-1:22492 | django.db.backends:90 | utils:execute | DEBUG | (0.002) SELECT django_migrations.app, django_migrations.name FROM django_migrations; args=()
    October 16, 2019 - 20:39:09
    Django version 1.10.1, using settings 'mysite.settings'
    Starting development server at http://0.0.0.0:8001/
    Quit the server with CTRL-BREAK.
    2019-10-16 20:39:22,058 | Thread-4:15288 | django.db.backends:90 | utils:execute | DEBUG | (0.000) SELECT @@SQL_AUTO_IS_NULL; args=None
    2019-10-16 20:39:22,058 | Thread-4:15288 | django.db.backends:90 | utils:execute | DEBUG | (0.000) SELECT VERSION(); args=None
    2019-10-16 20:39:22,070 | Thread-4:15288 | django.db.backends:90 | utils:execute | DEBUG | (0.000) SELECT django_session.session_key, django_session.session_data, django_session.expire_date FROM django_session WHERE (django_session.session_key = 'k4micifggqpl2xjc4wapt12wwq0j12b3' AND django_session.expire_date > '2019-10-16 20:39:22'); args=('k4micifggqpl2xjc4wapt12wwq0j12b3', '2019-10-16 20:39:22')
    2019-10-16 20:39:22,073 | Thread-4:15288 | django.db.backends:90 | utils:execute | DEBUG | (0.001) SELECT auth_user.id, auth_user.password, auth_user.last_login, auth_user.is_superuser, auth_user.username, auth_user.first_name, auth_user.last_name, auth_user.email, auth_user.is_staff, auth_user.is_active, auth_user.date_joined FROM auth_user WHERE auth_user.id = 1; args=(1,)
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    19/10/16 20:39:22 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[main,5,main]
    java.util.NoSuchElementException: key not found: _PYSPARK_DRIVER_CALLBACK_HOST
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at org.apache.spark.api.python.PythonGatewayServer$$anonfun$main$1.apply$mc…
    …
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    19/10/16 20:39:22 INFO ShutdownHookManager: Shutdown hook called
    2019-10-16 20:39:23,355 | Thread-4:15288 | django.server:131 | basehttp:log_message | ERROR | "GET /cif/cifmas_echart_0002/ HTTP/1.1" 500 102233

  2. Solution: in the PyCharm run configuration, add an environment variable that points to the Spark installation directory (i.e. set SPARK_HOME). The "key not found: _PYSPARK_DRIVER_CALLBACK_HOST" error typically means the pyspark package your interpreter imports does not match the Spark distribution being launched, so make sure the variable points at the matching Spark install. The same fix can also be applied in code, as shown in the sketch after the screenshots below.
    [screenshots: adding the environment variable in the PyCharm run configuration]
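    For reference, here is a minimal sketch of the same fix applied in code. It assumes a hypothetical Spark install at /opt/spark (substitute your own path) and sets the variables before pyspark is imported, mirroring what the PyCharm environment-variable setting does:

    import os

    # Hypothetical path to the Spark distribution; replace with your own.
    # This mirrors the variable added in the PyCharm run configuration.
    os.environ["SPARK_HOME"] = "/opt/spark"
    # Keep the driver and the workers on the same Python interpreter.
    os.environ["PYSPARK_PYTHON"] = "python"

    # Import pyspark only after SPARK_HOME is set.
    import pyspark
    from pyspark.sql import SparkSession

    # Sanity check: this version should match the Spark install above,
    # since a version mismatch is what typically triggers the
    # _PYSPARK_DRIVER_CALLBACK_HOST error.
    print(pyspark.__version__)

    spark = (
        SparkSession.builder
        .master("local[*]")
        .appName("pycharm-smoke-test")
        .getOrCreate()
    )
    print(spark.range(5).count())  # prints 5 if the driver came up cleanly
    spark.stop()

    Setting the variable in the Run/Debug configuration has the same effect while keeping machine-specific paths out of the source; the in-code version is mainly useful for quickly confirming that a wrong or missing SPARK_HOME is the cause.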
