pyspark TypeError: 'JavaPackage' object is not callable


PySpark fails to initialize when starting the shell.

Problem

Python 3.7.10 (default, Jun  4 2021, 14:48:32)
[GCC 7.5.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
Warning: Ignoring non-spark config property: history.server.spnego.keytab.file=/etc/security/keytabs/spnego.service.keytab
Warning: Ignoring non-spark config property: history.server.spnego.kerberos.principal=HTTP/_HOST@CHINATELECOM.CN
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2021-12-29 16:24:26 WARN  Client:66 - Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
2021-12-29 16:24:33 ERROR SparkContext:91 - Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:89)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:238)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
        at py4j.GatewayConnection.run(GatewayConnection.java:238)
        at java.lang.Thread.run(Thread.java:748)
2021-12-29 16:24:33 WARN  YarnSchedulerBackend$YarnSchedulerEndpoint:66 - Attempted to request executors before the AM has registered!
2021-12-29 16:24:33 WARN  MetricsSystem:66 - Stopping a MetricsSystem that is not running
Traceback (most recent call last):
  File "/home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/python/pyspark/shell.py", line 44, in <module>
    SparkContext._jvm.org.apache.hadoop.hive.conf.HiveConf()
TypeError: 'JavaPackage' object is not callable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/python/pyspark/shell.py", line 59, in <module>
    spark = SparkSession.builder.getOrCreate()
  File "/home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/python/pyspark/sql/session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/python/pyspark/context.py", line 351, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/python/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "/home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/python/pyspark/context.py", line 180, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/python/pyspark/context.py", line 290, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1525, in __call__
  File "/home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:89)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:238)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
        at py4j.GatewayConnection.run(GatewayConnection.java:238)
        at java.lang.Thread.run(Thread.java:748)

Cause

TypeError: 'JavaPackage' object is not callable

This error means a required jar cannot be found on the JVM classpath: when py4j cannot resolve an attribute path to a loaded Java class, it returns a JavaPackage placeholder instead of a JavaClass, and calling that placeholder raises this TypeError.

File "/home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/python/pyspark/shell.py", line 44, in <module>
vim /home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/python/pyspark/shell.py
set number

在这里插入图片描述
明显是缺少hive相关的包
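
To confirm that this really is a classpath problem, you can inspect the attribute directly. The snippet below is a minimal sketch, assuming the py4j gateway is already up (shell.py calls SparkContext._ensure_initialized() before the failing line): a resolvable class comes back as a JavaClass, while an unresolvable path comes back as a JavaPackage placeholder.

from py4j.java_gateway import JavaClass, JavaPackage
from pyspark import SparkContext

# Mirrors the failing call in shell.py; launching the gateway is
# enough, a full SparkContext is not required for this check.
SparkContext._ensure_initialized()
hive_conf = SparkContext._jvm.org.apache.hadoop.hive.conf.HiveConf

if isinstance(hive_conf, JavaClass):
    print("HiveConf is on the JVM classpath")  # jar present, safe to call
elif isinstance(hive_conf, JavaPackage):
    # py4j returns a JavaPackage placeholder for any path it cannot
    # resolve to a loaded class; calling it raises the TypeError above.
    print("HiveConf missing: hive jars are not on the classpath")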

Solution

Put the Hive-related jars into /home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/jars/.
My cluster runs HDP, so I simply replaced the contents of /home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/jars/ with the jars shipped with HDP. A sketch of the copy step follows.
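
As a rough illustration of the copy step (the HDP Hive lib location below is an assumption for an HDP 3.1 install; adjust both paths to your environment):

import glob
import shutil

# Both paths are illustrative assumptions -- adjust to your install.
HDP_HIVE_LIB = "/usr/hdp/3.1.0.0-78/hive/lib"  # typical HDP 3.1 layout
SPARK_JARS = "/home/kehujingyingbu/spark-2.3.2-bin-without-hadoop/jars"

# Copy every hive-*.jar into Spark's jars directory so that
# org.apache.hadoop.hive.conf.HiveConf resolves at shell startup.
for jar in glob.glob(HDP_HIVE_LIB + "/hive-*.jar"):
    shutil.copy(jar, SPARK_JARS)
    print("copied", jar)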

Note

When replacing them with the HDP jars, you must also set the HDP version in spark-env.sh:
export HDP_VERSION=3.1.0.0-78
With that, the problem is solved.
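
As a quick sanity check after restarting the shell (assuming it now starts), the exact call that failed at shell.py line 44 should return a Java object instead of raising:

# Inside a fresh pyspark shell, after the jars are in place.
# This is the call that previously raised the TypeError.
conf = sc._jvm.org.apache.hadoop.hive.conf.HiveConf()
print(type(conf))  # a py4j JavaObject wrapping HiveConf

# Hive support should now work end to end as well:
spark.sql("show databases").show()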
