python - PySpark installation error

I have installed pyspark on my laptop following the instructions in various blog posts, including this, this, this, and this. However, whenever I try to use pyspark from the terminal or from a Jupyter notebook, I keep getting the error below.

I have installed all the required software, as listed at the bottom of this question.

I have added the following to my .bashrc:

function sjupyter_init()
{
    # Set anaconda3 as python
    export PATH=~/anaconda3/bin:$PATH

    # Spark path (based on your computer)
    SPARK_HOME=/opt/spark
    export PATH=$SPARK_HOME:$PATH
    export PYTHONPATH=$SPARK_HOME/python:/home/khurram/anaconda3/bin/python3

    export PYSPARK_DRIVER_PYTHON="jupyter"
    export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
    export PYSPARK_PYTHON=python3
}
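For reference, the same environment can also be set from Python itself (for example at the top of a notebook) instead of relying on .bashrc. A rough sketch of the equivalent setup, using the paths from my machine (adjust them for yours):

```python
import os
import sys

# Python-side equivalent of the sjupyter_init shell function above.
# The paths below are copied from my setup and are an assumption for
# any other machine.
SPARK_HOME = "/opt/spark"
os.environ["SPARK_HOME"] = SPARK_HOME
os.environ["PYSPARK_PYTHON"] = "python3"
os.environ["PYSPARK_DRIVER_PYTHON"] = "jupyter"
os.environ["PYSPARK_DRIVER_PYTHON_OPTS"] = "notebook"

# Make Spark's bundled Python bindings importable, mirroring the
# PYTHONPATH export in the shell function.
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))
```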

I run sjupyter_init from a terminal and then jupyter notebook to start a Jupyter notebook with pyspark.

In the notebook, the following executes without error:

import findspark
findspark.init('/opt/spark')
from pyspark.sql import SparkSession

But when I execute the following

spark = SparkSession.builder.appName("test").getOrCreate()

it fails with this error message:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/01/20 17:10:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Traceback (most recent call last):
  File "", line 1, in
  File "/opt/spark/python/pyspark/sql/session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/opt/spark/python/pyspark/context.py", line 334, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/opt/spark/python/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "/opt/spark/python/pyspark/context.py", line 180, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/opt/spark/python/pyspark/context.py", line 273, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/home/khurram/anaconda3/lib/python3.6/site-packages/py4j/java_gateway.py", line 1428, in __call__
    answer, self._gateway_client, None, self._fqn)
  File "/home/khurram/anaconda3/lib/python3.6/site-packages/py4j/protocol.py", line 320, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.ExceptionInInitializerError
	at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:546)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:373)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:236)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
	at py4j.GatewayConnection.run(GatewayConnection.java:214)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.UnknownHostException: linux-0he7: linux-0he7: Name or service not known
	at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
	at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:891)
	at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:884)
	at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:884)
	at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:941)
	at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:941)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.util.Utils$.localHostName(Utils.scala:941)
	at org.apache.spark.internal.config.package$.<init>(package.scala:204)
	at org.apache.spark.internal.config.package$.<clinit>(package.scala)
	... 14 more
Caused by: java.net.UnknownHostException: linux-0he7: Name or service not known
	at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
	at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
	at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
	at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
	... 23 more
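Judging from the innermost frames, Spark is failing while resolving the machine's hostname (linux-0he7). One workaround sketch I have seen suggested, untested here, is to point Spark at an explicit local address so it never needs that lookup; SPARK_LOCAL_IP is Spark's documented environment variable for this, and spark.driver.host / spark.driver.bindAddress are the per-session configuration keys:

```python
import os

# Workaround sketch (assumption, not a confirmed fix): bind the driver
# to an explicit address instead of letting Spark resolve the hostname.
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"

# Equivalent per-session configuration keys, to be passed via
# SparkSession.builder.config(...) when building the session:
driver_conf = {
    "spark.driver.host": "127.0.0.1",
    "spark.driver.bindAddress": "127.0.0.1",
}
```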

My system details are:

OS:
OpenSuse Leap 42.2 64-bit

Java:
khurram@linux-0he7:~> java -version
openjdk version "1.8.0_151"

Scala:
khurram@linux-0he7:~> scala -version
Scala code runner version 2.12.4 -- Copyright 2002-2017, LAMP/EPFL and Lightbend, Inc.

Hadoop 3.0:
khurram@linux-0he7:~> echo $HADOOP_HOME
/opt/hadoop

Py4J:
khurram@linux-0he7:~> pip show py4j
Name: py4j
Version: 0.10.6
Summary: Enables Python programs to dynamically access arbitrary Java objects
Home-page: https://www.py4j.org/
Author: Barthelemy Dagenais
Author-email: barthelemy@infobart.com
License: BSD License
Location: /home/khurram/anaconda3/lib/python3.6/site-packages
Requires:
khurram@linux-0he7:~>

I have done chmod 777 on the Hadoop and Spark directories:

khurram@linux-0he7:~> ls -al /opt/
total 8
drwxr-xr-x 1 root root 96 Jan 19 20:22 .
drwxr-xr-x 1 root root 222 Jan 20 14:54 ..
lrwxrwxrwx 1 root root 18 Jan 19 20:22 hadoop -> /opt/hadoop-3.0.0/
drwxrwxrwx 1 khurram users 126 Dec 8 19:42 hadoop-3.0.0
lrwxrwxrwx 1 root root 30 Jan 19 19:40 spark -> /opt/spark-2.2.1-bin-hadoop2.7
drwxrwxrwx 1 khurram users 150 Jan 19 19:33 spark-2.2.1-bin-hadoop2.7
khurram@linux-0he7:~>

Contents of the hosts file:

khurram@linux-0he7:~> cat /etc/hosts
127.0.0.1 localhost
# special IPv6 addresses
::1 localhost ipv6-localhost ipv6-loopback
fe00::0 ipv6-localnet
ff00::0 ipv6-mcastprefix
ff02::1 ipv6-allnodes
ff02::2 ipv6-allrouters
ff02::3 ipv6-allhosts
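Note that the hosts file above maps localhost but has no entry for the machine's own hostname, linux-0he7, which is the name the UnknownHostException complains about. A small standard-library sketch of roughly what Java's InetAddress.getLocalHost() is attempting:

```python
import socket

def hostname_resolves(name: str) -> bool:
    """Return True if `name` can be resolved to an IP address."""
    try:
        socket.gethostbyname(name)
        return True
    except socket.gaierror:
        return False

# "localhost" has an /etc/hosts entry, so it resolves; with the hosts
# file shown above, "linux-0he7" would not, which appears to be exactly
# what Spark trips over when creating the SparkContext.
print(hostname_resolves("localhost"))
```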
