spark-submit Python job: "/home/.python-eggs" Permission denied, and how to fix it

Problem description: a Python script submitted with spark-submit in YARN mode fails as soon as it reaches the distributed part of the job, i.e., when a map/mapPartitions transformation (or an RDD action) actually runs on the executors, with the following error:

Traceback (most recent call last):
  File "/usr/lib64/python2.7/runpy.py", line 151, in _run_module_as_main
    mod_name, loader, code, fname = _get_module_details(mod_name)
  File "/usr/lib64/python2.7/runpy.py", line 101, in _get_module_details
    loader = get_loader(mod_name)
  File "/usr/lib64/python2.7/pkgutil.py", line 464, in get_loader
    return find_loader(fullname)
  File "/usr/lib64/python2.7/pkgutil.py", line 474, in find_loader
    for importer in iter_importers(fullname):
  File "/usr/lib64/python2.7/pkgutil.py", line 430, in iter_importers
    __import__(pkg)
  File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/__init__.py", line 41, in <module>
  File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/context.py", line 35, in <module>
  File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/rdd.py", line 51, in <module>
  File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/shuffle.py", line 33, in <module>
  File "build/bdist.linux-x86_64/egg/psutil/__init__.py", line 89, in <module>
  File "build/bdist.linux-x86_64/egg/psutil/_pslinux.py", line 24, in <module>
  File "build/bdist.linux-x86_64/egg/_psutil_linux.py", line 7, in <module>
  File "build/bdist.linux-x86_64/egg/_psutil_linux.py", line 4, in __bootstrap__
  File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 945, in resource_filename
    self, resource_name
  File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1633, in get_resource_filename
    self._extract_resource(manager, self._eager_to_zip(name))
  File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1661, in _extract_resource
    self.egg_name, self._parts(zip_path)
  File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1025, in get_cache_path
    self.extraction_error()
  File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 991, in extraction_error
    raise err
pkg_resources.ExtractionError: Can't extract file(s) to egg cache

The following error occurred while trying to extract file(s) to the Python egg
cache:

  [Errno 13] Permission denied: '/home/.python-eggs'

The Python egg cache directory is currently set to:

  /home/.python-eggs

Perhaps your account does not have write access to this directory? You can
change the cache directory by setting the PYTHON_EGG_CACHE environment
variable to point to an accessible directory.
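What is happening: pyspark's shuffle module imports psutil, which on the executors is installed as a zipped egg containing a C extension, and pkg_resources must extract that extension to an on-disk cache before it can be loaded. With no PYTHON_EGG_CACHE set, it falls back to ~/.python-eggs, and inside a YARN container HOME can resolve to /home, which the container user cannot write to. A simplified sketch of that fallback logic (not the exact pkg_resources source):

import os

def default_egg_cache():
    # Sketch of how pkg_resources picks the egg cache directory.
    # An explicit PYTHON_EGG_CACHE always wins; this is the knob all
    # three fixes below turn.
    if 'PYTHON_EGG_CACHE' in os.environ:
        return os.environ['PYTHON_EGG_CACHE']
    # Fallback: ~/.python-eggs. In a YARN container, HOME may expand
    # to '/home', yielding the unwritable '/home/.python-eggs'.
    return os.path.join(os.path.expanduser('~'), '.python-eggs')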

Solutions:

1. Add the following at the top of the function you pass to map/mapPartitions:

import os
os.environ['PYTHON_EGG_CACHE'] = '/tmp/.python-eggs/'
os.environ['PYTHON_EGG_DIR'] = '/tmp/.python-eggs/'
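For context, a minimal sketch of where this snippet sits (process_partition and rdd are illustrative names, not from the original job); the assignments must run before your code imports any egg-packaged dependency:

import os

def process_partition(rows):
    # Point the egg cache at a writable directory before any
    # egg-packaged C extension gets imported in this worker.
    os.environ['PYTHON_EGG_CACHE'] = '/tmp/.python-eggs/'
    os.environ['PYTHON_EGG_DIR'] = '/tmp/.python-eggs/'
    for row in rows:
        yield row  # real per-row logic goes here

result = rdd.mapPartitions(process_partition).collect()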

2. Set the environment variable on every machine in the cluster (recommended):

export PYTHON_EGG_CACHE=/tmp/.python-eggs/
export PYTHON_EGG_DIR=/tmp/.python-eggs/

(e.g., in /etc/profile or in conf/spark-env.sh, so that the executor processes inherit it).
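If logging into every machine is impractical, Spark's spark.executorEnv.<NAME> configuration can push the same variable into each executor process on a per-job basis; a sketch:

from pyspark import SparkConf, SparkContext

# Set the egg-cache override in every executor's environment for this
# job only, instead of configuring each machine by hand.
conf = SparkConf().setAppName("egg-cache-fix")
conf.set("spark.executorEnv.PYTHON_EGG_CACHE", "/tmp/.python-eggs/")
conf.set("spark.executorEnv.PYTHON_EGG_DIR", "/tmp/.python-eggs/")
sc = SparkContext(conf=conf)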

3. Go to your Spark installation root and cd to python/lib; unzip pyspark.zip, cd into the extracted pyspark directory, open rdd.py in vim, find the line "import os", and insert the following right below it (afterwards, re-zip the pyspark directory back into pyspark.zip so the patched file is what gets shipped to the executors):

os.environ['PYTHON_EGG_CACHE'] = '/tmp/.python-eggs/'
os.environ['PYTHON_EGG_DIR'] = '/tmp/.python-eggs/'

If none of the three approaches solves the problem, first submit a Python executable through Hadoop Streaming to verify that YARN can run Python jobs at all.
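Something like the identity mapper below is enough for that test (the streaming jar path varies by Hadoop distribution; adjust it for yours):

#!/usr/bin/env python
# mapper.py -- trivial identity mapper, just to prove the YARN nodes
# can launch a Python process. Submit it with something like:
#   hadoop jar /path/to/hadoop-streaming.jar \
#       -input /tmp/streaming_test_in -output /tmp/streaming_test_out \
#       -mapper mapper.py -file mapper.py
import sys

for line in sys.stdin:
    sys.stdout.write(line)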

Then check whether you can submit the Python job under Spark's standalone mode instead.
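A tiny job that forces a distributed stage works as the probe for either mode (a hypothetical smoke_test.py, submitted e.g. with spark-submit --master spark://<master>:7077 smoke_test.py):

# smoke_test.py -- forces a distributed computation, exercising the
# Python worker path without any application logic.
from pyspark import SparkContext

sc = SparkContext(appName="python-smoke-test")
total = sc.parallelize(range(1000), 10).map(lambda x: x * 2).sum()
print(total)  # expect 999000
sc.stop()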

That's all.

If the problem still persists after all that, the only thing left is to write to the Spark developer mailing list.
