1. The error
The error output is as follows:
Collecting llama
Using cached https://dev.cmri.cn/nexus/repository/pypi/packages/llama/0.1.1/llama-0.1.1.tar.gz (387 kB)
ERROR: Command errored out with exit status 1:
command: /opt/conda/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-wzdqtbal/llama/setup.py'"'"'; __file__='"'"'/tmp/pip-install-wzdqtbal/llama/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-install-wzdqtbal/llama/pip-egg-info
cwd: /tmp/pip-install-wzdqtbal/llama/
Complete output (5 lines):
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmp/pip-install-wzdqtbal/llama/setup.py", line 6, in <module>
execfile('llama/version.py')
NameError: name 'execfile' is not defined
----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
This happens because Python 3 does not support execfile; it was a Python 2 built-in. The execfile call needs to be replaced with exec.
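A minimal sketch of the Python 2 to Python 3 difference (the version.py contents below are a stand-in created just for the demonstration):

```python
import os
import tempfile

# Python 2 offered execfile('version.py'), which executed a file's contents
# in the current namespace. Python 3 removed it; the usual replacement is
# exec(open(path).read()).

# Create a stand-in version.py for the demonstration.
with tempfile.TemporaryDirectory() as tmpdir:
    path = os.path.join(tmpdir, "version.py")
    with open(path, "w") as fh:
        fh.write("__version__ = '0.1.1'\n")

    namespace = {}
    exec(open(path).read(), namespace)  # Python 3 replacement for execfile(path)
    print(namespace["__version__"])
```

Passing an explicit namespace dict keeps the executed file's names out of your module globals; setup.py scripts often omit it so that `__version__` lands directly in the script's own namespace.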
2. Solution
Download the package in question. In my case it was llama, which can be fetched like this:
wget https://dev.cmri.cn/nexus/repository/pypi/packages/llama/0.1.1/llama-0.1.1.tar.gz
Extract the archive:
tar -zxf llama-0.1.1.tar.gz
Find the file named in the traceback (setup.py here) and replace the execfile call with exec, like this:
#!/usr/bin/env python
import os
from setuptools import find_packages, setup

exec(open('llama/version.py').read())  # replaces the original execfile('llama/version.py')

with open('requirements.txt') as fh:
    required = fh.read().splitlines()

setup(
    name='llama',
    version=str(__version__),
    description='LLAMA - Loss & LAtency MAtrix',
    url='https://github.com/dropbox/llama',
    author='Bryan Reed',
    maintainer='Daniel Martin',
    author_email='breed@dropbox.com',
    maintainer_email='dmar@dropbox.com',
    license='Apache',
    classifiers=[
        'Development Status :: 1 - Planning',
        'Intended Audience :: System Administrators',
        'License :: OSI Approved :: Apache Software License',
        'Operating System :: POSIX :: Linux',
        'Programming Language :: Python :: 2.7',
        'Topic :: System :: Networking :: Monitoring',
    ],
    keywords='llama udp loss latency matrix probe packet',
    scripts=['bin/llama_collector'],
    packages=find_packages(exclude=['docs', 'tests*']),
    include_package_data=True,
    zip_safe=False,
    install_requires=required,
)
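The same one-line fix can also be scripted instead of edited by hand; a sketch using GNU sed's -i option (the demo file below is a stand-in for the real setup.py, so the command can be shown end to end):

```shell
# Stand-in for the downloaded setup.py, containing the offending call:
printf "execfile('llama/version.py')\n" > demo_setup.py

# Swap the Python 2 execfile call for the Python 3 equivalent, in place:
sed -i "s|execfile('llama/version.py')|exec(open('llama/version.py').read())|" demo_setup.py

cat demo_setup.py
```

Using | as the sed delimiter avoids having to escape the slashes in the file path.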
Then run the following command in the package directory:
pip install -e .
3. Reference
https://www.jiyik.com/tm/xwzj/prolan_6521.html