Hadoop is set up. Connecting to HDFS from Python works, but creating a directory fails.
from hdfs.client import Client
HDFSHOST = "http://192.168.216.132:50070"
client = Client(HDFSHOST)
# list the entries under the root directory
print(client.list('/'))
# permission must be an octal string, e.g. '755'
client.makedirs('/tmp', permission='755')
Error output:
['data']
Traceback (most recent call last):
File "C:/Users/Administrator/PycharmProjects/pythonProject/main.py", line 6, in <module>
client.makedirs('/tmp',permission=755)
File "D:\yz\Anaconda3\lib\site-packages\hdfs\client.py", line 1029, in makedirs
self._mkdirs(hdfs_path, permission=permission)
File "D:\yz\Anaconda3\lib\site-packages\hdfs\client.py", line 118, in api_handler
raise err
hdfs.util.HdfsError: Permission denied: user=dr.who, access=WRITE, inode="/":root:supergroup:drwxr-xr-x
Browsing directories works fine, but any create or upload fails with:
Permission denied: user=dr.who, access=WRITE, inode="/":root:supergroup:drwxr-xr-x
The cause: WebHDFS runs the request as the static default user dr.who, and dr.who has no write permission on the target inode.
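To see why dr.who is rejected, read the inode listing in the error: "/" is owned by root:supergroup with mode drwxr-xr-x, so only the owner may write; dr.who is neither root nor in supergroup, and the "other" bits r-x lack w. A minimal sketch of that permission check (the helper is illustrative, not part of any HDFS API):

```python
def unix_can_write(mode, user, owner, group, user_groups):
    """Pick the owner/group/other triplet of a mode string like 'rwxr-xr-x'
    that applies to `user`, then test the write bit, the way HDFS does."""
    if user == owner:
        bits = mode[0:3]   # owner triplet
    elif group in user_groups:
        bits = mode[3:6]   # group triplet
    else:
        bits = mode[6:9]   # other triplet
    return 'w' in bits

# inode "/" is root:supergroup drwxr-xr-x (mode string without the leading 'd')
print(unix_can_write('rwxr-xr-x', 'dr.who', 'root', 'supergroup', []))  # False
print(unix_can_write('rwxr-xr-x', 'root', 'root', 'supergroup', []))    # True
```

Since dr.who falls through to the "other" triplet r-x, every WRITE operation is denied while reads (like client.list) still succeed.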
Solution: go to the /hadoop/etc/hadoop directory, open core-site.xml with vim, and add the following property:
<property>
<name>hadoop.http.staticuser.user</name>
<value>your-username</value>
</property>
Save and exit, then restart Hadoop so the new configuration takes effect.
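An alternative that avoids editing core-site.xml is to send an explicit user with each request: the hdfs package's InsecureClient accepts a user argument (e.g. InsecureClient(HDFSHOST, user='root')), which it forwards as the user.name query parameter of every WebHDFS call. A sketch that builds the equivalent raw MKDIRS URL by hand, using the host from the snippet above ('root' is an assumption chosen to match the directory owner in the error):

```python
from urllib.parse import urlencode

def mkdirs_url(host, path, user, permission="755"):
    """Build the WebHDFS v1 MKDIRS URL for creating `path` as `user`.

    Without the user.name parameter, WebHDFS falls back to the static
    user dr.who, which is exactly why the write above was denied.
    """
    query = urlencode({"op": "MKDIRS", "user.name": user, "permission": permission})
    return f"http://{host}/webhdfs/v1/{path.lstrip('/')}?{query}"

print(mkdirs_url("192.168.216.132:50070", "/tmp", "root"))
# → http://192.168.216.132:50070/webhdfs/v1/tmp?op=MKDIRS&user.name=root&permission=755
```

Sending a PUT to that URL (or simply using InsecureClient with user='root') creates the directory as root instead of dr.who, so the existing permissions on "/" are satisfied.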