Prerequisite: the required Hadoop jars are on the project classpath (hadoop-2.5.1/share/hadoop/common/lib/*, hadoop-2.5.1/share/hadoop/common/hadoop-common-2.5.1.jar, hadoop-2.5.1/share/hadoop/hdfs/hadoop-hdfs-2.5.1.jar).
Problem 1: connection exception: java.net.ConnectException: Connection refused: no further information.
Solution: in the Hadoop configuration on the Hadoop host (typically the fs.defaultFS / fs.default.name value in core-site.xml), replace localhost with the host's IP address so the NameNode listens on an address remote clients can reach, then restart HDFS. A connection check from the Java client is sketched below.
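A minimal client-side sketch, assuming the NameNode listens at the hypothetical address 192.168.1.100:9000 (replace with your host's IP and configured port):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsConnectTest {
    public static void main(String[] args) throws Exception {
        // Use the NameNode's real IP, not localhost: if fs.defaultFS on the
        // server is set to localhost, the NameNode only binds to 127.0.0.1
        // and remote clients get "Connection refused".
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.1.100:9000"), conf);

        // Simple liveness check: list the HDFS root directory.
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}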
Problem 2: Cannot create directory /test. Name node is in safe mode.
Solution: on the Hadoop host, take HDFS out of safe mode:
hadoop dfsadmin -safemode leave
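The same can be done from Java if the client runs as the HDFS superuser; a sketch using DistributedFileSystem.setSafeMode (again assuming the hypothetical address 192.168.1.100:9000):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.HdfsConstants.SafeModeAction;

public class LeaveSafeMode {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.1.100:9000"), conf);
        DistributedFileSystem dfs = (DistributedFileSystem) fs;

        // SAFEMODE_GET only queries the state; SAFEMODE_LEAVE asks the
        // NameNode to leave safe mode (requires superuser privileges).
        if (dfs.setSafeMode(SafeModeAction.SAFEMODE_GET)) {
            dfs.setSafeMode(SafeModeAction.SAFEMODE_LEAVE);
            System.out.println("NameNode left safe mode");
        }
        dfs.close();
    }
}

Note that the NameNode enters safe mode automatically at startup and normally leaves it on its own once enough blocks have been reported, so forcing it out is only needed when it stays stuck in safe mode.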
Problem 3: Permission denied: user=DrWho, access=WRITE, inode="hadoop":hadoop:supergroup:rwxr-xr-x
Solution:
On the server, edit the Hadoop configuration file conf/hdfs-site.xml, find the dfs.permissions property, and set its value to false. Alternatively, change the access permissions of the target files or directories on HDFS (for example with hadoop fs -chmod); a client-side alternative is sketched after the property below.
<property>
  <name>dfs.permissions</name>
  <value>false</value>
  <description>
    If "true", enable permission checking in HDFS.
    If "false", permission checking is turned off,
    but all other behavior is unchanged.
    Switching from one parameter value to the other does not change the mode,
    owner or group of files or directories.
  </description>
</property>
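If disabling permission checking is undesirable, another option is to make the Java client talk to HDFS as the user that owns the directory ("hadoop" in the error above). A sketch assuming simple (non-Kerberos) authentication and the hypothetical address 192.168.1.100:9000:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WriteAsHadoopUser {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // The third argument is the user name reported to the NameNode under
        // simple authentication, so permission checks for /test are done
        // against "hadoop" rather than the local OS user (DrWho above).
        FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.1.100:9000"), conf, "hadoop");
        fs.mkdirs(new Path("/test"));
        fs.close();
    }
}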
This article describes how to resolve the connection exception, failed directory creation, and permission-denied errors encountered in a Hadoop project, including editing configuration files, leaving safe mode, and adjusting permission settings.