1. WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Solution
To suppress the warning, edit hadoop/etc/hadoop/log4j.properties and add:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
2. Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try. (Nodes: current=[DatanodeInfoWithStorage[192.168.1.125:50010,DS-c41f5f60-fb7e-4afd-814e-d4ee05623630,DISK], DatanodeInfoWithStorage[192.168.1.1
Solution
Add the following properties to hdfs-site.xml. Setting the policy to NEVER stops the client from trying to replace a failed datanode in the write pipeline, which suits small clusters (three or fewer datanodes) where no replacement node is available:
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
  <value>NEVER</value>
</property>
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.enable</name>
  <value>true</value>
</property>
3. Download error 1: calling the Hadoop API from Java throws java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
Solution
Unpack the Hadoop .tar.gz on the Windows machine, set the environment variable HADOOP_HOME to E:\hadoop-3.2.0\hadoop-3.2.0, then restart Eclipse.
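If you cannot (or prefer not to) set a system-wide environment variable and restart the IDE, the same location can be supplied programmatically through the hadoop.home.dir system property, provided it is set before any Hadoop filesystem class is loaded. A minimal sketch, assuming the same E:\hadoop-3.2.0\hadoop-3.2.0 unpack location; the class and method names are illustrative, not a Hadoop API:

```java
public class HadoopHomeSetup {

    // Must run before the first use of org.apache.hadoop.fs.FileSystem,
    // because Hadoop's Shell class reads hadoop.home.dir only once,
    // when it is first loaded.
    public static void configureHadoopHome(String home) {
        System.setProperty("hadoop.home.dir", home);
    }

    public static void main(String[] args) {
        configureHadoopHome("E:\\hadoop-3.2.0\\hadoop-3.2.0");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Call this at the very start of main, before constructing a Configuration or FileSystem.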
4. Download error 2: java.io.FileNotFoundException: Could not locate Hadoop executable: E:\hadoop-3.2.0\hadoop-3.2.0\bin\winutils.exe
Solution
Unpack a Windows build of Hadoop on the Windows machine and place winutils.exe into its bin directory, then run again.
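A quick way to confirm the layout before launching the program again is to check for the binary directly. A small sketch; the path helper is hypothetical, not part of any Hadoop API:

```java
import java.io.File;

public class WinutilsCheck {

    // Hypothetical helper: builds the path Hadoop probes for winutils.exe
    // under a given HADOOP_HOME directory.
    public static String winutilsPath(String hadoopHome) {
        return hadoopHome + File.separator + "bin" + File.separator + "winutils.exe";
    }

    public static void main(String[] args) {
        String home = System.getenv("HADOOP_HOME");
        if (home == null) {
            System.out.println("HADOOP_HOME is not set");
        } else if (!new File(winutilsPath(home)).isFile()) {
            System.out.println("winutils.exe is missing under " + home);
        } else {
            System.out.println("winutils.exe found");
        }
    }
}
```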
Running a MapReduce job fails with:
Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
Solution
Run hadoop classpath on the command line to obtain the classpath, then add it to yarn-site.xml:
<property>
  <name>yarn.application.classpath</name>
  <value>/usr/local/webserver/hadoop-3.2.0/etc/hadoop,/usr/local/webserver/hadoop-3.2.0/share/hadoop/common/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/common/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/hdfs,/usr/local/webserver/hadoop-3.2.0/share/hadoop/hdfs/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/hdfs/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/mapreduce/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/mapreduce/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/yarn,/usr/local/webserver/hadoop-3.2.0/share/hadoop/yarn/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/yarn/*</value>
</property>
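Another commonly documented fix for this error on Hadoop 3.x is to point the MapReduce ApplicationMaster and tasks at the MapReduce install via mapred-site.xml. A sketch using the same install path as the cluster above; adjust HADOOP_MAPRED_HOME to your own layout:

```xml
<property>
  <name>yarn.app.mapreduce.am.env</name>
  <value>HADOOP_MAPRED_HOME=/usr/local/webserver/hadoop-3.2.0</value>
</property>
<property>
  <name>mapreduce.map.env</name>
  <value>HADOOP_MAPRED_HOME=/usr/local/webserver/hadoop-3.2.0</value>
</property>
<property>
  <name>mapreduce.reduce.env</name>
  <value>HADOOP_MAPRED_HOME=/usr/local/webserver/hadoop-3.2.0</value>
</property>
```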
5. org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot delete. The NameNode is in safe mode, so delete operations are rejected.
Solution
hadoop dfsadmin -safemode leave (on current Hadoop versions the preferred form is hdfs dfsadmin -safemode leave)