Fixing "Error: JAVA_HOME is incorrectly set. Please update E:\hadoop\hadoop-2.6.5\conf\hadoop-env.cmd"

After running hdfs namenode -format in the console, Hadoop reports "Error: JAVA_HOME is incorrectly set. Please update E:\hadoop\hadoop-2.6.5\conf\hadoop-env.cmd". The full console output looks like this:

E:\hadoop\hadoop-2.6.5\etc\hadoop>hdfs namenode -format
The system cannot find the path specified.
Error: JAVA_HOME is incorrectly set.
       Please update E:\hadoop\hadoop-2.6.5\conf\hadoop-env.cmd
'-Dhadoop.security.logger' is not recognized as an internal or external command,
operable program or batch file.

First check whether Java itself is installed and reachable: run java -version in the console. Output like the following means the Java installation is fine:

E:\hadoop\hadoop-2.6.5\etc\hadoop>java -version
java version "9.0.1"
Java(TM) SE Runtime Environment (build 9.0.1+11)
Java HotSpot(TM) 64-Bit Server VM (build 9.0.1+11, mixed mode)
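
If java -version works but Hadoop still complains, it is also worth looking at what JAVA_HOME actually contains, because hadoop-env.cmd reads that variable by default. A quick check from the same console (the Program Files path in the comment is only an example of a problematic value, not the setup used in this post):

```batch
echo %JAVA_HOME%
@rem A value such as C:\Program Files\Java\jdk-9.0.1 contains a space,
@rem which the Hadoop batch scripts do not handle reliably.
```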

Next, open the file E:\hadoop\hadoop-2.6.5\etc\hadoop\hadoop-env.cmd (in Hadoop 2.x the file lives under etc\hadoop, not the conf directory mentioned in the error message) and set JAVA_HOME explicitly:

@echo off
@rem Licensed to the Apache Software Foundation (ASF) under one or more
@rem contributor license agreements.  See the NOTICE file distributed with
@rem this work for additional information regarding copyright ownership.
@rem The ASF licenses this file to You under the Apache License, Version 2.0
@rem (the "License"); you may not use this file except in compliance with
@rem the License.  You may obtain a copy of the License at
@rem
@rem     http://www.apache.org/licenses/LICENSE-2.0
@rem
@rem Unless required by applicable law or agreed to in writing, software
@rem distributed under the License is distributed on an "AS IS" BASIS,
@rem WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
@rem See the License for the specific language governing permissions and
@rem limitations under the License.

@rem Set Hadoop-specific environment variables here.

@rem The only required environment variable is JAVA_HOME.  All others are
@rem optional.  When running a distributed configuration it is best to
@rem set JAVA_HOME in this file, so that it is correctly defined on
@rem remote nodes.

@rem The java implementation to use.  Required.
@rem set JAVA_HOME=%JAVA_HOME%

set JAVA_HOME=E:\java\jdk-9.0.1

@rem The jsvc implementation to use. Jsvc is required to run secure datanodes.
@rem set JSVC_HOME=%JSVC_HOME%

@rem set HADOOP_CONF_DIR=

@rem Extra Java CLASSPATH elements.  Automatically insert capacity-scheduler.
if exist %HADOOP_HOME%\contrib\capacity-scheduler (
  if not defined HADOOP_CLASSPATH (
    set HADOOP_CLASSPATH=%HADOOP_HOME%\contrib\capacity-scheduler\*.jar
  ) else (
    set HADOOP_CLASSPATH=%HADOOP_CLASSPATH%;%HADOOP_HOME%\contrib\capacity-scheduler\*.jar
  )
)

@rem The maximum amount of heap to use, in MB. Default is 1000.
@rem set HADOOP_HEAPSIZE=
@rem set HADOOP_NAMENODE_INIT_HEAPSIZE=""

@rem Extra Java runtime options.  Empty by default.
@rem set HADOOP_OPTS=%HADOOP_OPTS% -Djava.net.preferIPv4Stack=true

@rem Command specific options appended to HADOOP_OPTS when specified
if not defined HADOOP_SECURITY_LOGGER (
  set HADOOP_SECURITY_LOGGER=INFO,RFAS
)
if not defined HDFS_AUDIT_LOGGER (
  set HDFS_AUDIT_LOGGER=INFO,NullAppender
)

set HADOOP_NAMENODE_OPTS=-Dhadoop.security.logger=%HADOOP_SECURITY_LOGGER% -Dhdfs.audit.logger=%HDFS_AUDIT_LOGGER% %HADOOP_NAMENODE_OPTS%
set HADOOP_DATANODE_OPTS=-Dhadoop.security.logger=ERROR,RFAS %HADOOP_DATANODE_OPTS%
set HADOOP_SECONDARYNAMENODE_OPTS=-Dhadoop.security.logger=%HADOOP_SECURITY_LOGGER% -Dhdfs.audit.logger=%HDFS_AUDIT_LOGGER% %HADOOP_SECONDARYNAMENODE_OPTS%

@rem The following applies to multiple commands (fs, dfs, fsck, distcp etc)
set HADOOP_CLIENT_OPTS=-Xmx512m %HADOOP_CLIENT_OPTS%
@rem set HADOOP_JAVA_PLATFORM_OPTS="-XX:-UsePerfData %HADOOP_JAVA_PLATFORM_OPTS%"

@rem On secure datanodes, user to run the datanode as after dropping privileges
set HADOOP_SECURE_DN_USER=%HADOOP_SECURE_DN_USER%

@rem Where log files are stored.  %HADOOP_HOME%/logs by default.
@rem set HADOOP_LOG_DIR=%HADOOP_LOG_DIR%\%USERNAME%

@rem Where log files are stored in the secure data environment.
set HADOOP_SECURE_DN_LOG_DIR=%HADOOP_LOG_DIR%\%HADOOP_HDFS_USER%

@rem The directory where pid files are stored. /tmp by default.
@rem NOTE: this should be set to a directory that can only be written to by 
@rem       the user that will run the hadoop daemons.  Otherwise there is the
@rem       potential for a symlink attack.
set HADOOP_PID_DIR=%HADOOP_PID_DIR%
set HADOOP_SECURE_DN_PID_DIR=%HADOOP_PID_DIR%

@rem A string representing this instance of hadoop. %USERNAME% by default.
set HADOOP_IDENT_STRING=%USERNAME%
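
The only line changed from the stock template is the JAVA_HOME assignment, which now points at the locally installed JDK (E:\java\jdk-9.0.1 here). If your JDK instead sits under a directory whose name contains a space, such as C:\Program Files\Java, one workaround is to use the 8.3 short name (this case is covered in more detail at the end of this post). The JDK folder below is an assumed example and must be adjusted to your actual installation:

```batch
@rem Assumed example: JDK installed under "C:\Program Files\Java\jdk1.8.0_201"
@rem PROGRA~1 is the 8.3 short name of "Program Files", so the path contains no space
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_201
```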

With JAVA_HOME set in hadoop-env.cmd, re-running the format command now succeeds:

E:\hadoop\hadoop-2.6.5\etc\hadoop>hdfs namenode -format
19/10/26 15:40:31 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = wishreally/192.168.11.1
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 2.6.5
STARTUP_MSG:   classpath = E:\hadoop\hadoop-2.6.5\etc\hadoop;E:\hadoop\hadoop-2.6.5\share\hadoop\common\lib\activation-1
.1.jar;E:\hadoop\hadoop-2.6.5\share\hadoop\common\lib\apacheds-i18n-2.0.0-M15.jar;E:\hadoop\hadoop-2.6.5\share\hadoop\co
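
Beyond the successful format, the fix can be verified without starting any daemons: hadoop version should now print the release banner instead of the JAVA_HOME error. This assumes the console still has %HADOOP_HOME%\bin on its PATH, as in the session above:

```batch
hadoop version
@rem To actually start HDFS afterwards, the Windows start script lives under sbin:
@rem   %HADOOP_HOME%\sbin\start-dfs.cmd
```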

 

### Setting the `JAVA_HOME` environment variable correctly

When `Error: JAVA_HOME is incorrectly set.` appears, it means Hadoop's configuration files could not resolve the Java installation path. To fix this and let Hadoop run normally, adjust the configuration as follows.

#### Edit `hadoop-env.cmd`

On Windows, whatever Hadoop release is installed (for example hadoop-3.1.0), edit the `hadoop-env.cmd` file inside the `\etc\hadoop\` folder[^1]. Open the file and locate the line that sets the `JAVA_HOME` variable. If it currently points at the wrong JDK or JRE location, correct it to the actual installation path. Assuming the JDK is installed under E:\jdk13, the line becomes:

```batch
set JAVA_HOME=E:\jdk13
```

Alternatively, fall back to the default behaviour and inherit the system-wide `JAVA_HOME` setting:

```batch
set JAVA_HOME=%JAVA_HOME%
```

This avoids the compatibility problems that come from hard-coding a specific Java version or path[^5].

#### Paths that contain spaces

Note that a path containing spaces (such as "Program Files") can cause parsing errors in the batch scripts. In that case, use the 8.3 short name instead of the long name, for example replace `"C:\Program Files"` with `C:\PROGRA~1`[^4].

After saving the file, open a new command prompt and run `hadoop version` to verify that the original error message is gone.

### Notes

Make sure the global `JAVA_HOME` and `PATH` environment variables are also set, so the operating system can locate the Java executables. You can test this with `java -version`; if it prints the Java version information rather than an "unrecognized command" message, that part of the configuration is correct[^2].
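
If you are not sure what the 8.3 short name of a directory is, cmd.exe can print it. This is a small interactive sketch, meant to be typed directly at the prompt (inside a batch file the loop variable would need to be written as %%I):

```batch
for %I in ("C:\Program Files") do @echo %~sI
@rem typically prints C:\PROGRA~1
```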