How to resolve an error reported when deploying Hadoop

When starting HDFS for the first time, the file system must be formatted:

[root@hadoop111 hadoop]# hdfs namenode -format
2024-03-17 12:16:36,483 INFO namenode.NameNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = hadoop111/192.168.80.111
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 3.2.4
STARTUP_MSG:   classpath = /opt/module/hadoop//etc/hadoop:/opt/module/hadoop//share/hadoop/common/lib/kerb-client-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/hadoop-annotations-3.2.4.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-configuration2-2.1.1.jar:/opt/module/hadoop//share/hadoop/common/lib/kerb-common-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/kerb-crypto-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/hadoop-auth-3.2.4.jar:/opt/module/hadoop//share/hadoop/common/lib/jersey-core-1.19.jar:/opt/module/hadoop//share/hadoop/common/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/module/hadoop//share/hadoop/common/lib/jetty-xml-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/common/lib/jcip-annotations-1.0-1.jar:/opt/module/hadoop//share/hadoop/common/lib/woodstox-core-5.3.0.jar:/opt/module/hadoop//share/hadoop/common/lib/gson-2.9.0.jar:/opt/module/hadoop//share/hadoop/common/lib/htrace-core4-4.1.0-incubating.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-collections-3.2.2.jar:/opt/module/hadoop//share/hadoop/common/lib/netty-3.10.6.Final.jar:/opt/module/hadoop//share/hadoop/common/lib/zookeeper-3.4.14.jar:/opt/module/hadoop//share/hadoop/common/lib/jersey-server-1.19.jar:/opt/module/hadoop//share/hadoop/common/lib/slf4j-reload4j-1.7.35.jar:/opt/module/hadoop//share/hadoop/common/lib/paranamer-2.3.jar:/opt/module/hadoop//share/hadoop/common/lib/javax.servlet-api-3.1.0.jar:/opt/module/hadoop//share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/opt/module/hadoop//share/hadoop/common/lib/javax.activation-api-1.2.0.jar:/opt/module/hadoop//share/hadoop/common/lib/guava-27.0-jre.jar:/opt/module/hadoop//share/hadoop/common/lib/asm-5.0.4.jar:/opt/module/hadoop//share/hadoop/common/lib/j2objc-annotations-1.1.jar:/opt/module/hadoop//share/hadoop/common/lib/curator-framework-2.13.0.jar:/opt/module/hadoop//share/hadoop/common/lib/accessors-smart-2.4.7.jar:/opt/module/hadoop//share/hadoop/common/lib/jetty-server-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/common/lib/re2j-1.1.jar:/opt/module/hadoop//share/hadoop/common/lib/kerby-config-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/jetty-servlet-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/common/lib/jackson-databind-2.10.5.1.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-beanutils-1.9.4.jar:/opt/module/hadoop//share/hadoop/common/lib/failureaccess-1.0.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-cli-1.2.jar:/opt/module/hadoop//share/hadoop/common/lib/jersey-json-1.19.jar:/opt/module/hadoop//share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/opt/module/hadoop//share/hadoop/common/lib/dnsjava-2.1.7.jar:/opt/module/hadoop//share/hadoop/common/lib/jackson-xc-1.9.13.jar:/opt/module/hadoop//share/hadoop/common/lib/jsp-api-2.1.jar:/opt/module/hadoop//share/hadoop/common/lib/jaxb-api-2.2.11.jar:/opt/module/hadoop//share/hadoop/common/lib/reload4j-1.2.18.3.jar:/opt/module/hadoop//share/hadoop/common/lib/kerby-util-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/kerb-core-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/httpclient-4.5.13.jar:/opt/module/hadoop//share/hadoop/common/lib/nimbus-jose-jwt-9.8.1.jar:/opt/module/hadoop//share/hadoop/common/lib/curator-client-2.13.0.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-text-1.4.jar:/opt/module/hadoop//share/hadoop/common/lib/jackson-annotations-2.10.5.jar:/opt/module/hadoop//share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/opt/module/hadoop//share/hadoop/common/lib/je
tty-util-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/common/lib/jersey-servlet-1.19.jar:/opt/module/hadoop//share/hadoop/common/lib/jetty-http-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/common/lib/kerb-identity-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/spotbugs-annotations-3.1.9.jar:/opt/module/hadoop//share/hadoop/common/lib/checker-qual-2.5.2.jar:/opt/module/hadoop//share/hadoop/common/lib/jetty-webapp-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/common/lib/metrics-core-3.2.4.jar:/opt/module/hadoop//share/hadoop/common/lib/token-provider-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/snappy-java-1.0.5.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-net-3.6.jar:/opt/module/hadoop//share/hadoop/common/lib/kerb-server-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/kerby-pkix-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-compress-1.21.jar:/opt/module/hadoop//share/hadoop/common/lib/httpcore-4.4.13.jar:/opt/module/hadoop//share/hadoop/common/lib/jul-to-slf4j-1.7.35.jar:/opt/module/hadoop//share/hadoop/common/lib/jsr305-3.0.2.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-math3-3.1.1.jar:/opt/module/hadoop//share/hadoop/common/lib/kerb-admin-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/curator-recipes-2.13.0.jar:/opt/module/hadoop//share/hadoop/common/lib/jetty-util-ajax-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-codec-1.11.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-io-2.8.0.jar:/opt/module/hadoop//share/hadoop/common/lib/avro-1.7.7.jar:/opt/module/hadoop//share/hadoop/common/lib/kerby-asn1-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-lang3-3.7.jar:/opt/module/hadoop//share/hadoop/common/lib/json-smart-2.4.7.jar:/opt/module/hadoop//share/hadoop/common/lib/kerb-util-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/jsr311-api-1.1.1.jar:/opt/module/hadoop//share/hadoop/common/lib/kerb-simplekdc-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/opt/module/hadoop//share/hadoop/common/lib/jackson-core-2.10.5.jar:/opt/module/hadoop//share/hadoop/common/lib/kerby-xdr-1.0.1.jar:/opt/module/hadoop//share/hadoop/common/lib/jsch-0.1.55.jar:/opt/module/hadoop//share/hadoop/common/lib/jettison-1.1.jar:/opt/module/hadoop//share/hadoop/common/lib/protobuf-java-2.5.0.jar:/opt/module/hadoop//share/hadoop/common/lib/commons-logging-1.1.3.jar:/opt/module/hadoop//share/hadoop/common/lib/animal-sniffer-annotations-1.17.jar:/opt/module/hadoop//share/hadoop/common/lib/error_prone_annotations-2.2.0.jar:/opt/module/hadoop//share/hadoop/common/lib/slf4j-api-1.7.35.jar:/opt/module/hadoop//share/hadoop/common/lib/audience-annotations-0.5.0.jar:/opt/module/hadoop//share/hadoop/common/lib/jetty-security-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/common/lib/jetty-io-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/common/lib/stax2-api-4.2.1.jar:/opt/module/hadoop//share/hadoop/common/hadoop-common-3.2.4-tests.jar:/opt/module/hadoop//share/hadoop/common/hadoop-common-3.2.4.jar:/opt/module/hadoop//share/hadoop/common/hadoop-kms-3.2.4.jar:/opt/module/hadoop//share/hadoop/common/hadoop-nfs-3.2.4.jar:/opt/module/hadoop//share/hadoop/hdfs:/opt/module/hadoop//share/hadoop/hdfs/lib/kerb-client-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/hadoop-annotations-3.2.4.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-configuration2-2.1.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerb-common-1.0.1.jar:/opt/module/hadoop/
/share/hadoop/hdfs/lib/kerb-crypto-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/hadoop-auth-3.2.4.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jersey-core-1.19.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jetty-xml-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jcip-annotations-1.0-1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/woodstox-core-5.3.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/gson-2.9.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/htrace-core4-4.1.0-incubating.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-collections-3.2.2.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/netty-3.10.6.Final.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/zookeeper-3.4.14.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jersey-server-1.19.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/paranamer-2.3.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/javax.servlet-api-3.1.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/json-simple-1.1.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/javax.activation-api-1.2.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/guava-27.0-jre.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/asm-5.0.4.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/j2objc-annotations-1.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/curator-framework-2.13.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/accessors-smart-2.4.7.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jetty-server-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/re2j-1.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerby-config-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jetty-servlet-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jackson-databind-2.10.5.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-beanutils-1.9.4.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/failureaccess-1.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-cli-1.2.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jersey-json-1.19.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/dnsjava-2.1.7.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jackson-xc-1.9.13.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jaxb-api-2.2.11.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/reload4j-1.2.18.3.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerby-util-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerb-core-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/httpclient-4.5.13.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/nimbus-jose-jwt-9.8.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/curator-client-2.13.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-text-1.4.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/netty-all-4.1.68.Final.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jackson-annotations-2.10.5.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jaxb-impl-2.2.3-1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jetty-util-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jersey-servlet-1.19.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jetty-http-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerb-identity-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/spotbugs-annotations-3.1.9.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/checker-qual-2.5.2.jar:/opt/mo
dule/hadoop//share/hadoop/hdfs/lib/jetty-webapp-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/token-provider-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/snappy-java-1.0.5.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-net-3.6.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerb-server-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerby-pkix-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-compress-1.21.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/httpcore-4.4.13.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jsr305-3.0.2.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-math3-3.1.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerb-admin-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/curator-recipes-2.13.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jetty-util-ajax-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-codec-1.11.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-io-2.8.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/avro-1.7.7.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerby-asn1-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-lang3-3.7.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/json-smart-2.4.7.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerb-util-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jsr311-api-1.1.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerb-simplekdc-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jackson-jaxrs-1.9.13.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/okio-1.6.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jackson-core-2.10.5.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/kerby-xdr-1.0.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jsch-0.1.55.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jettison-1.1.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/animal-sniffer-annotations-1.17.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/error_prone_annotations-2.2.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/audience-annotations-0.5.0.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jetty-security-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/okhttp-2.7.5.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/jetty-io-9.4.43.v20210629.jar:/opt/module/hadoop//share/hadoop/hdfs/lib/stax2-api-4.2.1.jar:/opt/module/hadoop//share/hadoop/hdfs/hadoop-hdfs-3.2.4.jar:/opt/module/hadoop//share/hadoop/hdfs/hadoop-hdfs-3.2.4-tests.jar:/opt/module/hadoop//share/hadoop/hdfs/hadoop-hdfs-client-3.2.4-tests.jar:/opt/module/hadoop//share/hadoop/hdfs/hadoop-hdfs-client-3.2.4.jar:/opt/module/hadoop//share/hadoop/hdfs/hadoop-hdfs-rbf-3.2.4-tests.jar:/opt/module/hadoop//share/hadoop/hdfs/hadoop-hdfs-native-client-3.2.4-tests.jar:/opt/module/hadoop//share/hadoop/hdfs/hadoop-hdfs-native-client-3.2.4.jar:/opt/module/hadoop//share/hadoop/hdfs/hadoop-hdfs-httpfs-3.2.4.jar:/opt/module/hadoop//share/hadoop/hdfs/hadoop-hdfs-rbf-3.2.4.jar:/opt/module/hadoop//share/hadoop/hdfs/hadoop-hdfs-nfs-3.2.4.jar:/opt/module/hadoop//share/hadoop/mapreduce/lib/junit-4.13.2.jar:/opt/module/hadoop//share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/opt/module/hadoop//share/hadoop/mapreduce/hadoop-mapreduce-client-core-3.2.4.jar:/opt/module/hadoop//share/hadoop/mapreduce/hadoop-mapreduce-client-app-3.2.4.jar:/opt/module/hadoop//share/hadoop/mapreduce/hadoop-mapreduce-examples-3.2.4.jar:/opt
/module/hadoop//share/hadoop/mapreduce/hadoop-mapreduce-client-uploader-3.2.4.jar:/opt/module/hadoop//share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-3.2.4.jar:/opt/module/hadoop//share/hadoop/mapreduce/hadoop-mapreduce-client-common-3.2.4.jar:/opt/module/hadoop//share/hadoop/mapreduce/hadoop-mapreduce-client-hs-3.2.4.jar:/opt/module/hadoop//share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-3.2.4.jar:/opt/module/hadoop//share/hadoop/mapreduce/hadoop-mapreduce-client-nativetask-3.2.4.jar:/opt/module/hadoop//share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-3.2.4-tests.jar:/opt/module/hadoop//share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn:/opt/module/hadoop//share/hadoop/yarn/lib/jersey-guice-1.19.jar:/opt/module/hadoop//share/hadoop/yarn/lib/swagger-annotations-1.5.4.jar:/opt/module/hadoop//share/hadoop/yarn/lib/json-io-2.5.1.jar:/opt/module/hadoop//share/hadoop/yarn/lib/HikariCP-java7-2.4.12.jar:/opt/module/hadoop//share/hadoop/yarn/lib/ehcache-3.3.1.jar:/opt/module/hadoop//share/hadoop/yarn/lib/bcpkix-jdk15on-1.60.jar:/opt/module/hadoop//share/hadoop/yarn/lib/jackson-jaxrs-json-provider-2.10.5.jar:/opt/module/hadoop//share/hadoop/yarn/lib/javax.inject-1.jar:/opt/module/hadoop//share/hadoop/yarn/lib/aopalliance-1.0.jar:/opt/module/hadoop//share/hadoop/yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/opt/module/hadoop//share/hadoop/yarn/lib/metrics-core-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/lib/objenesis-1.0.jar:/opt/module/hadoop//share/hadoop/yarn/lib/snakeyaml-1.26.jar:/opt/module/hadoop//share/hadoop/yarn/lib/bcprov-jdk15on-1.60.jar:/opt/module/hadoop//share/hadoop/yarn/lib/jackson-module-jaxb-annotations-2.10.5.jar:/opt/module/hadoop//share/hadoop/yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/opt/module/hadoop//share/hadoop/yarn/lib/jackson-jaxrs-base-2.10.5.jar:/opt/module/hadoop//share/hadoop/yarn/lib/fst-2.50.jar:/opt/module/hadoop//share/hadoop/yarn/lib/mssql-jdbc-6.2.1.jre7.jar:/opt/module/hadoop//share/hadoop/yarn/lib/guice-4.0.jar:/opt/module/hadoop//share/hadoop/yarn/lib/java-util-1.9.0.jar:/opt/module/hadoop//share/hadoop/yarn/lib/jersey-client-1.19.jar:/opt/module/hadoop//share/hadoop/yarn/lib/guice-servlet-4.0.jar:/opt/module/hadoop//share/hadoop/yarn/lib/jakarta.activation-api-1.2.1.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-server-tests-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-services-core-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-services-api-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-api-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-server-nodemanager-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-server-router-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-client-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-submarine-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-server-timeline-pluginstorage-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-common-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-server-resourcemanager-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-registry-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-server-common-3.2.4
.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-server-web-proxy-3.2.4.jar:/opt/module/hadoop//share/hadoop/yarn/hadoop-yarn-applications-distributedshell-3.2.4.jar
STARTUP_MSG:   build = Unknown -r 7e5d9983b388e372fe640f21f048f2f2ae6e9eba; compiled by 'ubuntu' on 2022-07-12T11:58Z
STARTUP_MSG:   java = 1.8.0_131
************************************************************/
2024-03-17 12:16:36,508 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
2024-03-17 12:16:36,611 ERROR conf.Configuration: error parsing conf core-site.xml
com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character 'S' (code 83) in prolog; expected '<'
 at [row,col,system-id]: [1,2,"file:/opt/module/hadoop-3.2.4/etc/hadoop/core-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.throwUnexpectedChar(StreamScanner.java:666)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2130)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1179)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3336)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3130)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3023)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2984)
    at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2862)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2844)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1812)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1789)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
    at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:304)
    at org.apache.hadoop.util.StringUtils.startupShutdownMessage(StringUtils.java:749)
    at org.apache.hadoop.util.StringUtils.startupShutdownMessage(StringUtils.java:733)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1789)
2024-03-17 12:16:36,615 ERROR namenode.NameNode: Failed to start namenode.
java.lang.RuntimeException: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character 'S' (code 83) in prolog; expected '<'
 at [row,col,system-id]: [1,2,"file:/opt/module/hadoop-3.2.4/etc/hadoop/core-site.xml"]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3040)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2984)
    at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2862)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2844)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1812)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1789)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
    at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:304)
    at org.apache.hadoop.util.StringUtils.startupShutdownMessage(StringUtils.java:749)
    at org.apache.hadoop.util.StringUtils.startupShutdownMessage(StringUtils.java:733)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1789)
Caused by: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character 'S' (code 83) in prolog; expected '<'
 at [row,col,system-id]: [1,2,"file:/opt/module/hadoop-3.2.4/etc/hadoop/core-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.throwUnexpectedChar(StreamScanner.java:666)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2130)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1179)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3336)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3130)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3023)
    ... 12 more
2024-03-17 12:16:36,617 INFO util.ExitUtil: Exiting with status 1: java.lang.RuntimeException: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character 'S' (code 83) in prolog; expected '<'
 at [row,col,system-id]: [1,2,"file:/opt/module/hadoop-3.2.4/etc/hadoop/core-site.xml"]
2024-03-17 12:16:36,655 ERROR conf.Configuration: error parsing conf core-site.xml
com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character 'S' (code 83) in prolog; expected '<'
 at [row,col,system-id]: [1,2,"file:/opt/module/hadoop-3.2.4/etc/hadoop/core-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.throwUnexpectedChar(StreamScanner.java:666)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2130)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1179)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3336)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3130)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3023)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2984)
    at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2862)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2844)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1812)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1789)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145)
    at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102)
Exception in thread "Thread-1" java.lang.RuntimeException: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character 'S' (code 83) in prolog; expected '<'
 at [row,col,system-id]: [1,2,"file:/opt/module/hadoop-3.2.4/etc/hadoop/core-site.xml"]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3040)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2984)
    at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2862)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2844)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1812)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1789)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145)
    at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102)
Caused by: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character 'S' (code 83) in prolog; expected '<'
 at [row,col,system-id]: [1,2,"file:/opt/module/hadoop-3.2.4/etc/hadoop/core-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.throwUnexpectedChar(StreamScanner.java:666)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2130)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1179)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3336)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3130)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3023)
    ... 10 more
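
The root cause is spelled out in the stack trace: the XML parser hits the character 'S' at row 1, column 2 of /opt/module/hadoop-3.2.4/etc/hadoop/core-site.xml, which means the file does not begin with '<'. In other words, core-site.xml is not well-formed XML; typically some plain text (a stray log line, shell output, or an incomplete paste) was saved into the file. The fix is to restore the file to valid XML and then format again.

First inspect the offending first lines, then rewrite the file. The configuration below is only a minimal sketch: the fs.defaultFS host and port (hadoop111:8020) and the hadoop.tmp.dir path are assumed values for illustration, so substitute whatever your cluster actually uses.

[root@hadoop111 hadoop]# head -n 3 /opt/module/hadoop-3.2.4/etc/hadoop/core-site.xml
[root@hadoop111 hadoop]# vi /opt/module/hadoop-3.2.4/etc/hadoop/core-site.xml

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <!-- NameNode RPC address; host and port are assumptions, use your own -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop111:8020</value>
    </property>
    <!-- Base directory for HDFS data; this path is an assumption -->
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/module/hadoop-3.2.4/data</value>
    </property>
</configuration>

It is worth running the same check on the other files under etc/hadoop (hdfs-site.xml, yarn-site.xml, mapred-site.xml), since the Configuration loader parses them the same way. Once the XML is valid, re-run the format, look for a line such as "Storage directory ... has been successfully formatted." in the output, and then start HDFS:

[root@hadoop111 hadoop]# hdfs namenode -format
[root@hadoop111 hadoop]# start-dfs.sh

Note that formatting wipes the NameNode metadata, so it is only safe on a fresh cluster; that is the case here, since this is the first startup.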
 

Hadoop is an open-source distributed computing framework that processes large data sets with high performance, scalability, and reliability. Its core components are the Hadoop Distributed File System (HDFS) and the MapReduce computing framework. HDFS stores large data sets across many machines and provides high fault tolerance and availability; MapReduce splits a large data set into blocks, processes them in parallel on the cluster's compute nodes, and merges the results.

Beyond the core components, Hadoop ships with further tools such as YARN (resource management), HBase (distributed database), Hive (data warehouse and query language), and Pig (data analysis), which can be used together with Hadoop to provide richer functionality and more flexible ways of processing data.

The main strengths of Hadoop are:

1. Scalability: larger data sets can be handled simply by adding compute nodes, giving high-performance processing capacity.
2. Fault tolerance: data is replicated across nodes, so it remains recoverable and accessible even if a node fails.
3. Cost effectiveness: clusters are built from inexpensive commodity hardware, which is cheaper than traditional large servers.
4. Speed: the distributed computing model lets Hadoop process large data sets in a short time.
5. Flexibility: the range of tools and components lets developers choose the approach that best fits their data-processing needs.

In short, Hadoop is a powerful data-processing platform that helps organizations process and analyze large-scale data, supporting more accurate and comprehensive analysis and decision making.
