Exception in thread "main" expected '<document start>', but found BlockMappingStart in 'reader', lin...

Platform: centos-6.3-i386

    jdk-7u51
    storm 0.9.1
    python 2.6.6
    hadoop 1.2.1

When starting Storm I ran into this error. After searching around, every answer I found says to add a space before nimbus.host: "master" — but my file already has that space, and it still fails.

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

########### These MUST be filled in for a storm configuration
# storm.zookeeper.servers:
     - "master"
     - "slave1"
     - "slave2"
#
 nimbus.host: "master"
 storm.local.dir: "/home/hadoop/apache-storm-0.9.1-incubating/data"
 supervisor.slots.ports:
    - 6700
    - 6701
    - 6702
    - 6703
#
#
# ##### These may optionally be filled in:
#
## List of custom serializations
# topology.kryo.register:
#     - org.mycompany.MyType
#     - org.mycompany.MyType2: org.mycompany.MyType2Serializer
#
## List of custom kryo decorators
# topology.kryo.decorators:
#     - org.mycompany.MyDecorator
#
## Locations of the drpc servers
# drpc.servers:
#     - "server1"
#     - "server2"

## Metrics Consumers
# topology.metrics.consumer.register:
#   - class: "backtype.storm.metrics.LoggingMetricsConsumer"
#     parallelism.hint: 1
#   - class: "org.mycompany.MyMetricsConsumer"
#     parallelism.hint: 1
#     argument:
#       - endpoint: "metrics-collector.mycompany.org"
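One more thing worth noticing in the file above: the storm.zookeeper.servers: key itself is still commented out while its three list items are not. SnakeYAML then parses the orphaned "- ..." lines as a top-level sequence; when the mapping that begins at nimbus.host appears afterwards, the parser has already closed the first document and raises exactly the "expected '<document start>', but found BlockMappingStart" error. A sketch of how that section should look (uncommented key, every top-level key starting in the same column with one leading space):

```yaml
########### These MUST be filled in for a storm configuration
 storm.zookeeper.servers:
     - "master"
     - "slave1"
     - "slave2"

 nimbus.host: "master"
 storm.local.dir: "/home/hadoop/apache-storm-0.9.1-incubating/data"
 supervisor.slots.ports:
    - 6700
    - 6701
    - 6702
    - 6703
```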

Starting storm nimbus & then produces:

Exception in thread "main" expected '<document start>', but found BlockMappingStart
 in 'reader', line 23, column 2:
     nimbus.host: "master"
     ^

        at org.yaml.snakeyaml.parser.ParserImpl$ParseDocumentStart.produce(ParserImpl.java:225)
        at org.yaml.snakeyaml.parser.ParserImpl.peekEvent(ParserImpl.java:158)
        at org.yaml.snakeyaml.parser.ParserImpl.checkEvent(ParserImpl.java:143)
        at org.yaml.snakeyaml.composer.Composer.getSingleNode(Composer.java:108)
        at org.yaml.snakeyaml.constructor.BaseConstructor.getSingleData(BaseConstructor.java:120)
        at org.yaml.snakeyaml.Yaml.loadFromReader(Yaml.java:481)
        at org.yaml.snakeyaml.Yaml.load(Yaml.java:424)
        at backtype.storm.utils.Utils.findAndReadConfigFile(Utils.java:138)
        at backtype.storm.utils.Utils.readStormConfig(Utils.java:178)
        at backtype.storm.config$read_storm_config.invoke(config.clj:116)
        at backtype.storm.command.config_value$_main.invoke(config_value.clj:22)
        at clojure.lang.AFn.applyToHelper(AFn.java:161)
        at clojure.lang.AFn.applyTo(AFn.java:151)
        at backtype.storm.command.config_value.main(Unknown Source)
Running: java -server -Dstorm.options= -Dstorm.home=/home/hadoop/apache-storm-0.9.1-incubating -Djava.library.path= -Dstorm.conf.file= -cp /home/hadoop/apache-storm-0.9.1-incubating/lib/ring-core-1.1.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/tools.cli-0.2.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/compojure-1.1.3.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jgrapht-core-0.9.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/json-simple-1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-io-1.4.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/objenesis-1.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/servlet-api-2.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/ring-servlet-0.3.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/httpclient-4.1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/clout-1.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/disruptor-2.10.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-fileupload-1.2.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/ring-jetty-adapter-0.3.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/curator-client-1.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/netty-3.6.3.Final.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/meat-locker-0.3.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/clj-time-0.4.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-codec-1.4.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/slf4j-api-1.6.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/clojure-1.4.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/joda-time-2.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-lang-2.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/kryo-2.17.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-logging-1.1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/carbonite-1.3.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/tools.macro-0.1.0.jar:/home/hadoop/apache-st
orm-0.9.1-incubating/lib/clj-stacktrace-0.2.4.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/log4j-over-slf4j-1.6.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-exec-1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/storm-core-0.9.1-incubating.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jetty-util-6.1.26.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/guava-13.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/core.incubator-0.1.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/snakeyaml-1.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/ring-devel-0.3.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/minlog-1.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/logback-core-1.0.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/servlet-api-2.5-20081211.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/reflectasm-1.07-shaded.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/logback-classic-1.0.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/math.numeric-tower-0.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/hiccup-0.3.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/httpcore-4.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/asm-4.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/zookeeper-3.3.3.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/curator-framework-1.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/junit-3.8.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jline-2.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/tools.logging-0.2.3.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jetty-6.1.26.jar:/home/hadoop/apache-storm-0.9.1-incubating/conf -Dlogfile.name=nimbus.log -Dlogback.configurationFile=/home/hadoop/apache-storm-0.9.1-incubating/logback/cluster.xml backtype.storm.daemon.nimbus

The error marker points at the "n" of nimbus. After experimenting, it turns out these configuration keys need a leading space, i.e.:

 nimbus.host: "192.168.1.101"
 storm.zookeeper.port: 2181
 storm.local.dir: "home/hadoop/storm-0.9.1/data"
 supervisor.slots.ports:

(each of these lines begins with a single leading space)
 
So take care when editing storm.yaml. Hard to believe, but a single missing space is enough to keep Storm from starting.
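The failure is reproducible outside Storm. The sketch below uses PyYAML, assuming it is installed; SnakeYAML, which Storm actually uses, reports the same parse error. It shows how commenting out a key while leaving its list items in place (as in the file above) turns the orphaned list into a top-level sequence, so the mapping that follows looks like a second YAML document. This is an illustrative repro, not Storm's own code.

```python
# Reproduce the "expected '<document start>'" error with PyYAML.
import yaml

# Broken: the key is commented out, so the indented list items become a
# top-level sequence; the mapping starting at " nimbus.host" then looks
# like a second document, which the parser rejects.
broken = (
    "# storm.zookeeper.servers:\n"
    '     - "master"\n'
    '     - "slave1"\n'
    ' nimbus.host: "master"\n'
)

# Fixed: uncomment the key (keeping the leading space) so the list items
# become its value and every top-level key starts in the same column.
fixed = (
    " storm.zookeeper.servers:\n"
    '     - "master"\n'
    '     - "slave1"\n'
    ' nimbus.host: "master"\n'
)

try:
    yaml.safe_load(broken)
except yaml.YAMLError as e:
    print("broken config:", str(e).splitlines()[0])

print("fixed config:", yaml.safe_load(fixed))
```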

Reposted from: https://www.cnblogs.com/zhaoyan001/p/7910192.html
