A Storm startup error: while scanning a simple key in 'reader', line 54, column 2: nimbus.host:"node1"

Platform: CentOS 6.3 (i386)
JDK 7u51
Storm 0.9.1 (apache-storm-0.9.1-incubating)
Python 2.6.6
Hadoop 1.2.1


When I started Storm I ran into this error. Searching Baidu, the fix everyone suggests is to put a space in front of nimbus.host: "master", but my file already had that space and it still failed. Here is my conf/storm.yaml (file line numbers kept, because the error below points at line 23):

 1 # Licensed to the Apache Software Foundation (ASF) under one
 2 # or more contributor license agreements.  See the NOTICE file
 3 # distributed with this work for additional information
 4 # regarding copyright ownership.  The ASF licenses this file
 5 # to you under the Apache License, Version 2.0 (the
 6 # "License"); you may not use this file except in compliance
 7 # with the License.  You may obtain a copy of the License at
 8 #
 9 # http://www.apache.org/licenses/LICENSE-2.0
10 #
11 # Unless required by applicable law or agreed to in writing, software
12 # distributed under the License is distributed on an "AS IS" BASIS,
13 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
14 # See the License for the specific language governing permissions and
15 # limitations under the License.
16
17 ########### These MUST be filled in for a storm configuration
18 # storm.zookeeper.servers:
19      - "master"
20      - "slave1"
21      - "slave2"
22 #
23  nimbus.host: "master"
24  storm.local.dir: "/home/hadoop/apache-storm-0.9.1-incubating/data"
25  supervisor.slots.ports:
26     - 6700
27     - 6701
28     - 6702
29     - 6703
30 #
31 #
32 # ##### These may optionally be filled in:
33 #
34 ## List of custom serializations
35 # topology.kryo.register:
36 #     - org.mycompany.MyType
37 #     - org.mycompany.MyType2: org.mycompany.MyType2Serializer
38 #
39 ## List of custom kryo decorators
40 # topology.kryo.decorators:
41 #     - org.mycompany.MyDecorator
42 #
43 ## Locations of the drpc servers
44 # drpc.servers:
45 #     - "server1"
46 #     - "server2"
47
48 ## Metrics Consumers
49 # topology.metrics.consumer.register:
50 #   - class: "backtype.storm.metrics.LoggingMetricsConsumer"
51 #     parallelism.hint: 1
52 #   - class: "org.mycompany.MyMetricsConsumer"
53 #     parallelism.hint: 1
54 #     argument:
55 #       - endpoint: "metrics-collector.mycompany.org"
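Before starting any daemon, the file can be checked with a standalone YAML parser, which reproduces the same parse error that Storm's own SnakeYAML reader hits. Below is a minimal sketch of such a check (my addition, not part of the original setup): it assumes PyYAML is installed for the node's Python 2.6.6, which is not the case on a stock CentOS 6.3 install, and it defaults to the conf path used elsewhere in this post.

# check_storm_yaml.py -- quick sanity check of storm.yaml (assumes PyYAML).
import sys
import yaml

# Default to the conf directory used in this post; pass another path as argv[1].
path = sys.argv[1] if len(sys.argv) > 1 else \
    "/home/hadoop/apache-storm-0.9.1-incubating/conf/storm.yaml"

try:
    with open(path) as f:
        conf = yaml.safe_load(f)
except yaml.YAMLError as e:
    # PyYAML reports the offending line and column much like SnakeYAML does,
    # so the message points at the same spot as the storm nimbus stack trace.
    print("storm.yaml failed to parse:")
    print(e)
    sys.exit(1)

print("storm.yaml parsed OK, top-level keys: %s" % sorted((conf or {}).keys()))

If the file is healthy this prints the key names; otherwise the line and column in the message are the place to look.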

Starting nimbus with storm nimbus & produced the following output:


Exception in thread "main" expected '<document start>', but found BlockMappingStart
 in 'reader', line 23, column 2:
     nimbus.host: "master"
     ^

        at org.yaml.snakeyaml.parser.ParserImpl$ParseDocumentStart.produce(ParserImpl.java:225)
        at org.yaml.snakeyaml.parser.ParserImpl.peekEvent(ParserImpl.java:158)
        at org.yaml.snakeyaml.parser.ParserImpl.checkEvent(ParserImpl.java:143)
        at org.yaml.snakeyaml.composer.Composer.getSingleNode(Composer.java:108)
        at org.yaml.snakeyaml.constructor.BaseConstructor.getSingleData(BaseConstructor.java:120)
        at org.yaml.snakeyaml.Yaml.loadFromReader(Yaml.java:481)
        at org.yaml.snakeyaml.Yaml.load(Yaml.java:424)
        at backtype.storm.utils.Utils.findAndReadConfigFile(Utils.java:138)
        at backtype.storm.utils.Utils.readStormConfig(Utils.java:178)
        at backtype.storm.config$read_storm_config.invoke(config.clj:116)
        at backtype.storm.command.config_value$_main.invoke(config_value.clj:22)
        at clojure.lang.AFn.applyToHelper(AFn.java:161)
        at clojure.lang.AFn.applyTo(AFn.java:151)
        at backtype.storm.command.config_value.main(Unknown Source)
Exception in thread "main" expected '<document start>', but found BlockMappingStart
 in 'reader', line 23, column 2:
     nimbus.host: "master"
     ^

        at org.yaml.snakeyaml.parser.ParserImpl$ParseDocumentStart.produce(ParserImpl.java:225)
        at org.yaml.snakeyaml.parser.ParserImpl.peekEvent(ParserImpl.java:158)
        at org.yaml.snakeyaml.parser.ParserImpl.checkEvent(ParserImpl.java:143)
        at org.yaml.snakeyaml.composer.Composer.getSingleNode(Composer.java:108)
        at org.yaml.snakeyaml.constructor.BaseConstructor.getSingleData(BaseConstructor.java:120)
        at org.yaml.snakeyaml.Yaml.loadFromReader(Yaml.java:481)
        at org.yaml.snakeyaml.Yaml.load(Yaml.java:424)
        at backtype.storm.utils.Utils.findAndReadConfigFile(Utils.java:138)
        at backtype.storm.utils.Utils.readStormConfig(Utils.java:178)
        at backtype.storm.config$read_storm_config.invoke(config.clj:116)
        at backtype.storm.command.config_value$_main.invoke(config_value.clj:22)
        at clojure.lang.AFn.applyToHelper(AFn.java:161)
        at clojure.lang.AFn.applyTo(AFn.java:151)
        at backtype.storm.command.config_value.main(Unknown Source)
Running: java -server -Dstorm.options-Dstorm.home=/home/hadoop/apache-storm-0.9.1-incubating -Djava.library.path-Dstorm.conf.file= -cp /home/hadoop/apache-storm-0.9.1-incubating/lib/ring-core-1.1.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/tools.cli-0.2.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/compojure-1.1.3.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jgrapht-core-0.9.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/json-simple-1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-io-1.4.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/objenesis-1.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/servlet-api-2.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/ring-servlet-0.3.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/httpclient-4.1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/clout-1.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/disruptor-2.10.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-fileupload-1.2.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/ring-jetty-adapter-0.3.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/curator-client-1.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/netty-3.6.3.Final.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/meat-locker-0.3.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/clj-time-0.4.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-codec-1.4.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/slf4j-api-1.6.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/clojure-1.4.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/joda-time-2.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-lang-2.5.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/kryo-2.17.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-logging-1.1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/carbonite-1.3.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/tools.macro-0.1.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/clj-stacktrace-0.2.4.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/log4j-over-slf4j-1.6.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/commons-exec-1.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/storm-core-0.9.1-incubating.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jetty-util-6.1.26.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/guava-13.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/core.incubator-0.1.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/snakeyaml-1.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/ring-devel-0.3.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/minlog-1.2.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/logback-core-1.0.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/servlet-api-2.5-20081211.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/reflectasm-1.07-shaded.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/logback-classic-1.0.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/math.numeric-tower-0.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/hiccup-0.3.6.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/httpcore-4.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/asm-4.0.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/zookeeper-3.3.3.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/curator-framework-1.0.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/junit-3.8.1.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jline-2.11.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/tools.logging-0.2.3.jar:/home/hadoop/apache-storm-0.9.1-incubating/lib/jetty-6.1.26.jar:/home/hadoop/apache-storm-0.9.1-incubating/conf -Dlogfile.name=nimbus.log -Dlogback.configurationFile=/home/hadoop/apache-storm-0.9.1-incubating/logback/cluster.xml backtype.storm.daemon.nimbus

The error points straight at line 23, the nimbus.host entry. (Note that in the listing above storm.zookeeper.servers: on line 18 is still commented out while its three server entries on lines 19-21 are not, so the parser reads those entries as a document of their own and then treats nimbus.host as the start of a second document, which is exactly the "expected '<document start>'" complaint.) After some experimenting, the fix was to give each of these configuration keys a leading space, i.e.:

(each of the following lines starts with a single space)

 nimbus.host: "192.168.1.101"
 storm.zookeeper.port: 2181
 storm.local.dir: "home/hadoop/storm-0.9.1/data"
 supervisor.slots.ports:
So take care when you configure storm.yaml: YAML is whitespace-sensitive. All top-level keys must start in the same column, and every colon needs a space after it (writing nimbus.host:"node1" with no space after the colon is what produces the "while scanning a simple key" error quoted in the title). It is hard to believe that a single missing space is enough to keep Storm from starting.
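To make the indentation rule concrete, here is a small sketch (my addition, not from the original post) that feeds two tiny snippets to PyYAML: in the first, both top-level keys start in the same column; in the second, one key has the leading space and the other does not. The second snippet fails with the same kind of "expected '<document start>'" error that SnakeYAML printed above.

# yaml_indent_demo.py -- why the leading spaces have to be consistent:
# once the first top-level key fixes the column of the mapping, every
# sibling key must start in that same column.
import yaml

# Both keys start in column 2 (one leading space each) -- parses fine.
consistent = (
    ' nimbus.host: "master"\n'
    ' storm.zookeeper.port: 2181\n'
)

# First key in column 2, second in column 1 -- the first mapping is closed
# and the second key looks like a new document with no '---' marker.
mixed = (
    ' nimbus.host: "master"\n'
    'storm.zookeeper.port: 2181\n'
)

print(yaml.safe_load(consistent))
# prints a dict with both keys: nimbus.host and storm.zookeeper.port

try:
    yaml.safe_load(mixed)
except yaml.YAMLError as e:
    print(e)
# prints: expected '<document start>', but found '<block mapping start>'
#         (line 2, column 1)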
