Running the WordCount example on pseudo-distributed Hadoop (Fedora 14)

[hadoop@master hadoop]$ bin/hadoop namenode format
12/03/09 23:41:08 INFO namenode.NameNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = master/192.168.1.3
STARTUP_MSG:   args = [format]
STARTUP_MSG:   version = 0.20.2
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
Usage: java NameNode [-format] | [-upgrade] | [-rollback] | [-finalize] | [-importCheckpoint]
12/03/09 23:41:08 INFO namenode.NameNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at master/192.168.1.3
************************************************************/
(Note: the format switch needs a leading dash, i.e. "bin/hadoop namenode -format". Without the dash the NameNode only prints the usage message above and shuts down; nothing is actually formatted.)

[hadoop@master hadoop]$ bin/start-all.sh
starting namenode, logging to /home/hadoop/hadoop/logs/hadoop-hadoop-namenode-zc.out
192.168.1.3: starting datanode, logging to /home/hadoop/hadoop/logs/hadoop-hadoop-datanode-master.out
192.168.1.3: starting secondarynamenode, logging to /home/hadoop/hadoop/logs/hadoop-hadoop-secondarynamenode-master.out
starting jobtracker, logging to /home/hadoop/hadoop/logs/hadoop-hadoop-jobtracker-zc.out
192.168.1.3: starting tasktracker, logging to /home/hadoop/hadoop/logs/hadoop-hadoop-tasktracker-master.out
[hadoop@master hadoop]$ jps
14155 JobTracker
14310 Jps
13954 DataNode
13847 NameNode
14266 TaskTracker
14082 SecondaryNameNode
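The `jps` listing above is the quick way to confirm the pseudo-distributed cluster is healthy: all five daemons (NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker) should appear. A small sketch of that check, run here against a saved copy of the output (the PIDs are the illustrative ones from above; in a live session you would pipe `jps` directly):

```shell
# Sketch: check that all five Hadoop daemons show up in jps output.
# A saved copy stands in for a live `jps` run (hypothetical PIDs).
cat > /tmp/jps.out <<'EOF'
14155 JobTracker
13954 DataNode
13847 NameNode
14266 TaskTracker
14082 SecondaryNameNode
EOF
for d in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
  # anchor on " <name>$" so NameNode does not match SecondaryNameNode
  grep -q " $d\$" /tmp/jps.out && echo "$d OK" || echo "$d MISSING"
done
```

If any daemon prints MISSING, its log file under logs/ (as listed by start-all.sh above) is the place to look.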

[hadoop@master hadoop]$ vim  in1.txt

[hadoop@master hadoop]$ cat in1.txt

hello hello world world
hello cai ge
cai ge ni hao
gentlemam
gentlemen
lady
lady


[hadoop@master hadoop]$ vim in2.txt
[hadoop@master hadoop]$ ls
bin          docs                        in1      lib          oool        word
build.xml    hadoop-0.20.2-ant.jar       in1.txt  librecordio  README.txt
c++          hadoop-0.20.2-core.jar      in2.txt  LICENSE.txt  s1
CHANGES.txt  hadoop-0.20.2-examples.jar  input    logs         src
conf         hadoop-0.20.2-test.jar      ivy      NOTICE.txt   tmpdir
contrib      hadoop-0.20.2-tools.jar     ivy.xml  okk          webapps
[hadoop@master hadoop]$ mkdir input1
[hadoop@master hadoop]$ cp in1.txt in2.txt iput1
cp: target 'iput1' is not a directory
[hadoop@master hadoop]$ cp in1.txt in2.txt input1
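Before uploading with `fs -put`, it can help to count the local input: the total line count should later match the job counter "Map input records", and the total word count should match "Map output records". A sketch (the sample contents are illustrative stand-ins, not the actual in1.txt/in2.txt used above):

```shell
# Sketch: count lines/words of the local input before uploading.
# (Sample contents are hypothetical stand-ins for in1.txt / in2.txt.)
mkdir -p /tmp/input1
printf 'hello hello world world\nhello cai ge\n' > /tmp/input1/in1.txt
printf 'cai ge ni hao\nlady\n'                   > /tmp/input1/in2.txt
wc -l -w /tmp/input1/*.txt
# total lines -> job counter "Map input records"
# total words -> job counter "Map output records"
```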
[hadoop@master hadoop]$ bin/hadoop fs -put input1/ input1
[hadoop@master hadoop]$ bin/hadoop fs -ls
Found 3 items
drwxr-xr-x   - hadoop supergroup          0 2012-03-09 23:19 /user/hadoop/input
drwxr-xr-x   - hadoop supergroup          0 2012-03-09 23:32 /user/hadoop/input1
drwxr-xr-x   - hadoop supergroup          0 2012-03-09 23:21 /user/hadoop/output
[hadoop@master hadoop]$ bin/hadoop jar hadoop-*-examples.jar wordcount input1 output1
12/03/09 23:34:10 INFO input.FileInputFormat: Total input paths to process : 2
12/03/09 23:34:10 INFO mapred.JobClient: Running job: job_201203092303_0004
12/03/09 23:34:11 INFO mapred.JobClient:  map 0% reduce 0%
12/03/09 23:34:19 INFO mapred.JobClient:  map 100% reduce 0%
12/03/09 23:34:31 INFO mapred.JobClient:  map 100% reduce 100%
12/03/09 23:34:33 INFO mapred.JobClient: Job complete: job_201203092303_0004
12/03/09 23:34:33 INFO mapred.JobClient: Counters: 17
12/03/09 23:34:33 INFO mapred.JobClient:   Job Counters 
12/03/09 23:34:33 INFO mapred.JobClient:     Launched reduce tasks=1
12/03/09 23:34:33 INFO mapred.JobClient:     Launched map tasks=2
12/03/09 23:34:33 INFO mapred.JobClient:     Data-local map tasks=2
12/03/09 23:34:33 INFO mapred.JobClient:   FileSystemCounters
12/03/09 23:34:33 INFO mapred.JobClient:     FILE_BYTES_READ=173
12/03/09 23:34:33 INFO mapred.JobClient:     HDFS_BYTES_READ=134
12/03/09 23:34:33 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=416
12/03/09 23:34:33 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=85
12/03/09 23:34:33 INFO mapred.JobClient:   Map-Reduce Framework
12/03/09 23:34:33 INFO mapred.JobClient:     Reduce input groups=11
12/03/09 23:34:33 INFO mapred.JobClient:     Combine output records=15
12/03/09 23:34:33 INFO mapred.JobClient:     Map input records=12
12/03/09 23:34:33 INFO mapred.JobClient:     Reduce shuffle bytes=179
12/03/09 23:34:33 INFO mapred.JobClient:     Reduce output records=11
12/03/09 23:34:33 INFO mapred.JobClient:     Spilled Records=30
12/03/09 23:34:33 INFO mapred.JobClient:     Map output bytes=241
12/03/09 23:34:33 INFO mapred.JobClient:     Combine input records=27
12/03/09 23:34:33 INFO mapred.JobClient:     Map output records=27
12/03/09 23:34:33 INFO mapred.JobClient:     Reduce input records=15
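What the WordCount job computes can be cross-checked locally: it is equivalent to splitting the input on whitespace, sorting the words, and counting duplicates. A sketch with illustrative input (not the full dataset from above):

```shell
# Sketch: WordCount's effect reproduced with plain Unix tools.
printf 'hello hello world world\nhello cai ge\n' \
  | tr -s ' ' '\n' | sort | uniq -c
# one line per distinct word, preceded by its count (e.g. "3 hello")
```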
[hadoop@master hadoop]$ bin/hadoop fs -get output1 s2
[hadoop@master hadoop]$ ls
bin        CHANGES.txt  docs                    hadoop-0.20.2-examples.jar  in1      input   ivy.xml      LICENSE.txt  okk         s1   tmpdir
build.xml  conf         hadoop-0.20.2-ant.jar   hadoop-0.20.2-test.jar      in1.txt  input1  lib          logs         oool        s2   webapps
c++        contrib      hadoop-0.20.2-core.jar  hadoop-0.20.2-tools.jar     in2.txt  ivy     librecordio  NOTICE.txt   README.txt  src  word
[hadoop@master hadoop]$ cat s2/*
cat: s2/_logs: Is a directory
alarm 2
cai 4
clock 2
ge 4
gentlemam 1
gentlemen 1
hao 3
hello 3
lady 2
ni 3
world 2
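The result above is sorted alphabetically by word. To see the most frequent words first, sort the retrieved output numerically (descending) on its count column; a sketch using a stand-in for the real s2/part-* file:

```shell
# Sketch: show the most frequent words first by sorting the WordCount
# output on its count column. (/tmp/wc_out.txt is a hypothetical
# stand-in for the real s2/part-* result file.)
cat > /tmp/wc_out.txt <<'EOF'
cai 4
ge 4
hello 3
world 2
EOF
sort -k2,2nr /tmp/wc_out.txt | head -3
```

`-k2,2nr` treats the second whitespace-separated field as a number and reverses the order, so it also works with Hadoop's tab-separated output.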