Hadoop 2.4.1 WordCount Test

The cluster used below was set up following the Hadoop 2.4.1 fully distributed installation guide: http://blog.itpub.net/26613085/viewspace-1219710/
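
Every command below prints a NativeCodeLoader warning; it is harmless, as Hadoop simply falls back to the built-in Java classes. If the warning is a nuisance and the native libraries under $HADOOP_HOME/lib/native actually match the platform (not always the case with the stock 2.4.1 tarball), a commonly suggested workaround is to add something like the following to ~/.bashrc or hadoop-env.sh:

    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
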
[hadoop@master mapreduce]$ hadoop fs -ls /input
14/07/18 09:31:08 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
-rw-r--r--   3 hadoop supergroup       2514 2014-07-17 18:09 /input/test.txt
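
The /input directory and test.txt were staged in HDFS beforehand. For reference, a minimal sketch of how such an input could be uploaded (the local path /home/hadoop/test.txt is only an example):

    hadoop fs -mkdir -p /input
    hadoop fs -put /home/hadoop/test.txt /input/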

[hadoop@master mapreduce]$ hadoop jar /home/hadoop/hadoop-2.4.1/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.4.1.jar wordcount /input /output/wordcount
14/07/18 09:33:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/07/18 09:33:56 INFO client.RMProxy: Connecting to ResourceManager at master/100.12.56.228:8032
14/07/18 09:33:57 INFO input.FileInputFormat: Total input paths to process : 1
14/07/18 09:33:57 INFO mapreduce.JobSubmitter: number of splits:1
14/07/18 09:33:58 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1405438646128_0001
14/07/18 09:33:58 INFO impl.YarnClientImpl: Submitted application application_1405438646128_0001
14/07/18 09:33:58 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1405438646128_0001/
14/07/18 09:33:58 INFO mapreduce.Job: Running job: job_1405438646128_0001
14/07/18 09:34:09 INFO mapreduce.Job: Job job_1405438646128_0001 running in uber mode : false
14/07/18 09:34:09 INFO mapreduce.Job:  map 0% reduce 0%
14/07/18 09:34:26 INFO mapreduce.Job:  map 100% reduce 0%
14/07/18 09:34:34 INFO mapreduce.Job:  map 100% reduce 100%
14/07/18 09:34:35 INFO mapreduce.Job: Job job_1405438646128_0001 completed successfully
14/07/18 09:34:35 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=1245
                FILE: Number of bytes written=188807
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=2612
                HDFS: Number of bytes written=882
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Launched map tasks=1
                Launched reduce tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=14440
                Total time spent by all reduces in occupied slots (ms)=5493
                Total time spent by all map tasks (ms)=14440
                Total time spent by all reduce tasks (ms)=5493
                Total vcore-seconds taken by all map tasks=14440
                Total vcore-seconds taken by all reduce tasks=5493
                Total megabyte-seconds taken by all map tasks=14786560
                Total megabyte-seconds taken by all reduce tasks=5624832
        Map-Reduce Framework
                Map input records=51
                Map output records=344
                Map output bytes=3742
                Map output materialized bytes=1245
                Input split bytes=98
                Combine input records=344
                Combine output records=91
                Reduce input groups=91
                Reduce shuffle bytes=1245
                Reduce input records=91
                Reduce output records=91
                Spilled Records=182
                Shuffled Maps =1
                Failed Shuffles=0
                Merged Map outputs=1
                GC time elapsed (ms)=222
                CPU time spent (ms)=4630
                Physical memory (bytes) snapshot=216760320
                Virtual memory (bytes) snapshot=718753792
                Total committed heap usage (bytes)=137039872
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=2514
        File Output Format Counters
                Bytes Written=882
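
The counters line up with the output inspected next: the mapper emitted 344 words, the combiner collapsed them to 91 distinct keys, and the reducer wrote 91 records (882 bytes) under /output/wordcount. One thing to keep in mind when rerunning the example: MapReduce refuses to start if the output directory already exists, so it has to be removed first, e.g.:

    hadoop fs -rm -r /output/wordcount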

[hadoop@master mapreduce]$ hadoop fs -ls /output/wordcount
14/07/18 09:36:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 2 items
-rw-r--r--   3 hadoop supergroup          0 2014-07-18 09:34 /output/wordcount/_SUCCESS
-rw-r--r--   3 hadoop supergroup        882 2014-07-18 09:34 /output/wordcount/part-r-00000
[hadoop@master mapreduce]$ hadoop fs -cat /output/wordcount/part-r-00000
14/07/18 09:36:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
-       4
-ls     1
-rw-rw-r--      2
-rwxrwxrwx      6
..      1
/       1
/hbase  1
/input  1
/tmp    1
/user   1
0       5
00:27   4
09:41   1
1       8
10      2
11      2
138656756       2
14/07/17        1
14:01   2
15      12
15:09   2
15:29   2
16      2
16:44   1
16:48   1
17      12
17:12   2
17:38   2
18:03   1
18:03:34        1
2       3
2014-07-17      4
22:42   3
23:22   2
23:30   4
23:36   3
269012  2
3       7
4       3
4096    18
54246778        2
8       5
82246622        2
9       2
Found   1
Jul     26
Unable  1
WARN    1
[hadoop@master  13
apache-hive-0.13.1-bin  2
apache-hive-0.13.1-bin.tar.gz   2
applicable      1
builtin-java    1
cd      6
classes 1
current 2
data    2
data]$  2
dfs     3
dfs]$   3
drwxr-xr-x      6
drwxrwxr-x      16
file:   2
for     1
fs      1
hadoop  53
hadoop-2.4.1    2
hadoop-2.4.1.tar.gz     2
hadoop-2.4.1]$  2
hbase-0.98.2-hadoop2    2
hbase-0.98.2-hadoop2-bin.tar.gz 2
in_use.lock     2
items   1
library 1
ll      6
load    1
name    2
name]$  3
native-hadoop   1
platform...     1
root    4
supergroup      4
tmp     2
to      1
total   6
using   1
util.NativeCodeLoader:  1
where   1
your    1
zookeeper       2
~]$     3
[hadoop@master mapreduce]$
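
Judging by the words counted above (shell prompts, permission strings, tarball names), the 2514-byte test.txt was apparently a saved terminal session, which is why "hadoop" dominates with 53 occurrences. To keep a local copy of the result instead of only printing it, the part file can be pulled out of HDFS (the local file name is arbitrary):

    hadoop fs -get /output/wordcount/part-r-00000 wordcount_result.txt
    # or merge all part files of the job into a single local file
    hadoop fs -getmerge /output/wordcount wordcount_result.txt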

Source: ITPUB blog, http://blog.itpub.net/26613085/viewspace-1221865/
