Hadoop: MapReduce Traffic Statistics Example

Linux part

1. Start the Hadoop cluster

start-dfs.sh
start-yarn.sh

[baoerjie@hadoop1 hadoop2.7.7]$ jps
4322 SecondaryNameNode
6563 Jps
3675 org.eclipse.equinox.launcher_1.3.201.v20161025-1711.jar
3995 NameNode
4139 DataNode
4559 ResourceManager
4863 NodeManager
[baoerjie@hadoop1 hadoop2.7.7]$ 

2. Create directories under the HDFS root

[baoerjie@hadoop1 hadoop2.7.7]$ hadoop fs -mkdir /baoerjie
[baoerjie@hadoop1 hadoop2.7.7]$ hadoop fs -mkdir /baoerjie/input
[baoerjie@hadoop1 hadoop2.7.7]$ hadoop fs -ls /
Found 1 items
drwxr-xr-x   - baoerjie supergroup          0 2020-05-18 22:34 /baoerjie
[baoerjie@hadoop1 hadoop2.7.7]$ 

3. Upload the data file to /baoerjie/input

[baoerjie@hadoop1 hadoop2.7.7]$ hadoop fs -put data.log /baoerjie/input
[baoerjie@hadoop1 hadoop2.7.7]$ hadoop fs -ls /baoerjie/input
Found 1 items
-rw-r--r--   2 baoerjie supergroup         81 2020-05-18 22:38 /baoerjie/input/data.log
[baoerjie@hadoop1 hadoop2.7.7]$ 
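The exact field layout of data.log is not shown in this post; a typical flow log for this kind of job carries a phone number plus upstream and downstream byte counts per line. As a hedged illustration only (the tab-separated `phone \t upFlow \t downFlow` layout is an assumption, not taken from the post), one record could be parsed in plain Java like this:

```java
// Hypothetical record layout: phone \t upFlow \t downFlow
// (the real data.log field order is not shown in this post)
public class FlowRecord {
    public final String phone;
    public final long upFlow;
    public final long downFlow;

    public FlowRecord(String phone, long upFlow, long downFlow) {
        this.phone = phone;
        this.upFlow = upFlow;
        this.downFlow = downFlow;
    }

    // Parse one tab-separated log line into a record.
    public static FlowRecord parse(String line) {
        String[] f = line.split("\t");
        return new FlowRecord(f[0], Long.parseLong(f[1]), Long.parseLong(f[2]));
    }

    public static void main(String[] args) {
        FlowRecord r = FlowRecord.parse("13726230503\t2481\t24681");
        System.out.println(r.phone + " up=" + r.upFlow + " down=" + r.downFlow);
    }
}
```

In the real job this parsing would live in the mapper's `map()` method, with the parsed fields written out as Hadoop `Writable` values.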

4. Run the packaged jar

[baoerjie@hadoop1 app]$ hadoop jar flow.jar com.lzw.mapreduce.FlowSumRunner
20/05/18 22:45:58 INFO client.RMProxy: Connecting to ResourceManager at hadoop1/192.168.126.129:8032
20/05/18 22:45:58 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
20/05/18 22:45:58 INFO input.FileInputFormat: Total input paths to process : 1
20/05/18 22:45:58 INFO mapreduce.JobSubmitter: number of splits:1
20/05/18 22:45:59 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1589808064752_0002
20/05/18 22:45:59 INFO impl.YarnClientImpl: Submitted application application_1589808064752_0002
20/05/18 22:45:59 INFO mapreduce.Job: The url to track the job: http://hadoop1:8088/proxy/application_1589808064752_0002/
20/05/18 22:45:59 INFO mapreduce.Job: Running job: job_1589808064752_0002
20/05/18 22:46:05 INFO mapreduce.Job: Job job_1589808064752_0002 running in uber mode : false
20/05/18 22:46:05 INFO mapreduce.Job:  map 0% reduce 0%
20/05/18 22:46:11 INFO mapreduce.Job:  map 100% reduce 0%
20/05/18 22:46:16 INFO mapreduce.Job:  map 100% reduce 100%
20/05/18 22:46:17 INFO mapreduce.Job: Job job_1589808064752_0002 completed successfully
20/05/18 22:46:17 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=53236996
                FILE: Number of bytes written=106719387
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=61084313
                HDFS: Number of bytes written=654320
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters 
                Launched map tasks=1
                Launched reduce tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=4208
                Total time spent by all reduces in occupied slots (ms)=2301
                Total time spent by all map tasks (ms)=4208
                Total time spent by all reduce tasks (ms)=2301
                Total vcore-milliseconds taken by all map tasks=4208
                Total vcore-milliseconds taken by all reduce tasks=2301
                Total megabyte-milliseconds taken by all map tasks=4308992
                Total megabyte-milliseconds taken by all reduce tasks=2356224
        Map-Reduce Framework
                Map input records=548160
                Map output records=548160
                Map output bytes=52130912
                Map output materialized bytes=...
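The source of `com.lzw.mapreduce.FlowSumRunner` is not included in this section. In outline, a flow-sum job like this has the mapper emit (phone, upFlow, downFlow) per input line and the reducer total the flows per phone number. A minimal self-contained sketch of that aggregation logic in plain Java (no Hadoop dependencies; the tab-separated field layout is assumed, not taken from the post):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Plain-Java sketch of the flow-sum logic: group records by phone
// number and total the flows, as the reducer of this job would.
public class FlowSumSketch {
    // Totals per phone: {upFlow, downFlow, upFlow + downFlow}
    public static Map<String, long[]> sumFlows(String[] lines) {
        Map<String, long[]> totals = new LinkedHashMap<>();
        for (String line : lines) {
            String[] f = line.split("\t");   // assumed: phone \t up \t down
            long up = Long.parseLong(f[1]);
            long down = Long.parseLong(f[2]);
            long[] t = totals.computeIfAbsent(f[0], k -> new long[3]);
            t[0] += up;
            t[1] += down;
            t[2] += up + down;
        }
        return totals;
    }

    public static void main(String[] args) {
        String[] lines = {
            "13726230503\t100\t200",
            "13726230503\t50\t50",
            "13925057413\t10\t90",
        };
        for (Map.Entry<String, long[]> e : sumFlows(lines).entrySet()) {
            long[] t = e.getValue();
            System.out.println(e.getKey() + "\t" + t[0] + "\t" + t[1] + "\t" + t[2]);
        }
    }
}
```

After the job completes, the per-phone totals would be written to the job's HDFS output directory, viewable with `hadoop fs -cat` on the `part-r-00000` file there.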