Hadoop Benchmark Performance Testing

HDFS Write Performance

Test: write 10 files of 128 MB each to the HDFS cluster

hadoop jar /opt/hadoop-2.7.2/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2-tests.jar TestDFSIO -write -nrFiles 10 -fileSize 128MB
20/07/17 08:09:34 INFO fs.TestDFSIO: ----- TestDFSIO ----- : write
20/07/17 08:09:34 INFO fs.TestDFSIO:            Date & time: Fri Jul 17 08:09:34 UTC 2020
20/07/17 08:09:34 INFO fs.TestDFSIO:        Number of files: 10
20/07/17 08:09:34 INFO fs.TestDFSIO: Total MBytes processed: 1280.0
20/07/17 08:09:34 INFO fs.TestDFSIO:      Throughput mb/sec: 76.33587786259542
20/07/17 08:09:34 INFO fs.TestDFSIO: Average IO rate mb/sec: 81.69550323486328
20/07/17 08:09:34 INFO fs.TestDFSIO:  IO rate std deviation: 21.683323743056395
20/07/17 08:09:34 INFO fs.TestDFSIO:     Test exec time sec: 20.283
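Note that the two headline numbers measure different things: "Throughput mb/sec" divides the total data volume by the total task time, while "Average IO rate mb/sec" is the arithmetic mean of each file's individual rate, so the two can legitimately differ. A small sketch (the per-file times below are made up for illustration, not taken from the run above):

```python
# Illustrates how TestDFSIO's two headline metrics differ.
# sizes/times are hypothetical, not from the log above.
sizes_mb = [128.0] * 10  # 10 files of 128 MB, as in the test
times_sec = [1.8, 1.5, 1.6, 1.7, 1.9, 1.4, 1.6, 1.8, 1.5, 1.7]  # hypothetical per-task times

# Aggregate throughput: total MB over total task seconds.
throughput = sum(sizes_mb) / sum(times_sec)

# Average IO rate: mean of the per-file rates.
avg_io_rate = sum(s / t for s, t in zip(sizes_mb, times_sec)) / len(sizes_mb)

print(f"Throughput mb/sec: {throughput:.2f}")
print(f"Average IO rate mb/sec: {avg_io_rate:.2f}")
```

The arithmetic mean of rates is always at least the aggregate rate, which is why "Average IO rate" reads higher than "Throughput" in the log above.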

HDFS Read Performance

Test: read 10 files of 128 MB each from the HDFS cluster

hadoop jar /opt/hadoop-2.7.2/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2-tests.jar TestDFSIO -read -nrFiles 10 -fileSize 128MB
20/07/17 08:14:28 INFO fs.TestDFSIO: ----- TestDFSIO ----- : read
20/07/17 08:14:28 INFO fs.TestDFSIO:            Date & time: Fri Jul 17 08:14:28 UTC 2020
20/07/17 08:14:28 INFO fs.TestDFSIO:        Number of files: 10
20/07/17 08:14:28 INFO fs.TestDFSIO: Total MBytes processed: 1280.0
20/07/17 08:14:28 INFO fs.TestDFSIO:      Throughput mb/sec: 282.4360105913504
20/07/17 08:14:28 INFO fs.TestDFSIO: Average IO rate mb/sec: 558.9140014648438
20/07/17 08:14:28 INFO fs.TestDFSIO:  IO rate std deviation: 480.34357648099825
20/07/17 08:14:28 INFO fs.TestDFSIO:     Test exec time sec: 16.391
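If you repeat these runs and want to collect the numbers, a few lines of Python can scrape the metrics out of the log. This is a sketch that assumes the exact "key: value" layout shown above:

```python
import re

# Sample lines copied from the read test output above.
log = """\
20/07/17 08:14:28 INFO fs.TestDFSIO:      Throughput mb/sec: 282.4360105913504
20/07/17 08:14:28 INFO fs.TestDFSIO: Average IO rate mb/sec: 558.9140014648438
20/07/17 08:14:28 INFO fs.TestDFSIO:  IO rate std deviation: 480.34357648099825
"""

metrics = {}
for line in log.splitlines():
    # Match "fs.TestDFSIO: <metric name>: <number>" at end of line.
    m = re.search(r"fs\.TestDFSIO:\s+(.+?):\s+([\d.]+)$", line)
    if m:
        metrics[m.group(1)] = float(m.group(2))

print(metrics)
```

The large standard deviation in the read run suggests very uneven per-task rates, which is common when some reads are served from local replicas or the OS page cache.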

Delete the test data

hadoop jar /opt/hadoop-2.7.2/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2-tests.jar TestDFSIO -clean

Testing MapReduce with the Sort example

  1. Use RandomWriter to generate random data: each node runs 10 map tasks, and each map produces roughly 1 GB of binary random data

    hadoop jar /opt/hadoop-2.7.2/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar randomwriter random-data
    
  2. Run the sort program

    hadoop jar /opt/hadoop-2.7.2/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar sort random-data sorted-data
    
  3. Verify the sorted output

    hadoop jar /opt/hadoop-2.7.2/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar testmapredsort -sortInput random-data -sortOutput sorted-data
    
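Roughly speaking, testmapredsort replays the data and fails the job if the sort output is invalid: it checks that records come out in non-decreasing key order and that the output's record statistics match the input's. The ordering check alone can be illustrated in a few lines:

```python
def is_sorted(keys):
    """True if the key sequence is in non-decreasing order."""
    return all(a <= b for a, b in zip(keys, keys[1:]))

print(is_sorted([b"apple", b"banana", b"cherry"]))  # True
print(is_sorted([b"banana", b"apple"]))             # False
```

This is only an illustration of the invariant being validated; the real validator runs as a MapReduce job over the HDFS directories given by -sortInput and -sortOutput.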

Output of the random-data generation job (~1 GB per map)

[root@dw-node01 hadoop]# hadoop jar /opt/hadoop-2.7.2/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar randomwriter rando
20/07/17 08:20:31 INFO client.RMProxy: Connecting to ResourceManager at dw-node02/192.168.9.202:8032
Running 30 maps.
Job started: Fri Jul 17 08:20:31 UTC 2020
20/07/17 08:20:31 INFO client.RMProxy: Connecting to ResourceManager at dw-node02/192.168.9.202:8032
20/07/17 08:20:32 INFO mapreduce.JobSubmitter: number of splits:30
20/07/17 08:20:32 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1594972286943_0003
20/07/17 08:20:32 INFO impl.YarnClientImpl: Submitted application application_1594972286943_0003
20/07/17 08:20:32 INFO mapreduce.Job: The url to track the job: http://dw-node02:8088/proxy/application_1594972286943_0003/
20/07/17 08:20:32 INFO mapreduce.Job: Running job: job_1594972286943_0003
20/07/17 08:20:37 INFO mapreduce.Job: Job job_1594972286943_0003 running in uber mode : false
20/07/17 08:20:37 INFO mapreduce.Job:  map 0% reduce 0%
20/07/17 08:24:29 INFO mapreduce.Job:  map 3% reduce 0%
20/07/17 08:24:33 INFO mapreduce.Job:  map 7% reduce 0%
20/07/17 08:24:51 INFO mapreduce.Job:  map 10% reduce 0%
20/07/17 08:24:53 INFO mapreduce.Job:  map 13% reduce 0%
20/07/17 08:25:09 INFO mapreduce.Job:  map 17% reduce 0%
20/07/17 08:25:13 INFO mapreduce.Job:  map 20% reduce 0%
20/07/17 08:25:19 INFO mapreduce.Job:  map 27% reduce 0%
20/07/17 08:25:20 INFO mapreduce.Job:  map 30% reduce 0%
20/07/17 08:25:31 INFO mapreduce.Job:  map 33% reduce 0%
20/07/17 08:25:34 INFO mapreduce.Job:  map 37% reduce 0%
20/07/17 08:25:35 INFO mapreduce.Job:  map 40% reduce 0%
20/07/17 08:25:37 INFO mapreduce.Job:  map 43% reduce 0%
20/07/17 08:25:38 INFO mapreduce.Job:  map 47% reduce 0%
20/07/17 08:25:41 INFO mapreduce.Job:  map 50% reduce 0%
20/07/17 08:25:45 INFO mapreduce.Job:  map 53% reduce 0%
20/07/17 08:25:46 INFO mapreduce.Job:  map 57% reduce 0%
20/07/17 08:25:49 INFO mapreduce.Job:  map 60% reduce 0%
20/07/17 08:25:50 INFO mapreduce.Job:  map 70% reduce 0%
20/07/17 08:25:52 INFO mapreduce.Job:  map 73% reduce 0%
20/07/17 08:27:51 INFO mapreduce.Job:  map 80% reduce 0%
20/07/17 08:27:58 INFO mapreduce.Job:  map 83% reduce 0%
20/07/17 08:27:59 INFO mapreduce.Job:  map 87% reduce 0%
20/07/17 08:28:06 INFO mapreduce.Job:  map 90% reduce 0%
20/07/17 08:28:07 INFO mapreduce.Job:  map 93% reduce 0%
20/07/17 08:28:08 INFO mapreduce.Job:  map 97% reduce 0%
20/07/17 08:28:09 INFO mapreduce.Job:  map 100% reduce 0%
20/07/17 08:28:22 INFO mapreduce.Job: Job job_1594972286943_0003 completed successfully
20/07/17 08:28:23 INFO mapreduce.Job: Counters: 33
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=3515390
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=3680
		HDFS: Number of bytes written=32318668575
		HDFS: Number of read operations=120
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=60
	Job Counters 
		Killed map tasks=10
		Launched map tasks=40
		Other local map tasks=40
		Total time spent by all maps in occupied slots (ms)=8331235
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=8331235
		Total vcore-milliseconds taken by all map tasks=8331235
		Total megabyte-milliseconds taken by all map tasks=8531184640
	Map-Reduce Framework
		Map input records=30
		Map output records=3068426
		Input split bytes=3680
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=15455
		CPU time spent (ms)=1228130
		Physical memory (bytes) snapshot=5070524416
		Virtual memory (bytes) snapshot=58810920960
		Total committed heap usage (bytes)=3866624000
	org.apache.hadoop.examples.RandomWriter$Counters
		BYTES_WRITTEN=32212494219
		RECORDS_WRITTEN=3068426
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=32318668575
Job ended: Fri Jul 17 08:28:23 UTC 2020
The job took 471 seconds.
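A quick sanity check on the counters above: RandomWriter reports BYTES_WRITTEN=32212494219 over 471 seconds, which works out to about 30 GiB at roughly 65 MiB/s aggregate:

```python
# Derive the aggregate write rate from the counters reported above.
bytes_written = 32_212_494_219  # RandomWriter BYTES_WRITTEN counter
elapsed_sec = 471               # "The job took 471 seconds."

gib = bytes_written / 2**30
mib_per_sec = bytes_written / 2**20 / elapsed_sec
print(f"{gib:.1f} GiB written, {mib_per_sec:.1f} MiB/s aggregate")
```

Note the HDFS byte counter (32318668575) is slightly higher than BYTES_WRITTEN because it includes SequenceFile framing overhead, and it counts only one replica's worth of writes.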

Output of the TestDFSIO cleanup

hadoop jar /opt/hadoop-2.7.2/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2-tests.jar TestDFSIO -clean
20/07/17 08:17:57 INFO fs.TestDFSIO: TestDFSIO.1.8
20/07/17 08:17:57 INFO fs.TestDFSIO: nrFiles = 1
20/07/17 08:17:57 INFO fs.TestDFSIO: nrBytes (MB) = 1.0
20/07/17 08:17:57 INFO fs.TestDFSIO: bufferSize = 1000000
20/07/17 08:17:57 INFO fs.TestDFSIO: baseDir = /benchmarks/TestDFSIO
20/07/17 08:17:58 INFO fs.TestDFSIO: Cleaning up test files
