Investigation report on MSSQL data read and write performance

MSSQL 2008

CPU: Xeon E5 2 GHz (12 cores) x 2

RAM: 128 GB

case 1:

declare @i int = 0;
while @i < 1000000
begin
    insert into T1 (
        column1,
        column2,
        column3,
        ...
        column34)
    values (
        @i,
        getdate(),
        rand() * cast(getdate() as float),  -- cast so the datetime can be multiplied
        ...
        ...);
    set @i = @i + 1;  -- advance the loop counter
end

Test result: 1,000,000 rows; time: 00:04:28; size: 503 MB
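
For reference, this kind of elapsed-time figure can be captured directly in T-SQL (a minimal sketch; the variable name is illustrative, not from the original test):

declare @t0 datetime2 = sysdatetime();  -- timestamp taken before the load

-- ... run the case 1 loop here ...

select datediff(second, @t0, sysdatetime()) as elapsed_seconds;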


case 2:
insert into T1 select * from T1

Test result: 1,000,000 rows inserted into T1 (1,000,000 rows already present); time: 00:00:51
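
As a side note, adding a TABLOCK hint to this kind of INSERT ... SELECT can make the load minimally logged on a heap in SQL Server 2008 (a sketch of that variant; it was not one of the measured cases):

insert into T1 with (tablock)  -- table-level lock enables minimal logging on a heap
select * from T1;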


case 3:
select * into NewTableName from T1

Test result: 2,000,000 rows inserted into NewTableName; time: 00:00:22


insert into TableName select * from T1

Test result: 2,000,000 rows inserted into TableName (2,000,000 rows already present); time: 00:01:00

(select * into creates a brand-new table and can be minimally logged, which is why it runs almost three times faster here than inserting into an existing table.)


case 4:
Tmp (20,000,000 rows), Target (20,000,000 rows)
Open 10 threads and run this SQL at the same time (see the per-thread sketch below):
insert into Target select * from Tmp

Test result: 2,000,000 x 10 rows inserted into Target (20,000,000 rows already present); time: 04:29:00
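
The result line implies each thread loaded a 2,000,000-row slice rather than the whole Tmp table. A minimal sketch of what each thread might run, assuming Tmp has a sequential numeric key column named id (hypothetical; the original post does not show how the slices were assigned):

declare @thread int = 1;        -- each of the 10 threads uses its own value, 1 through 10
declare @slice int = 2000000;   -- rows handled per thread

insert into Target
select *
from Tmp
where id >  (@thread - 1) * @slice
  and id <= @thread * @slice;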


case 5:
cmd>bcp T1 out t1.dat -f bcp.Xml -T -d database

Test result: 20,000,000 rows exported to t1.dat; time: 00:00:20
cmd>bcp Target in t1.dat -f bcp.Xml -T -d database

Test result: 20,000,000 rows imported from t1.dat into Target (500,000,000 rows already present); time: 00:50:01
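
For reference, the XML format file passed via -f can be generated by bcp itself from the source table (a sketch; the fully qualified table name is an assumption):

cmd>bcp database.dbo.T1 format nul -c -x -f bcp.Xml -T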

case 6:

bcp import of 20,000,000 x 2 rows, using 2 threads, into the same empty table

cmd>bcp Target in t1.dat -f bcp.Xml -T -d database

Test result: thread 1 time: 00:08:35, speed: 38,761 rows/sec; thread 2 time: 00:07:37, speed: 48,598 rows/sec; total speed: 87,359 rows/sec

case 7:

bcp export of 639,160,000 rows to a file

cmd>bcp Target out target.dat -f bcp.Xml -T -d database

Test result: time: 01:24:17; speed: 126,380 rows/sec

case 8:

bcp import of 639,160,000 rows into the target table

cmd>bcp Target in target.dat -f bcp.Xml -T -d database -b 5000

Test result: very slow; canceled. It is better not to use the -b (batch size) option: it splits the job into many small transactions. Without -b, the whole import runs as a single transaction.
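
For comparison, the bulk insert clause used in case 9 exposes the same knob as BATCHSIZE (a sketch reusing the file and format file names from these tests):

bulk insert Target from 'target.dat'
with (formatfile = 'bcp.Xml', batchsize = 5000);  -- same effect as bcp -b 5000: one transaction per 5,000-row batch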


Use bcp to import 639,160,000 rows into the target table (over 700,000,000 rows already present):

cmd>bcp Target in target.dat -f bcp.Xml -T -d database -h "TABLOCK"

Test result: time: 12:50:35; speed: 13,824 rows/sec
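
The TABLOCK hint only permits minimal logging if the recovery model cooperates; a sketch of checking and switching it (assuming the database really is named database, as in the -d arguments above):

select name, recovery_model_desc from sys.databases where name = 'database';

alter database [database] set recovery bulk_logged;  -- simple or bulk_logged recovery is required for minimally logged bulk loads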


case 9:

Use the bulk insert clause to import 20,000,000 rows into the target table (640,000,000 rows already present):

bulk insert Target from 'tmp.dat' with (formatfile = 'bcp.Xml', tablock)

Test result: time: 01:33:06 (compare with case 5)
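
An equivalent INSERT ... SELECT form exists via OPENROWSET(BULK ...), which also allows filtering or transforming rows during the load (a sketch with the same file names; it was not one of the measured cases):

insert into Target with (tablock)
select * from openrowset(bulk 'tmp.dat', formatfile = 'bcp.Xml') as src;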



Does anyone have a faster approach to share?

