When load testing, you often need to insert a large volume of data into the database. Below are the scripts I used to insert data into two databases.
SQL Server
declare @maxSum int,
        @lid nvarchar(64); -- @lid is the table's id column
set @maxSum = 1;
begin tran
while @maxSum <= 200000 -- use <= so the batch really is 200,000 rows
begin
    set @lid = 'LID' + convert(nvarchar, @maxSum) -- prefix the id with 'LID' so test rows are easy to identify
    insert into T_AT_LOG values(@lid,'1','1','2020-10-26 12:15:07.000761','7','127.0.0.1','system','0','{"msg":"Login for User:admin.","logBeans":null}','0');
    set @maxSum = @maxSum + 1
end
commit
Commit the data in batches; here I commit 200,000 rows at a time. The database cannot handle a single commit of several million rows, and attempting one may lock the table.
One batch of 200,000 rows took 21 seconds.
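A row-by-row loop pays per-statement overhead 200,000 times. A set-based alternative that is usually much faster is to generate all the rows in one INSERT ... SELECT. The sketch below assumes the same T_AT_LOG table and column order as above, and uses a cross join of sys.all_objects purely as a row source to number 1 through 200,000:

```sql
-- Hedged sketch: generate 200,000 rows in a single set-based statement.
-- sys.all_objects is used only as a convenient row source; any table
-- (or a dedicated numbers/tally table) with enough rows works.
WITH nums AS (
    SELECT TOP (200000)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects a
    CROSS JOIN sys.all_objects b
)
INSERT INTO T_AT_LOG
SELECT 'LID' + CONVERT(nvarchar(30), n), '1', '1',
       '2020-10-26 12:15:07.000761', '7', '127.0.0.1', 'system', '0',
       '{"msg":"Login for User:admin.","logBeans":null}', '0'
FROM nums;
```

Because the whole insert is one statement, it is also one implicit transaction; for tens of millions of rows you would still want to run it in several chunks.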
Oracle
BEGIN
    -- the FOR loop index is declared implicitly in PL/SQL,
    -- so no DECLARE section is needed
    FOR i IN 1 .. 200000 LOOP
        INSERT INTO T_AT_LOG(LID,CID,USERID,OPER_TIME,OPER_TYPE,OPER_HOST,PERMISSION,STATUS,DETAILS,VERSION)
        VALUES(i,'1','1','2020-10-13 10:25:38.000176','8','127.0.0.1','system','0','{"msg":"Login for User:admin.","logBeans":null}','0');
    END LOOP;
    COMMIT;
END;
Here too the data is committed in batches.
One batch took 33 seconds.
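On Oracle, the same idea of replacing the loop with a single set-based statement applies: the CONNECT BY LEVEL trick on dual generates a sequence of rows, so all 200,000 inserts become one INSERT ... SELECT. A sketch, assuming the same T_AT_LOG columns as above:

```sql
-- Hedged sketch: row generator via CONNECT BY LEVEL, inserting
-- 200,000 rows in one statement instead of a PL/SQL loop.
INSERT INTO T_AT_LOG (LID, CID, USERID, OPER_TIME, OPER_TYPE, OPER_HOST,
                      PERMISSION, STATUS, DETAILS, VERSION)
SELECT LEVEL, '1', '1', '2020-10-13 10:25:38.000176', '8', '127.0.0.1',
       'system', '0', '{"msg":"Login for User:admin.","logBeans":null}', '0'
FROM dual
CONNECT BY LEVEL <= 200000;
COMMIT;
```

This avoids the per-row context switch between the PL/SQL and SQL engines, which is typically the main cost of the loop version.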
I am not very satisfied with the efficiency of these scripts; if you have a faster one, feel free to share it in the comments.