There are several ways to optimize data insertion:
Batch inserts
# When inserting data, combine multiple rows into a single statement (a batch of 500-1000 rows per statement works well)
insert into tb_test values (1,'TOM'),(2,'JERRY')...;
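The multi-row INSERT above can be sketched in runnable form. This is a minimal illustration using Python's sqlite3 as a stand-in for a MySQL session; the table `tb_test(id, name)` mirrors the example, and the batch size of 100 is an arbitrary choice for the sketch (the note above suggests 500-1000 for MySQL).

```python
import sqlite3

# Stand-in for the MySQL session; tb_test(id, name) matches the example above.
conn = sqlite3.connect(":memory:")
conn.execute("create table tb_test (id integer primary key, name text)")

rows = [(i, f"name{i}") for i in range(1, 1001)]
batch = 100  # rows per INSERT statement (illustrative; the text suggests 500-1000)
for start in range(0, len(rows), batch):
    chunk = rows[start:start + batch]
    # Build "insert into tb_test values (?, ?),(?, ?),..." for the whole chunk,
    # so one statement carries many rows instead of one statement per row.
    placeholders = ",".join(["(?, ?)"] * len(chunk))
    params = [value for row in chunk for value in row]
    conn.execute(f"insert into tb_test values {placeholders}", params)
conn.commit()

print(conn.execute("select count(*) from tb_test").fetchone()[0])  # 1000
```

The per-statement work (parsing, round trip) is paid once per batch rather than once per row, which is where the speedup comes from.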
Committing transactions manually
# Committing after every single statement hurts overall performance, so commit manually to reduce that overhead.
# DB2 and Oracle have no start transaction; in DB2, run db2 +C "insert ..." and commit manually.
# SQL Server and PostgreSQL use begin transaction; MySQL uses start transaction.
DB2: db2 +C "insert into employee values(500000,'kys'),(500001,'liys')"
MySQL:
start transaction;
insert into employee values(500000,'kys');
insert into employee values(500001,'kys');
insert into employee values(500002,'kys');
commit;
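The MySQL transaction above can be mirrored in a runnable sketch. Again sqlite3 stands in for the MySQL session (an assumption, not the original environment); `isolation_level=None` puts the connection in autocommit mode so the `begin`/`commit` pair is issued explicitly, like `start transaction;` ... `commit;` above.

```python
import sqlite3

# Autocommit mode, so we control the transaction boundaries ourselves.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("create table employee (id integer primary key, name text)")

conn.execute("begin")  # MySQL equivalent: start transaction;
for i in range(500000, 500003):
    conn.execute("insert into employee values (?, ?)", (i, "kys"))
conn.execute("commit")  # one commit for the whole batch of inserts

print(conn.execute("select count(*) from employee").fetchone()[0])  # 3
```

With autocommit left on, each of the three inserts would be flushed as its own transaction; grouping them under one commit amortizes that cost.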
If you need to insert a very large volume of data at once, even insert statements are too slow, so use a bulk-load utility instead:
DB2: db2 load from employee.del of del insert into employee
-- When connecting the client to the server, add the --local-infile option
mysql --local-infile -u root -p
-- Set the global local_infile parameter to 1 to enable loading data from a local file
set global local_infile = 1;
-- Run the load command to load the prepared data file into the table
load data local infile './employee.log' into table employee;
Note: MySQL's load data defaults are tab-separated fields and newline-terminated lines, so for a comma-separated file specify the separators explicitly: fields terminated by ',' lines terminated by '\n';
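MySQL's load data and DB2's load are server/client bulk loaders and cannot be reproduced directly here, but the core idea, parse a delimited file and bulk-insert its rows, can be sketched. This is a rough Python analogue under stated assumptions: sqlite3 stands in for the database, and the file name `employee.log` with two sample rows is illustrative.

```python
import csv
import os
import sqlite3
import tempfile

# Create an illustrative comma-separated data file (two sample rows).
path = os.path.join(tempfile.mkdtemp(), "employee.log")
with open(path, "w", newline="") as f:
    f.write("500000,kys\n500001,liys\n")

conn = sqlite3.connect(":memory:")
conn.execute("create table employee (id integer primary key, name text)")

with open(path, newline="") as f:
    # csv.reader handles fields terminated by ',' and lines by '\n'.
    reader = csv.reader(f)
    conn.executemany("insert into employee values (?, ?)", reader)
conn.commit()

print(conn.execute("select count(*) from employee").fetchone()[0])  # 2
```

A real load data run additionally skips most SQL-layer overhead on the server side, which is why it beats even well-batched inserts for very large files.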