I was using Alibaba's Druid connection pool (which, to be clear, has nothing to do with this problem). When JdbcTemplate executed an insert of 15,000 rows, I saw the rows still being committed in dribbles of 30, 20, 35 at a time; there was no batch effect at all.
Checking the database: after 5 seconds, only 267 rows were in.
After waiting another 20 seconds, only 5,130 rows.
Clearly, the batch execution was not taking effect.
================
1. Let's run it again, this time with changed URL parameters. (The screenshot shows 5431 ms; that figure was mistyped.) 15,000 rows are now done in about 5 seconds.
2. One more run: done in about 1 second (1294 ms).
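To make the before/after gap concrete, the observed timings translate into throughput roughly like this (trivial arithmetic, using only the figures reported above):

```java
public class ThroughputCheck {
    // Rows per second, given a row count and an elapsed time in milliseconds.
    static long rowsPerSecond(long rows, long elapsedMillis) {
        return rows * 1000 / elapsedMillis;
    }

    public static void main(String[] args) {
        // Before the URL change: roughly 267 rows had landed after 5 seconds.
        System.out.println(rowsPerSecond(267, 5000));   // 53
        // After the change: 15000 rows in 1294 ms.
        System.out.println(rowsPerSecond(15000, 1294)); // 11591
    }
}
```

That is a jump of more than two orders of magnitude from the same application code.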
Now let's compare what I changed in the URL.
Before:
server.port=29098
user.userServicePath=http://localhost:29097/simple/
# driver configuration
spring.datasource.url = jdbc:mysql://192.168.15.44:3306/ml_test?useSSL=false
spring.datasource.username = root
spring.datasource.password = br
spring.datasource.driver-class-name = com.mysql.jdbc.Driver
spring.datasource.initialize = true
spring.datasource.init-db = true
spring.datasource.timeout = 60000
spring.datasource.MaxActive = 20
spring.datasource.initActive = 2
spring.datasource.minActive = 0
After:
server.port=29098
user.userServicePath=http://localhost:29097/simple/
# driver configuration
spring.datasource.url = jdbc:mysql://192.168.15.44:3306/ml_test?useSSL=false&useUnicode=true&characterEncoding=utf-8&rewriteBatchedStatements=true&autoReconnect=true
spring.datasource.username = root
spring.datasource.password = bro
spring.datasource.driver-class-name = com.mysql.jdbc.Driver
spring.datasource.initialize = true
spring.datasource.init-db = true
spring.datasource.timeout = 60000
spring.datasource.MaxActive = 20
spring.datasource.initActive = 2
spring.datasource.minActive = 0
That's right, it is
rewriteBatchedStatements = true
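This flag changes how MySQL Connector/J sends the batch: instead of one network round trip per row, the driver rewrites the batched single-row INSERTs into multi-row INSERT statements. A rough, simplified simulation of that rewriting (this is not the driver's actual code, just the idea):

```java
public class BatchRewriteSketch {
    // Simulates how a batch of identical single-row INSERTs is rewritten
    // into one multi-row INSERT: the VALUES clause is repeated per entry.
    static String rewrite(String insertSql, int batchSize) {
        int idx = insertSql.toUpperCase().indexOf("VALUES");
        String head = insertSql.substring(0, idx + "VALUES".length());
        String row = insertSql.substring(idx + "VALUES".length()).trim(); // "(?, ?, ?)"
        StringBuilder sb = new StringBuilder(head);
        for (int i = 0; i < batchSize; i++) {
            sb.append(i == 0 ? " " : ",").append(row);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String sql = "INSERT INTO muser (id, name, email) VALUES (?, ?, ?)";
        // One statement carrying three rows' worth of placeholders:
        System.out.println(rewrite(sql, 3));
        // -> INSERT INTO muser (id, name, email) VALUES (?, ?, ?),(?, ?, ?),(?, ?, ?)
    }
}
```

Without the flag, the driver executes the batch entries one statement at a time, which is exactly the drip-feed behavior observed at the start.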
Here is the code that calls jdbcTemplate.batchUpdate:
public int[] BatchInsertMuser(List<Muser> list, String v_sql) {
    System.out.println("list.size()=" + list.size());
    // One PreparedStatement is reused for the whole batch; setValues fills
    // in the parameters for row i, and getBatchSize tells the template how
    // many rows to add before executing.
    return jdbcTemplate.batchUpdate(v_sql, new BatchPreparedStatementSetter() {
        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            ps.setInt(1, list.get(i).getId());
            ps.setString(2, list.get(i).getName());
            ps.setString(3, list.get(i).getEmail());
        }

        @Override
        public int getBatchSize() {
            return list.size();
        }
    });
}
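A caveat with the method above: getBatchSize() returns list.size(), so all 15,000 rows form one giant batch. For much larger lists it is common to split the work into fixed-size chunks and issue one batchUpdate per chunk (Spring also ships a batchUpdate overload that takes a batchSize and a ParameterizedPreparedStatementSetter and chunks internally). A framework-free sketch of just the chunking step, with hypothetical names:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkSketch {
    // Splits a list into consecutive chunks of at most chunkSize elements;
    // each chunk would then be handed to one batchUpdate call.
    static <T> List<List<T>> chunks(List<T> list, int chunkSize) {
        List<List<T>> result = new ArrayList<>();
        for (int i = 0; i < list.size(); i += chunkSize) {
            result.add(list.subList(i, Math.min(i + chunkSize, list.size())));
        }
        return result;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 15000; i++) rows.add(i);
        // 15000 rows in chunks of 1000 -> 15 batchUpdate calls.
        System.out.println(chunks(rows, 1000).size()); // 15
    }
}
```

The chunk size (1000 here) is an arbitrary choice for illustration; the right value depends on row width and driver/server limits such as max_allowed_packet.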
--------------------------------Conclusion---------------------------------
MySQL, non-batch, 100,000 rows: 5,700 rows/sec
MySQL, batch, 100,000 rows: 62,500 rows/sec
Oracle, non-batch insert, 100,000 rows: 4,464 rows/sec
Oracle, batch insert, 100,000 rows: 27,778 rows/sec
Addendum:
public int[] batchUpdate(java.lang.String sql,
BatchPreparedStatementSetter pss)
throws DataAccessException
Description copied from interface: JdbcOperations
Issue multiple update statements on a single PreparedStatement, using batch updates and a BatchPreparedStatementSetter to set values.
Will fall back to separate updates on a single PreparedStatement if the JDBC driver does not support batch updates.
Specified by:
batchUpdate in interface JdbcOperations
Parameters:
sql - defining PreparedStatement that will be reused. All statements in the batch will use the same SQL.
pss - object to set parameters on the PreparedStatement created by this method
Returns:
an array of the number of rows affected by each statement
Throws:
DataAccessException - if there is any problem issuing the update
About the returned int[] (call it it): all of its values are -2, and -2 means success. It is the JDBC constant java.sql.Statement.SUCCESS_NO_INFO: the statement succeeded, but the driver did not report a per-statement row count.
From the testing so far, each successful statement produces one -2, so it.length (note: arrays use .length, not .length()) tells you how many rows were inserted successfully.
For 15,000 rows, you get 15,000 entries of -2.
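The -2 values can be checked against the JDBC constant Statement.SUCCESS_NO_INFO rather than a magic number; a minimal counting sketch:

```java
import java.sql.Statement;
import java.util.Arrays;

public class BatchResultCheck {
    // Counts entries that signal success: either a real row count (>= 0)
    // or SUCCESS_NO_INFO (-2), meaning "succeeded, row count unknown".
    static long successCount(int[] updateCounts) {
        long ok = 0;
        for (int c : updateCounts) {
            if (c >= 0 || c == Statement.SUCCESS_NO_INFO) ok++;
        }
        return ok;
    }

    public static void main(String[] args) {
        // Shaped like what batchUpdate returned above: 15000 entries, all -2.
        int[] result = new int[15000];
        Arrays.fill(result, Statement.SUCCESS_NO_INFO);
        System.out.println(successCount(result)); // 15000
    }
}
```

(For completeness, the JDBC spec also defines EXECUTE_FAILED, -3, for entries that failed.)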
————————————————
Original post: https://blog.csdn.net/shushugood/article/details/81005718#commentBox