Batch inserts are far faster than inserting rows one at a time. In my own test, loading n rows one by one took over 50 minutes; with batching, the same load finished in a little over one minute.
Method 1: PreparedStatement with addBatch
Connection connection;
PreparedStatement cmd;
Class.forName("database driver class name");
String url = "jdbc url";
String user = "user name";
String pass = "password";
connection = DriverManager.getConnection(url, user, pass);
connection.setAutoCommit(false); // commit once at the end, not per row
// column names below are placeholders; the statement is parsed only once
cmd = connection.prepareStatement(
        "INSERT INTO aa(a, b, c) VALUES (?, ?, ?)");
for (int i = 0; i < 1000000; i++) { // 1,000,000 rows
    cmd.setTimestamp(1, new Timestamp(System.currentTimeMillis()));
    cmd.setString(2, "value");
    cmd.setInt(3, i);
    cmd.addBatch();
    if (i % 1000 == 999) {
        cmd.executeBatch(); // flush periodically so the batch does not exhaust memory
    }
}
cmd.executeBatch();
connection.commit();
cmd.close();
connection.close();
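Whether addBatch actually cuts down round trips depends on the driver. With MySQL's Connector/J, for example, the batch is only rewritten into a single multi-row INSERT when the connection property rewriteBatchedStatements=true is set. The sketch below (class and method names are my own, not part of any driver API) builds that multi-row statement by hand, just to show what the rewrite produces:

```java
// Sketch: build the multi-row INSERT that a driver-level batch rewrite
// would send, e.g. INSERT INTO t(a, b) VALUES (?, ?), (?, ?), ...
public class MultiRowInsert {

    // table and columns are placeholders; rows is the number of
    // value tuples folded into the single statement
    static String buildMultiRowInsert(String table, String[] columns, int rows) {
        String cols = String.join(", ", columns);
        String tuple = "(" + "?, ".repeat(columns.length - 1) + "?)";
        StringBuilder sql = new StringBuilder(
                "INSERT INTO " + table + "(" + cols + ") VALUES " + tuple);
        for (int i = 1; i < rows; i++) {
            sql.append(", ").append(tuple);
        }
        return sql.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildMultiRowInsert("aa", new String[]{"a", "b", "c"}, 3));
        // INSERT INTO aa(a, b, c) VALUES (?, ?, ?), (?, ?, ?), (?, ?, ?)
    }
}
```

One statement carrying many tuples is why the rewrite helps: the server parses once and acknowledges once instead of per row.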
Method 2: Statement with addBatch
conn.setAutoCommit(false); // required before the explicit commit() below
Statement stmt = conn.createStatement(); // a plain statement is enough for inserts
for (int x = 0; x < size; x++) {
    stmt.addBatch("INSERT INTO adlogs(ip,website,yyyymmdd,hour,object_id) VALUES('192.168.1.3','localhost','20081009',8,'23123')");
}
stmt.executeBatch();
conn.commit();
stmt.close();
Method 3: Statement, executed row by row. Committing once at the end still helps, but every execute() is a separate round trip, so this is the slowest of the three.
conn.setAutoCommit(false);
Statement stmt = conn.createStatement();
for (int x = 0; x < size; x++) {
    stmt.execute("INSERT INTO adlogs(ip,website,yyyymmdd,hour,object_id) VALUES('192.168.1.3','localhost','20081009',8,'23123')");
}
conn.commit();
stmt.close();