I am currently trying to populate a MySQL 5.1 database on a current Ubuntu machine with more than 5,000,000 entries.
Due to the program's architecture, a new database connection is opened and closed for each INSERT statement. I am aware that this is an expensive operation, but changing it would require changing a lot of code, so I'd rather avoid that.
The problem I am facing is that after a while (usually about 12 seconds, but this number increases after some retries) this process is not able to connect to the database any more. Other processes can connect to the database without any problem.
I do not think this is a MySQL issue, because none of the MySQL logs report any errors. Also, at most one connection is open at a time (which I verified by checking the MySQL status variables).
Question: Is there a limit on how many successive connections to the socket can be opened in a given time frame?
Here is a simplified Perl script (the actual program is in Java) that leads to the same problem (the database variables are not included):
use DBI;
# ... (database variables $DBNAME, $DBHOST, $DBPORT, $DBUSER, $DBPW, $MAX set here)
my $DBH;
for (my $i = 0; $i < $MAX; ++$i) {
    $DBH = DBI->connect("DBI:mysql:$DBNAME:$DBHOST:$DBPORT", $DBUSER, $DBPW)
        or die("Error - connection to database failed:\n $i times ok\n" . DBI->errstr);
    $DBH->disconnect;
    if ($i % 10000 == 0) {
        print $i . " ";
    }
}
Question: Any suggestions/ideas?
Solution
Try raising the number of allowed connections in /etc/mysql/my.cnf:
max_connections = 300
If you are using PHP, you could change all instances of mysql_connect() to mysql_pconnect() so that connections are reused instead of reopened.
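Since the actual program is in Java, the same persistent-connection idea can be sketched with plain JDBC. This is an illustrative sketch, not a drop-in fix: the URL, credentials, and the entries table are hypothetical, and it requires MySQL Connector/J and a running server.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BulkInsert {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details; adjust to your environment.
        String url = "jdbc:mysql://localhost:3306/mydb";

        // Open ONE connection and reuse it for every INSERT,
        // instead of connecting and disconnecting per statement.
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PreparedStatement stmt =
                 conn.prepareStatement("INSERT INTO entries (value) VALUES (?)")) {
            conn.setAutoCommit(false); // commit in batches rather than per row
            for (int i = 0; i < 5_000_000; ++i) {
                stmt.setInt(1, i);
                stmt.addBatch();
                if (i % 10_000 == 0) { // flush the batch periodically
                    stmt.executeBatch();
                    conn.commit();
                }
            }
            stmt.executeBatch(); // flush any remaining rows
            conn.commit();
        }
    }
}
```

Even if only the connection handling is refactored (one shared Connection passed to the existing insert code), the per-INSERT connect/disconnect cost and the rapid-reconnect problem both go away.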