I'm having a really troublesome time trying to import a large CSV file into MySQL on localhost. The CSV is about 55 MB and has about 750,000 rows.
I've resorted to writing a script that parses the CSV and inserts the rows one by one.
Here's the code:
$row = 1;
if (($handle = fopen("postal_codes.csv", "r")) !== FALSE)
{
    // fgetcsv splits on commas, but the real delimiter is '|',
    // so each chunk gets exploded again below.
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
    {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++)
        {
            // Split the pipe-delimited record and escape each field.
            $arr       = explode('|', $data[$c]);
            $postcode  = mysql_real_escape_string($arr[1]);
            $city_name = mysql_real_escape_string($arr[2]);
            $city_slug = mysql_real_escape_string(toAscii($city_name));
            $prov_name = mysql_real_escape_string($arr[3]);
            $prov_slug = mysql_real_escape_string(toAscii($prov_name));
            $prov_abbr = mysql_real_escape_string($arr[4]);
            $lat       = mysql_real_escape_string($arr[6]);
            $lng       = mysql_real_escape_string($arr[7]);

            // One INSERT per row -- roughly 750,000 separate queries in total.
            mysql_query("INSERT INTO cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`)
                VALUES ('$postcode', '$city_name', '$city_slug', '$prov_name', '$prov_slug', '$prov_abbr', '$lat', '$lng')") or die(mysql_error());
        }
    }
    fclose($handle);
}
The problem is that this takes forever to execute. Any solutions would be great.
Solution
You are reinventing the wheel. Check out the mysqlimport tool, which ships with MySQL and is built for efficiently importing CSV data files.
mysqlimport is a command-line interface to the LOAD DATA LOCAL INFILE SQL statement.
Either should run 10-20x faster than inserting row by row.
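
For reference, here is a minimal sketch of driving LOAD DATA LOCAL INFILE from the same PHP script. It assumes the file is pipe-delimited and that the column positions match the explode() indexes in your code (field 1 = postcode, 2 = city name, 3 = province name, 4 = abbreviation, 6 and 7 = lat/lng); the @skip variables and the slug handling are assumptions on my part, not something from your setup:

// Requires local_infile to be enabled on both the client and the server.
// User variables (@skip0, @skip5) discard the unused file columns.
$sql = "LOAD DATA LOCAL INFILE 'postal_codes.csv'
        INTO TABLE cities
        FIELDS TERMINATED BY '|'
        LINES TERMINATED BY '\\n'
        (@skip0, postcode, city_name, prov_name, prov_abbr, @skip5, lat, lng)";
mysql_query($sql) or die(mysql_error());

// toAscii() runs in PHP, so the slug columns can't be filled during the load;
// populate city_slug and prov_slug afterwards, e.g. with a single UPDATE.

Note that since your table (cities) is named differently from the file (postal_codes.csv), the LOAD DATA statement is the more direct fit here: mysqlimport derives the target table name from the data file's name, so it would expect a file named cities.csv.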