Importing a large CSV file into a MySQL database


I'm having a really troublesome time trying to import a large CSV file into MySQL on localhost.

The CSV is about 55 MB and has about 750,000 rows.

I've resorted to writing a script that parses the CSV and inserts the rows one by one.

Here's the code:

    $row = 1;
    if (($handle = fopen("postal_codes.csv", "r")) !== FALSE)
    {
        while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
        {
            $num = count($data);
            $row++;
            for ($c = 0; $c < $num; $c++)
            {
                $arr = explode('|', $data[$c]);
                $postcode  = mysql_real_escape_string($arr[1]);
                $city_name = mysql_real_escape_string($arr[2]);
                $city_slug = mysql_real_escape_string(toAscii($city_name));
                $prov_name = mysql_real_escape_string($arr[3]);
                $prov_slug = mysql_real_escape_string(toAscii($prov_name));
                $prov_abbr = mysql_real_escape_string($arr[4]);
                $lat = mysql_real_escape_string($arr[6]);
                $lng = mysql_real_escape_string($arr[7]);
                mysql_query("insert into cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`)
                    values ('$postcode', '$city_name', '$city_slug', '$prov_name', '$prov_slug', '$prov_abbr', '$lat', '$lng')") or die(mysql_error());
            }
        }
        fclose($handle);
    }

The problem is that this takes forever to execute. Any solutions would be great.

Solution

You are reinventing the wheel. Check out the mysqlimport tool, which ships with MySQL. It is an efficient tool for importing CSV data files.

mysqlimport is a command-line interface for the LOAD DATA LOCAL INFILE SQL statement.

Either should run 10-20x faster than doing INSERT row by row.
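As a rough sketch of what the LOAD DATA LOCAL INFILE route could look like for this case: the question's code splits each field on `|`, which suggests the file is actually pipe-delimited, so that delimiter is assumed here; the unused indexes ($arr[0], $arr[5]) are skipped with user variables. The slug columns are left out, since toAscii() is a PHP function and slugs would need a separate pass (or an SQL expression in a SET clause). Adjust the delimiter and column list to the real file.

```sql
-- One statement replaces the entire PHP insert loop.
-- @skip1 / @skip2 absorb file columns with no target column,
-- mirroring the indexes the question's explode() calls ignore.
LOAD DATA LOCAL INFILE 'postal_codes.csv'
INTO TABLE cities
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
(@skip1, postcode, city_name, prov_name, prov_abbr, @skip2, lat, lng);
```

The mysqlimport equivalent takes the same options on the command line (e.g. `mysqlimport --local --fields-terminated-by='|' dbname postal_codes.csv`); note that mysqlimport derives the target table name from the file name, so the file would need to be named cities.csv (or the table named to match).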
