[url]http://www.mysqlperformanceblog.com/2006/06/09/why-mysql-could-be-slow-with-large-tables/[/url]
So if you’re dealing with large data sets and complex queries, here are a few tips:
[color=orange]Try to fit the data set you’re working with in memory[/color] – Processing in memory is much faster, and a whole bunch of problems is solved just by doing so. Use multiple servers to host portions of the data set, or store the portion of data you’re going to work with in a temporary table, etc. (see the sketch below).
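A minimal sketch of the temporary-table approach; the table and column names (orders, recent_orders, etc.) are made up for illustration, and the MEMORY engine only fits if the subset is small enough and has no TEXT/BLOB columns:
[code]
-- Copy just the rows you actually need into an in-memory temporary table
-- (hypothetical schema).
CREATE TEMPORARY TABLE recent_orders ENGINE=MEMORY AS
SELECT order_id, customer_id, total
FROM orders
WHERE created_at >= '2006-01-01';

-- Subsequent queries hit the small in-memory copy instead of the big table.
SELECT customer_id, SUM(total)
FROM recent_orders
GROUP BY customer_id;
[/code]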
[color=orange]Prefer full table scans to index accesses[/color] – For large data sets, full table scans are often faster than range scans and other types of index lookups. Even if you’re looking at 1% of rows or less, a full table scan may be faster.
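One way to test this on your own data is to compare the plans with and without the index; the table and index names here are assumptions:
[code]
-- Let the optimizer pick the index (often a scattered range scan):
EXPLAIN SELECT COUNT(*) FROM orders WHERE status = 'shipped';

-- Force a full table scan for comparison:
EXPLAIN SELECT COUNT(*) FROM orders IGNORE INDEX (idx_status)
WHERE status = 'shipped';
[/code]
Timing both variants on the real data set tells you which access path actually wins for your row distribution.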
[color=orange]Avoid joins to large tables[/color] – Joining large data sets using nested loops is very expensive; try to avoid it. Joins to smaller tables are OK, but you might want to preload them into memory before the join so no random IO is needed to populate the caches (a rough example follows).
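A rough, hedged example of preloading a small lookup table before joining; the table names (products, order_items) are hypothetical, and LOAD INDEX INTO CACHE applies to MyISAM tables only:
[code]
-- Preload the index blocks of the small table into the key cache (MyISAM):
LOAD INDEX INTO CACHE products;

-- Touch the data pages so they are cached as well:
SELECT COUNT(*) FROM products;

-- The join against the small, now-cached table avoids random disk IO:
SELECT oi.order_id, p.name
FROM order_items oi
JOIN products p ON p.product_id = oi.product_id;
[/code]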