MySQL: how to avoid full scans, optimizing a query so it does not examine many rows

I am running an application that uses tables similar to the ones below. One table holds articles and another holds tags. I want to get the latest 30 articles for a specific tag, ordered by article id, for example "acer". The query below does the job, but it does not seem to use the indexes well, because it will scan a lot of rows when many articles are related to a specific tag. How can I run a query that returns the same result without scanning a large number of rows?

EXPLAIN SELECT title
FROM   tag, article
WHERE  tag = 'acer'
  AND  tag.article_id = article.id
ORDER BY tag.article_id DESC
LIMIT 0, 30

Output:

id  select_type  table    type    possible_keys  key      key_len  ref                    rows    Extra
1   SIMPLE       tag      ref     tag            tag      92       const                  220439  Using where; Using index
1   SIMPLE       article  eq_ref  PRIMARY        PRIMARY  4        testdb.tag.article_id  1

The following are the tables and sample data:

CREATE TABLE `article` (
  `id` int(11) NOT NULL auto_increment,
  `title` varchar(60) NOT NULL,
  `time_stamp` int(11) NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 AUTO_INCREMENT=1000001;

--
-- Dumping data for table `article`
--

INSERT INTO `article` VALUES (1, 'Saudi Apple type D', 1313390211);
INSERT INTO `article` VALUES (2, 'Japan Apple type A', 1313420771);
INSERT INTO `article` VALUES (3, 'UAE Samsung type B', 1313423082);
INSERT INTO `article` VALUES (4, 'UAE Apple type H', 1313417337);
INSERT INTO `article` VALUES (5, 'Japan Samsung type D', 1313398875);
INSERT INTO `article` VALUES (6, 'UK Acer type B', 1313387888);
INSERT INTO `article` VALUES (7, 'Saudi Sony type D', 1313429416);
INSERT INTO `article` VALUES (8, 'UK Apple type B', 1313394549);
INSERT INTO `article` VALUES (9, 'Japan HP type A', 1313427730);
INSERT INTO `article` VALUES (10, 'Japan Acer type C', 1313400046);

CREATE TABLE `tag` (
  `tag` varchar(30) NOT NULL,
  `article_id` int(11) NOT NULL,
  UNIQUE KEY `tag` (`tag`,`article_id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

--
-- Dumping data for table `tag`
--

INSERT INTO `tag` VALUES ('Samsung', 1);
INSERT INTO `tag` VALUES ('Acer', 2);
INSERT INTO `tag` VALUES ('Sony', 3);
INSERT INTO `tag` VALUES ('Apple', 4);
INSERT INTO `tag` VALUES ('Acer', 5);
INSERT INTO `tag` VALUES ('HP', 6);
INSERT INTO `tag` VALUES ('Acer', 7);
INSERT INTO `tag` VALUES ('Sony', 7);
INSERT INTO `tag` VALUES ('Acer', 8);
INSERT INTO `tag` VALUES ('Samsung', 9);

Solution

What makes you think the query will examine a large number of rows?

The query will scan exactly 30 records using the UNIQUE index on tag (tag, article_id), join article to each record on its PRIMARY KEY, and stop.

This is exactly what your plan says.
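You can confirm this on your own server with the standard session status counters (a sketch, not part of the original answer): the Handler_read_* counters report how many rows the storage engine actually touched, as opposed to the estimate in EXPLAIN's rows column.

```sql
-- Reset the session status counters (FLUSH STATUS is standard MySQL)
FLUSH STATUS;

SELECT title
FROM   tag, article
WHERE  tag = 'acer'
  AND  tag.article_id = article.id
ORDER BY tag.article_id DESC
LIMIT 0, 30;

-- Handler_read_key / Handler_read_next show rows actually read.
-- With the plan above, expect figures on the order of 30, not 220439.
SHOW SESSION STATUS LIKE 'Handler_read%';
```

The rows = 220439 in the EXPLAIN output is only the optimizer's estimate of how many index entries match tag = 'acer'; the LIMIT stops the scan long before that.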

I just made this test script:

CREATE TABLE `article` (
  `id` int(11) NOT NULL auto_increment,
  `title` varchar(60) NOT NULL,
  `time_stamp` int(11) NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 AUTO_INCREMENT=1000001;

CREATE TABLE `tag` (
  `tag` varchar(30) NOT NULL,
  `article_id` int(11) NOT NULL,
  UNIQUE KEY `tag` (`tag`,`article_id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

INSERT
INTO    article
SELECT  id, CONCAT('Article ', id), UNIX_TIMESTAMP('2011-08-17' - INTERVAL id SECOND)
FROM    t_source;

INSERT
INTO    tag
SELECT  CASE fld WHEN 1 THEN CONCAT('tag', (id - 1) DIV 10 + 1) ELSE tag END AS tag, id
FROM    (
        SELECT  tag, id,
                FIELD(tag, 'Other', 'Acer', 'Sony', 'HP', 'Dell') AS fld,
                RAND(20110817) AS rnd
        FROM    (
                SELECT 'Other' AS tag
                UNION ALL
                SELECT 'Acer' AS tag
                UNION ALL
                SELECT 'Sony' AS tag
                UNION ALL
                SELECT 'HP' AS tag
                UNION ALL
                SELECT 'Dell' AS tag
                ) t
        JOIN    t_source
        ) q
WHERE   POWER(3, -fld) > rnd;

where t_source is a table with 1M records in it, and then ran your query:

SELECT  *
FROM    tag t
JOIN    article a
ON      a.id = t.article_id
WHERE   t.tag = 'acer'
ORDER BY
        t.article_id DESC
LIMIT 30;

It was instant.
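The answer does not show how t_source was built; it is just a table of 1M sequential ids. One way to build an equivalent helper (the table and column names here are assumptions, any sequence table works) is to cross-join a ten-row digits table with itself six times:

```sql
-- Hypothetical helper: 10^6 combinations of six decimal digits give ids 1..1000000.
CREATE TABLE t_source (id INT NOT NULL PRIMARY KEY);

CREATE TABLE digits (d INT NOT NULL);
INSERT INTO digits VALUES (0),(1),(2),(3),(4),(5),(6),(7),(8),(9);

INSERT INTO t_source
SELECT  a.d + b.d * 10 + c.d * 100 + e.d * 1000 + f.d * 10000 + g.d * 100000 + 1 AS id
FROM    digits a, digits b, digits c, digits e, digits f, digits g;
```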
