Stored procedures to implement paging for large tables or queries in SQL Server 2005 and SQL Server 2008
Written By: Michelle Gutzait -- 3/3/2009
Problem
I need to display a large result set in my application window and page through it. The query itself takes a long time to process, and I do not want to repeat it every time I fetch a page. Also, the number of rows in the result set can be huge, so I am often fetching a page from near the end of the result set. With the default paging approach, I wait a long time before the data comes back. What are my options?
Solution
There are a few possible solutions for paging through a large result set. In this tip, I am going to focus on three examples and compare their performance implications. The examples are:
- Example 1 - I use a temporary table (#temp_table) to store the result set for each session.
- Example 2 - I use a Common Table Expression (CTE) to page through the result set.
- Example 3 - I populate a global temporary table to store the complete result set.
The first two examples are similar to some of the most commonly used paging stored procedure options; the third example is my own extension, which I wanted to include for comparison in this specific case of a complex query with a large result set.
Example #1 - Using a session temporary table (#temp_table)
In this stored procedure, I create the temporary table and insert only the relevant rows into it based on the input parameters:
CREATE PROCEDURE dbo.proc_Paging_TempTable
(
  @Page int,
  @RecsPerPage int
)
AS
-- The number of rows affected by the different commands
-- does not interest the application, so turn NOCOUNT ON
SET NOCOUNT ON

-- Determine the first record and last record
DECLARE @FirstRec int, @LastRec int
SELECT @FirstRec = (@Page - 1) * @RecsPerPage
SELECT @LastRec = (@Page * @RecsPerPage + 1)

-- Create a temporary table
CREATE TABLE #TempItems
(
  RowNum int IDENTITY PRIMARY KEY,
  Title nvarchar(100),
  Publisher nvarchar(50),
  AuthorNames nvarchar(200),
  LanguageName nvarchar(20),
  FirstLine nvarchar(150),
  CreationDate smalldatetime,
  PublishingDate smalldatetime,
  Popularity int
)

-- Insert the rows into the temp table
-- We query @LastRec - 1 rows, to find out if there are more records
INSERT INTO #TempItems (Title, Publisher, AuthorNames, LanguageName,
                        FirstLine, CreationDate, PublishingDate, Popularity)
SELECT TOP (@LastRec - 1)
       s.Title, m.Publisher, s.AuthorNames, l.LanguageName,
       m.FirstLine, m.CreationDate, m.PublishingDate, m.Popularity
FROM dbo.Articles m
     INNER JOIN dbo.ArticlesContent s ON s.ArticleID = m.ID
     LEFT OUTER JOIN dbo.Languages l ON l.ID = m.LanguageID
ORDER BY m.Popularity DESC

-- Return the set of paged records
SELECT *
FROM #TempItems
WHERE RowNum > @FirstRec
  AND RowNum < @LastRec

-- Drop the temp table
DROP TABLE #TempItems

-- Turn NOCOUNT back OFF
SET NOCOUNT OFF
GO
Example #2 - Using a Common Table Expression (CTE)
In this example, I use a CTE with the ROW_NUMBER() function to fetch only the relevant rows:
CREATE PROCEDURE dbo.proc_Paging_CTE
(
  @Page int,
  @RecsPerPage int
)
AS
-- The number of rows affected by the different commands
-- does not interest the application, so turn NOCOUNT ON
SET NOCOUNT ON
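The rest of this procedure's body is truncated in the source. As a sketch only (my reconstruction, not the author's exact code), the CTE paging pattern described here, assuming the same tables, columns, and popularity sort order as Example #1, would continue along these lines:

```sql
-- Hypothetical reconstruction of the truncated body.
-- ROW_NUMBER() assigns a sequential number in popularity order inside
-- the CTE, and the outer query fetches only the requested page.
DECLARE @FirstRec int, @LastRec int
SELECT @FirstRec = (@Page - 1) * @RecsPerPage
SELECT @LastRec = (@Page * @RecsPerPage + 1);

WITH TempResult AS
(
  SELECT ROW_NUMBER() OVER (ORDER BY m.Popularity DESC) AS RowNum,
         s.Title, m.Publisher, s.AuthorNames, l.LanguageName,
         m.FirstLine, m.CreationDate, m.PublishingDate, m.Popularity
  FROM dbo.Articles m
       INNER JOIN dbo.ArticlesContent s ON s.ArticleID = m.ID
       LEFT OUTER JOIN dbo.Languages l ON l.ID = m.LanguageID
)
SELECT *
FROM TempResult
WHERE RowNum > @FirstRec
  AND RowNum < @LastRec

SET NOCOUNT OFF
```

Unlike Example #1, nothing is materialized here: the CTE is expanded into the query plan, so only the rows needed for the requested page are ultimately returned.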
Example #3 - Using a global temporary table to hold the whole result
In this example, I use a global temporary table to store the complete result set of the query. The temporary table is populated during the first execution of the stored procedure, and all subsequent executions use the same table. The idea behind this approach is that, with a global temporary table, other sessions can also use the same table (if they know the GUID and need the same data). To get rid of the temporary table, you will have to either drop it explicitly or disconnect the session.
If this approach does not work for you, you could use the same technique to create "temporary" tables in your user-defined database with a unique name extension. One specific scenario where this is useful is when the tempdb database is already a bottleneck; in that case, you can create a dedicated database for these tables. Just do not forget to drop the temporary objects when they are no longer required.
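That cleanup step can be automated. The sketch below is illustrative only: the database name `PagingDB`, the `Paging_` name prefix, and the one-hour retention rule are all assumptions, not part of the original tip.

```sql
-- Hypothetical cleanup sketch: drop paging tables created in a user
-- database (here called PagingDB) whose names start with 'Paging_'
-- and that are older than one hour. All names are assumptions.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql + N'DROP TABLE ' + QUOTENAME(s.name) + N'.'
                   + QUOTENAME(t.name) + N'; '
FROM PagingDB.sys.tables t
     INNER JOIN PagingDB.sys.schemas s ON s.schema_id = t.schema_id
WHERE t.name LIKE 'Paging[_]%'
  AND t.create_date < DATEADD(hour, -1, GETDATE());

-- Run the generated DROP statements in the context of PagingDB
EXEC PagingDB.sys.sp_executesql @sql;
```

A job like this could run on a schedule so abandoned result sets do not accumulate in the dedicated database.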
CREATE PROCEDURE dbo.proc_Paging_GlobalTempTable
(
  @Page int,
  @RecsPerPage int,
  @GUID uniqueidentifier = null OUTPUT
  -- will output the extension of the table.
  -- This parameter should be sent by the application:
  -- the first time it should be NULL, and afterwards it should be
  -- populated by the value that was sent back from the SP.
)
AS
-- The number of rows affected by the different commands
-- does not interest the application, so turn NOCOUNT ON
SET NOCOUNT ON
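The body of this procedure is also truncated in the source. A sketch of how the pattern described above could continue (the dynamic SQL and the `##Paging_` table naming are my reconstruction for illustration, not the author's exact code):

```sql
-- Hypothetical reconstruction of the truncated body.
-- First call (@GUID IS NULL): generate a GUID, build a global temp
-- table named after it, and fill it with the full, ordered result set.
-- Later calls: reuse the existing table and read only one page.
DECLARE @TableName nvarchar(200), @sql nvarchar(max)
DECLARE @FirstRec int, @LastRec int
SELECT @FirstRec = (@Page - 1) * @RecsPerPage
SELECT @LastRec = (@Page * @RecsPerPage + 1)

IF @GUID IS NULL
BEGIN
  SET @GUID = NEWID()
  SET @TableName = N'##Paging_'
                 + REPLACE(CAST(@GUID AS nvarchar(36)), '-', '')
  SET @sql =
    N'SELECT ROW_NUMBER() OVER (ORDER BY m.Popularity DESC) AS RowNum,
             s.Title, m.Publisher, s.AuthorNames, l.LanguageName,
             m.FirstLine, m.CreationDate, m.PublishingDate, m.Popularity
      INTO ' + @TableName + N'
      FROM dbo.Articles m
           INNER JOIN dbo.ArticlesContent s ON s.ArticleID = m.ID
           LEFT OUTER JOIN dbo.Languages l ON l.ID = m.LanguageID'
  EXEC sp_executesql @sql
END
ELSE
  SET @TableName = N'##Paging_'
                 + REPLACE(CAST(@GUID AS nvarchar(36)), '-', '')

-- Return only the requested page from the persisted result set
SET @sql = N'SELECT * FROM ' + @TableName +
           N' WHERE RowNum > @FirstRec AND RowNum < @LastRec'
EXEC sp_executesql @sql,
     N'@FirstRec int, @LastRec int', @FirstRec, @LastRec

SET NOCOUNT OFF
```

The expensive query runs once; every later page fetch is a cheap range scan on `RowNum`, which matches the test results below, where Example #3 is costly only on its first execution.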
Test Cases
In an effort to gather average execution metrics for the stored procedures, I ran each of them three times and monitored their statistics (average reads, writes, CPU, duration, etc.) via SQL Server Profiler. In my tests, I assumed that a standard page contains around 50 rows, and I executed the stored procedures to return the following pages:
- Page 1
- Page 10
- Page 100
- Page 1,000
- Page 10,000
- Page 100,000
Here is a sample execution for page 1 and page 10:
-- Page 1

-- dbo.proc_Paging_TempTable
exec dbo.proc_Paging_TempTable 1, 50
go

-- dbo.proc_Paging_CTE
exec dbo.proc_Paging_CTE 1, 50
go

-- dbo.proc_Paging_GlobalTempTable
declare @GUID uniqueidentifier
exec dbo.proc_Paging_GlobalTempTable 1, 50, @GUID OUTPUT
SELECT @GUID
go

----------------------------------

-- Page 10

-- dbo.proc_Paging_TempTable
exec dbo.proc_Paging_TempTable 10, 50
go

-- dbo.proc_Paging_CTE
exec dbo.proc_Paging_CTE 10, 50
go

-- dbo.proc_Paging_GlobalTempTable
declare @GUID uniqueidentifier
-- Using the GUID from previous execution:
SET @GUID = '{6A71D5E2-691D-46B2-867C-8969DADB732B}'
exec dbo.proc_Paging_GlobalTempTable 10, 50, @GUID OUTPUT
go
Test results
Average Reads

Example # | Page 1 | Page 10 | Page 100 | Page 1000 | Page 10000 | Page 100000 | TOTAL
#1        | 543    | 6590    | 58175    | 621039    | 6409168    | 6168499     | 13264014
#2        | 365    | 3258    | 33128    | 354052    | 4227203    | 4286209     | 8904215
#3        | 154449 | 69      | 69       | 107       | 69         | 7           | 154770

Average Writes

Example # | Page 1 | Page 10 | Page 100 | Page 1000 | Page 10000 | Page 100000 | TOTAL
#1        | 3      | 25      | 260      | 2695      | 28333      | 29972       | 61288
#2        | 0      | 0       | 0        | 0         | 0          | 0           | 0
#3        | 27483  | 0       | 0        | 0         | 0          | 0           | 27483

Average CPU

Example # | Page 1 | Page 10 | Page 100 | Page 1000 | Page 10000 | Page 100000 | TOTAL
#1        | 15     | 16      | 172      | 4578      | 19422      | 14703       | 38906
#2        | 16     | 16      | 93       | 1797      | 6230       | 6547        | 14699
#3        | 15594  | 0       | 0        | 0         | 0          | 0           | 15594

Average Duration

Example # | Page 1 | Page 10 | Page 100 | Page 1000 | Page 10000 | Page 100000 | TOTAL
#1        | 376    | 2769    | 5816     | 58087     | 67599      | 42007       | 176654
#2        | 71     | 107     | 165      | 2738      | 22173      | 8103        | 33357
#3        | 77420  | 268     | 79       | 208       | 125        | 139         | 78239
Test Analysis
Based on the test cases and the sample data, the following generalities can be made:
- The CTE appears to outperform the local and global temporary tables in most cases.
- The CTE is probably the best option when your application or users are not paging much.
- Since the CTE was introduced in SQL Server 2005, this technique may be an improvement over SQL Server 2000 code that was ported directly to SQL Server 2005 or 2008 without being tuned.
- If your application is constantly paging or if your users are fetching the same data constantly, the global temporary table does offer performance enhancements after the first execution.
Next Steps
- Create your own samples in your environment and test them. Compare performance and choose the right solution for your application and users.
- If performance is still inadequate, consider other methods such as indexed views, table partitioning, summary tables, etc.
- Keep in mind that you can always use more than one method for different scenarios in your application.
- The above examples were designed for 'common' applications. If you have other ideas on how to better implement paging when performance is critical, please feel free to post your experiences in the MSSQLTips forum below.