snaayk
Starting Member
17 Posts
Posted - 2007-08-28 : 10:27:00
I have 3 tables that get inserts once a day for the current month and then once for the previous month. The data has a date period that gets deleted and then reinserted. It's roughly 500K, 75K, and 50K records for the daily process, and then about 900K, 150K, and 50K for the previous month. These are deleted and then inserted. I will be building select-intensive indexes on the three tables, since that's all the users will do, with the exception of the automated deletes/inserts. The question is: what's the best way to keep the indexes as efficient as possible with these updates?
- Drop and rebuild daily and monthly
- Reindex daily, but drop and rebuild monthly
- Something else...
One last thing: the monthly run is done at night and duration is not as important. The daily run starts at about 5am and should be done by 6:30am. I've never timed how long building an index (or indices) on a million-record table takes.
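As a rough sketch of timing an index build on a million-row table (the table, index, and column names below are hypothetical), something like this works on SQL 2000/2005:

    -- Time a nonclustered index build on the large table.
    -- dbo.FactMonthly, IX_FactMonthly_PeriodDate, and the columns are made-up names.
    DECLARE @t0 datetime;
    SET @t0 = GETDATE();

    CREATE NONCLUSTERED INDEX IX_FactMonthly_PeriodDate
        ON dbo.FactMonthly (PeriodDate, AccountID);

    PRINT 'Index build took '
        + CAST(DATEDIFF(second, @t0, GETDATE()) AS varchar(10)) + ' seconds';

Running that once against a restored copy of the table gives a concrete number to fit into the 5:00-6:30am window.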
Kristen
Test
22859 Posts
Posted - 2007-08-28 : 10:34:32
If you are deleting everything (or most of it) and re-inserting the same data (plus new data, less "stale" data), then I would drop the indexes first, do the delete/insert, and then re-create the indexes. That will a) make the delete/insert faster and b) give you fresh statistics etc. when the indexes are re-created.

Personally I would use a staging table and only delete/insert/update rows that are no longer needed, new, or changed - assuming that is a modest number of rows (if you are keeping a month's data then it would be, say, 1/30th of the data for each day's import).

Kristen
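A rough T-SQL sketch of the staging-table idea Kristen describes, assuming the feed is loaded into a staging table first; dbo.Staging_FactDaily, dbo.FactDaily, RowKey, and the other columns are hypothetical names:

    -- Only touch rows that are gone, new, or changed for the period being refreshed.
    -- Assumes dbo.Staging_FactDaily already holds the incoming data for that period.

    -- Delete rows that are no longer in the incoming data
    DELETE f
    FROM dbo.FactDaily AS f
    WHERE f.PeriodDate >= '20070801' AND f.PeriodDate < '20070901'
      AND NOT EXISTS (SELECT 1 FROM dbo.Staging_FactDaily AS s
                      WHERE s.RowKey = f.RowKey);

    -- Insert rows that are new
    INSERT INTO dbo.FactDaily (RowKey, PeriodDate, AccountID, Amount)
    SELECT s.RowKey, s.PeriodDate, s.AccountID, s.Amount
    FROM dbo.Staging_FactDaily AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.FactDaily AS f
                      WHERE f.RowKey = s.RowKey);

    -- Update rows whose values have changed
    UPDATE f
    SET f.Amount = s.Amount
    FROM dbo.FactDaily AS f
    JOIN dbo.Staging_FactDaily AS s ON s.RowKey = f.RowKey
    WHERE f.Amount <> s.Amount;

Because only a fraction of the rows move each day, the existing indexes stay largely intact and there is much less to rebuild.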
snaayk
Starting Member
17 Posts
Posted - 2007-08-28 : 13:28:28
The data can change for a rolling 15 days; the current month keeps updating until the 8th of the following month, when it stops. The process is to keep refreshing the current month's data and then update it one last time 8 days after the month ends. However, data for all the previous months continues to exist. As of right now, I have about 7 million rows (in the biggest table), ~1 million per month.
snaayk
Starting Member
17 Posts
Posted - 2007-08-28 : 14:32:44
I've just done some testing, and using DBCC SHOWCONTIG it does not seem that these loads are having a huge impact on the fragmentation of the tables. I'll continue to watch the tables and reindex or defrag as needed. Does this sound right?
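For reference, a sketch of that kind of check and the lighter versus heavier maintenance options, using the SQL 2000-era commands; the table and index names are hypothetical:

    -- Check fragmentation: Scan Density near 100% and Logical Scan Fragmentation
    -- under roughly 10% generally means no action is needed.
    DBCC SHOWCONTIG ('dbo.FactDaily') WITH TABLERESULTS, ALL_INDEXES;

    -- Lighter option: defragment a single index in place (table stays available)
    DBCC INDEXDEFRAG (0, 'dbo.FactDaily', 'IX_FactDaily_PeriodDate');

    -- Heavier option: rebuild all indexes on the table (also refreshes statistics)
    DBCC DBREINDEX ('dbo.FactDaily', '', 90);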
rmiao
Master Smack Fu Yak Hacker
7266 Posts
Posted - 2007-08-28 : 21:33:50
Is there a clustered index on the table? If not, rebuilding the indexes won't defragment the table itself.
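A quick way to check that, as a sketch (the table name is hypothetical):

    -- Returns 1 if the table has a clustered index, 0 if it is a heap
    SELECT OBJECTPROPERTY(OBJECT_ID('dbo.FactDaily'),
                          'TableHasClustIndex') AS HasClusteredIndex;

    -- On SQL 2005, sys.indexes shows the same thing:
    -- index_id 0 = heap, index_id 1 = clustered index
    SELECT name, type_desc
    FROM sys.indexes
    WHERE object_id = OBJECT_ID('dbo.FactDaily')
      AND index_id IN (0, 1);

If the big tables are heaps, rebuilding the nonclustered indexes reorganizes only those indexes; the table pages themselves stay where they are.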