Author |
Topic |
RYates
Starting Member
2 Posts |
Posted - 2012-06-26 : 05:51:45
|
Hi all

OK, I'm a newbie to SQL and an "accidental DBA" (love that term). We have several SQL databases, and we've never had anyone create any maintenance tasks etc., so I've been researching it for a while.

I've run DBCC SHOWCONTIG on one database, and it seemed to show it was heavily fragmented, so after some research I used Ola Hallengren's "IndexOptimize" script on it (took about 3 minutes to complete). I then re-ran DBCC SHOWCONTIG and can see a definite change. For example, here's a before and after for a single table:

Before:
Table: 'zZENObject' (1317579732); index ID: 1, database ID: 5
TABLE level scan performed.
- Pages Scanned................................: 2947
- Extents Scanned..............................: 375
- Extent Switches..............................: 2945
- Avg. Pages per Extent........................: 7.9
- Scan Density [Best Count:Actual Count].......: 12.53% [369:2946]
- Logical Scan Fragmentation ..................: 99.22%
- Extent Scan Fragmentation ...................: 97.33%
- Avg. Bytes Free per Page.....................: 2730.1
- Avg. Page Density (full).....................: 66.27%

After:
DBCC SHOWCONTIG scanning 'zZENObject' table...
Table: 'zZENObject' (1317579732); index ID: 1, database ID: 5
TABLE level scan performed.
- Pages Scanned................................: 1994
- Extents Scanned..............................: 250
- Extent Switches..............................: 249
- Avg. Pages per Extent........................: 8.0
- Scan Density [Best Count:Actual Count].......: 100.00% [250:250]
- Logical Scan Fragmentation ..................: 0.05%
- Extent Scan Fragmentation ...................: 58.00%
- Avg. Bytes Free per Page.....................: 175.4
- Avg. Page Density (full).....................: 97.83%

So, if I'm understanding this right, BEFORE I ran the script there were 2947 pages of info in the table, and AFTER there are 1994 pages. So I'm guessing it's reduced the page count by removing the empty space? Sound right?

Thanks guys.
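[Editor's note: DBCC SHOWCONTIG still works but is deprecated; on SQL Server 2005 and later the same information is exposed through sys.dm_db_index_physical_stats. A minimal sketch, run from within the database in question - the table name is taken from the output above, the dbo schema and the LIMITED scan mode are assumptions:]

    -- Check fragmentation and page count for one table.
    -- 'LIMITED' only scans the parent level of the b-tree, so it is cheap;
    -- use 'SAMPLED' or 'DETAILED' if you also want page-fullness figures.
    SELECT  i.name                            AS index_name,
            ips.index_type_desc,
            ips.avg_fragmentation_in_percent, -- roughly the Logical Scan Fragmentation figure
            ips.page_count                    -- roughly the Pages Scanned figure
    FROM    sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID(N'dbo.zZENObject'),
                                           NULL, NULL, 'LIMITED') AS ips
    JOIN    sys.indexes AS i
      ON    i.object_id = ips.object_id
     AND    i.index_id  = ips.index_id;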
|
nigelrivett
Master Smack Fu Yak Hacker
3385 Posts |
Posted - 2012-06-26 : 06:37:35
|
Yes, and the indexed values should be in a more contiguous order for scanning.

Have a look at the extent switches and bytes free per page.

You might want to look at the design of the system to see why it got that fragmented in the first place - could just be an initial load, but maybe the structure doesn't suit the way data is inserted/updated. Not saying it needs to be changed, but the less maintenance you have to do the better.

==========================================
Cursors are useful if you don't know sql.
SSIS can be used in a similar way.
Beer is not cold and it isn't fizzy.
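[Editor's note: to put numbers on the extent-switch point - Scan Density is Best Count (pages / 8, rounded up) divided by Actual Count (extent switches + 1). A purely illustrative bit of arithmetic, using the figures from the first post and the standard DBCC SHOWCONTIG definitions:]

    -- Reproduce the before/after Scan Density figures from the first post.
    SELECT  CEILING(2947 / 8.0)                            AS ideal_extents_before, -- 369
            CAST(369.0 / (2945 + 1) * 100 AS DECIMAL(5,2)) AS scan_density_before,  -- 12.53
            CEILING(1994 / 8.0)                            AS ideal_extents_after,  -- 250
            CAST(250.0 / (249 + 1) * 100 AS DECIMAL(5,2))  AS scan_density_after;   -- 100.00

[The bytes-free-per-page change tells the same story: roughly 2730 free bytes per page (about 66% full) before the rebuild versus 175 (about 98% full) after, which is why the table now fits in fewer pages.]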
|
|
RYates
Starting Member
2 Posts |
Posted - 2012-06-26 : 07:04:19
|
Thanks for the reply, mate.

Cool, good to hear I'm on the right track with my understanding :)

As for the way the data is entered into the system, well, this is a database for a Novell ZENworks Configuration Management system (a desktop management system), so I have no control over how it works.

Thanks again.
|
|