bsethi24
Starting Member
25 Posts
Posted - 2012-12-06 : 12:08:04
Dear All, Hi! I recently worked on a data warehouse server that has performance issues. When I dug into the tables in each database, I found that the frequently used tables have more than 5 billion records. What steps should I take so that fetching records from such tables is optimized? All of these tables already have indexes, and statistics are updated on a daily basis. Please guide...
nigelrivett
Master Smack Fu Yak Hacker
3385 Posts
Posted - 2012-12-06 : 12:25:26
Look at archiving data and splitting the queries up into separate steps. Also look at creating aggregate tables. If the tables are indexed well, performance should depend more on the amount of data that needs to be retrieved than on the size of the table, and you should never need to return thousands of rows.
==========================================
Cursors are useful if you don't know SQL. SSIS can be used in a similar way.
Beer is not cold and it isn't fizzy.
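
To make the aggregate-table suggestion concrete, here is a minimal sketch of a nightly summary table. It assumes a hypothetical fact table dbo.FactSales(SaleDate, ProductID, Amount); all object and column names are illustrative only and would need to match your own schema.

-- Summary table, one row per day and product, so reports hit a small table
-- instead of scanning billions of fact rows.
IF OBJECT_ID('dbo.FactSales_DailySummary') IS NULL
BEGIN
    CREATE TABLE dbo.FactSales_DailySummary
    (
        SaleDate    date           NOT NULL,
        ProductID   int            NOT NULL,
        TotalAmount decimal(18, 2) NOT NULL,
        RowCnt      bigint         NOT NULL,
        CONSTRAINT PK_FactSales_DailySummary PRIMARY KEY (SaleDate, ProductID)
    );
END;

-- Nightly load: rebuild only yesterday's slice so each run touches a small date range.
DELETE FROM dbo.FactSales_DailySummary
WHERE SaleDate = CAST(DATEADD(day, -1, GETDATE()) AS date);

INSERT INTO dbo.FactSales_DailySummary (SaleDate, ProductID, TotalAmount, RowCnt)
SELECT CAST(SaleDate AS date), ProductID, SUM(Amount), COUNT_BIG(*)
FROM dbo.FactSales
WHERE SaleDate >= CAST(DATEADD(day, -1, GETDATE()) AS date)
  AND SaleDate <  CAST(GETDATE() AS date)
GROUP BY CAST(SaleDate AS date), ProductID;

Reports and dashboards can then read from the summary table and only fall back to the base fact table when row-level detail is genuinely needed.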
sodeep
Master Smack Fu Yak Hacker
7174 Posts
Posted - 2012-12-06 : 13:14:46
Have you considered using Table Partitioning?
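
For reference, a minimal sketch of what table partitioning could look like here, assuming a hypothetical fact table keyed by an OrderDate column; the partition function, scheme, boundary values, and table names are illustrative only.

-- Monthly partition function and scheme (add more boundaries as months accrue).
CREATE PARTITION FUNCTION pfOrderDateMonthly (date)
AS RANGE RIGHT FOR VALUES ('2012-01-01', '2012-02-01', '2012-03-01', '2012-04-01');

CREATE PARTITION SCHEME psOrderDateMonthly
AS PARTITION pfOrderDateMonthly ALL TO ([PRIMARY]);

-- Table created on the scheme; the partitioning column must be part of the key.
CREATE TABLE dbo.FactOrders
(
    OrderID   bigint         NOT NULL,
    OrderDate date           NOT NULL,
    Amount    decimal(18, 2) NOT NULL,
    CONSTRAINT PK_FactOrders PRIMARY KEY (OrderDate, OrderID)
) ON psOrderDateMonthly (OrderDate);

Partitioning on its own does not speed up poorly filtered queries, but when queries filter on the partitioning column SQL Server can eliminate the partitions it does not need, and old months can be archived with a fast SWITCH instead of a huge DELETE.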