Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
odykob
Starting Member
1 Post

Posted - 2004-04-15 : 16:24:58
We had a database optimization run on our SQL 2000 SP3a server. It was run against the database that houses all the data for an application we use for reservations and contact management. We use MSQRY and pivot tables in Excel, as well as Crystal, to get reports out of the data. Since the optimization job, the reports have been unbearably slow, if they don't lock up entirely. I tried creating a new pivot table and it worked fine, but all our existing ones won't. The optimization was set at 50% (which I didn't like), as directed by the tech support for our vendor. I re-ran the optimization at 10% after all the trouble started, but that didn't help. Please help if you have any ideas.
tkizer
Almighty SQL Goddess
38200 Posts

Posted - 2004-04-15 : 16:57:44
When you say 10% and 50%, do you mean the FILLFACTOR? Both values are really low; typically the value is around 80. And what optimization did you run? DBCC DBREINDEX? Have the statistics been updated, or do you even have statistics? Are the autocreate and autoupdate statistics options set for the database?

Tara
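For reference, a minimal T-SQL sketch of the checks Tara describes, valid on SQL Server 2000 (the database and table names are placeholders, not from the thread):

```sql
-- Check whether auto create / auto update statistics are enabled (1 = on)
SELECT DATABASEPROPERTYEX('YourDatabase', 'IsAutoCreateStatistics') AS AutoCreateStats,
       DATABASEPROPERTYEX('YourDatabase', 'IsAutoUpdateStatistics') AS AutoUpdateStats

-- Rebuild all indexes on a table with a more typical fill factor of 80
-- (second argument '' means every index on the table)
DBCC DBREINDEX ('YourTable', '', 80)

-- Refresh statistics after the rebuild
UPDATE STATISTICS YourTable WITH FULLSCAN
```

A fill factor of 10 or 50 leaves pages 90% or 50% empty, which inflates the number of pages every scan must read; that alone could explain reports slowing down after the "optimization".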
ValterBorges
Master Smack Fu Yak Hacker
1429 Posts

Posted - 2004-04-15 : 18:13:47
How much data are we talking about? What does your infrastructure look like? What is the transactional volume of the machine?

One possible solution is to do some data warehousing and build OLAP cubes during the night on a different machine, so that all the reports and analytics run against your OLAP system instead of your OLTP system.

Leave fillfactor at 0%. If you go with the warehousing approach, your OLTP system would probably need fewer indexes and run faster. Run 100% full statistics with no recompute if possible, partition your data at the table level, and if possible distribute it across filegroups and distribute the filegroups to different physical I/O.

If you decide to go with the warehousing approach, make sure you read up on dimensional modeling and ETL, and have a good net-change process in place.
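Valter's index and statistics advice can be sketched in T-SQL as follows (table name is a placeholder; 0 tells SQL Server to use the index's existing default fill factor, which normally packs pages full):

```sql
-- Rebuild all indexes using the default fill factor (0 = existing/default setting)
DBCC DBREINDEX ('YourTable', '', 0)

-- Build statistics from a 100% sample and freeze them (NORECOMPUTE)
-- so automatic statistics updates will not overwrite the full-scan stats
UPDATE STATISTICS YourTable WITH FULLSCAN, NORECOMPUTE
```

Note the trade-off with NORECOMPUTE: frozen statistics go stale as data changes, so you would need a scheduled job to re-run the full-scan update.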