coolerbob
Aged Yak Warrior
841 Posts

Posted - 2007-05-09 : 10:46:37
I want to run Profiler traces, PerfMon checks and whatever else to see what hits the server the hardest, in order to work out where I can make performance improvements. We don't have any massive problems, other than more timeouts than I would expect from web users now and then when they do pretty simple free-text searches. I just want to see if I can lighten the load on the server in general.

Where does a DBA get the most ROI? I don't want to drift into the realm of diminishing returns, where I am making changes that have no noticeable impact.

What should I start measuring first, and how do you measure it? (E.g. if I should monitor disk writes in Profiler, how many writes are too high for a statement? 100? If so, is one 100-write statement per hour OK but one 100-write statement per second not OK?)

If there is already an article for this, my apologies. I'd be grateful if you can point me to it. Is http://www.sql-server-performance.com/ a sufficient one-stop shop for this question? If so, have you found some articles there to be more useful than others?
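For example, for the writes question, is something like this (a sketch using sys.dm_exec_query_stats, which ships with SQL Server 2005; the figures are cumulative since each plan entered the cache) a sensible place to start?

```sql
-- A sketch: top cached statements by total logical writes, pulled from the
-- sys.dm_exec_query_stats DMV (SQL Server 2005 and later). Counts are
-- cumulative since each plan was cached, not per-hour rates.
SELECT TOP 10
    qs.total_logical_writes,
    qs.execution_count,
    qs.total_logical_writes / qs.execution_count AS avg_writes_per_exec,
    SUBSTRING(st.text,
              (qs.statement_start_offset / 2) + 1,
              ((CASE qs.statement_end_offset
                    WHEN -1 THEN DATALENGTH(st.text)
                    ELSE qs.statement_end_offset
                END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_writes DESC;
```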
Kristen
Test
22859 Posts

Posted - 2007-05-09 : 13:08:43
In terms of tuning queries (rather than tuning hardware or SQL Server config settings), this is what I do:

Run SQL Profiler and capture "pretty much everything" (for a single database) for a decent while - perhaps an hour, but certainly during the busy period, when the duration of queries will be exaggerated because the server is inherently busy. I save that to a database table.

Then I run a bunch of queries on that to work out the "averagely worst offenders" - basically weighting to take into account frequency of calling, but also higher elapsed duration (a sketch of the idea is below). My "Special Clever Scripts" also allow me to aggregate queries which are the same but use some sort of string handling to dynamically create the WHERE clause, as is common with ad-hoc query type applications.

Then I review "average" SQL syntax from that. I run the examples I have collected and examine the query plans and the I/O.

I review the indexes available, and whether they are best for the queries. I experiment with changing the indexes and see how that affects the query plan and I/O.

Separately, I have a bunch of "reverse engineer" queries that give me a heads-up on poorly designed tables: tables with no PKs, but also columns that are in practice unique but don't have any constraint, columns that are defined much wider than the data they contain, columns that contain few distinct values that should perhaps be normalised, and so on (one such check is sketched below).

As a separate issue I also do some PerfMon stuff. I do this initially so I have a baseline to compare against, and to look for any obviously poorly performing counters. But that's not something that can easily be defined; it's more the experience of having seen loads of poorly built systems and recognising what's "fishy"!

I also have some scripts that compare config settings against the out-of-box defaults, and against my own preferred settings (also sketched below). I question any that are out of whack.

Another first job I do is to create a build script for the database. Then I can generate another one later, after implementing various things, and DIFF the two for changes. Sometimes it's handy for other things too - like wanting to change a PK from nonclustered to clustered and discovering that there is Free Text / Index Server stuff dependent on the PK ... the original script then helps with making sure that I managed to put everything back in place properly after re-creating the PK.

And so on and so forth ...

As you are UK-based, this is something I can do for you as a service - although I appreciate you haven't got any problems, per se. If that's of interest, please feel free to send me a Private Message.

Kristen
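PS: a rough sketch of the "averagely worst offenders" aggregation, assuming (hypothetically) that the trace was saved to a table called dbo.TraceData with the standard columns Profiler's "save to table" produces:

```sql
-- Aggregate a saved Profiler trace by statement text, weighting both call
-- frequency and elapsed time. Table and column names are what Profiler's
-- "save to table" creates; adjust to whatever you used.
SELECT TOP 20
    CAST(TextData AS NVARCHAR(400)) AS QueryText,  -- TextData is ntext, so cast before grouping
    COUNT(*)      AS Executions,
    SUM(Duration) AS TotalDuration,  -- microseconds when SQL 2005 saves to a table
    AVG(Duration) AS AvgDuration,
    SUM(Reads)    AS TotalReads,
    SUM(Writes)   AS TotalWrites
FROM dbo.TraceData
WHERE EventClass IN (10, 12)  -- RPC:Completed and SQL:BatchCompleted
GROUP BY CAST(TextData AS NVARCHAR(400))
ORDER BY SUM(Duration) DESC;
```

The truncation to 400 characters is a crude stand-in for the string handling that folds together ad-hoc queries differing only in their WHERE clauses; the real scripts do rather more than that.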
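One of the simpler "reverse engineer" checks - user tables with no primary key - looks something like this (SQL Server 2005 catalog views; a sketch of one check, not the full set of scripts):

```sql
-- List user tables that have no PRIMARY KEY constraint.
SELECT s.name AS SchemaName, t.name AS TableName
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE NOT EXISTS (SELECT *
                  FROM sys.key_constraints AS kc
                  WHERE kc.parent_object_id = t.object_id
                    AND kc.type = 'PK')
ORDER BY s.name, t.name;
```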
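And the config comparison amounts to checking sys.configurations against a list of known defaults - a minimal sketch with a few illustrative entries only (the full default list is much longer):

```sql
-- Flag configuration values that differ from their shipped defaults.
-- The defaults listed here are illustrative, not the complete set.
SELECT c.name, c.value_in_use, d.default_value
FROM sys.configurations AS c
JOIN (SELECT 'max server memory (MB)' AS name, CAST(2147483647 AS BIGINT) AS default_value
      UNION ALL SELECT 'min server memory (MB)', 0
      UNION ALL SELECT 'max degree of parallelism', 0) AS d
  ON d.name = c.name
WHERE CAST(c.value_in_use AS BIGINT) <> d.default_value;
```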
mccoigk
Starting Member
12 Posts

Posted - 2007-05-10 : 11:00:09
For what it's worth - if you're looking for the cheapest/easiest/quickest ROI as far as your time is concerned, I have found that the new Performance Dashboard introduced for SP2 is very helpful for taking a quick first-glance look at what SQL Server thinks may be a problem. And it's free.
Kristen
Test
22859 Posts

Posted - 2007-05-10 : 13:28:38
I'd better have a look at that then ... <fx:ToddlesOff>

Hopefully it enhances my consultancy revenue by getting potential clients to the point where they realise there is a problem, but not to the point of giving them a perfect solution!

Kristen
coolerbob
Aged Yak Warrior
841 Posts

Posted - 2007-05-24 : 11:11:43
For the greater good, allow me to share my experience.

For months our DB server's 4 processors have been sitting at 40%-50%, then jumping up to 90%+ and sitting there for a few seconds before dropping back down to about 40%.

Yesterday I made two changes, and now it's spiking nicely - dropping as far down as 5%. The two changes were:

1. Switch anti-virus off, and
2. (on a 16 GB RAM server) change Maximum Server Memory down from the default of 2,147,483,647 MB to 15 GB.

swe-e-e-e-t
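For anyone wanting to make the same memory change, it boils down to something like this (a sketch; 15 GB is 15360 when expressed in MB for sp_configure):

```sql
-- Cap SQL Server's memory on a 16 GB box, leaving roughly 1 GB for the OS.
-- 'max server memory (MB)' is an advanced option, so expose it first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 15360;  -- 15 GB
RECONFIGURE;
```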
coolerbob
Aged Yak Warrior
841 Posts

Posted - 2007-05-25 : 07:53:38
I found this calculation in BOL quite useful too, for PerfMon (SQLServer:Buffer Manager):

([Stolen pages] + [Reserved pages]) / 100 in MB = (if I understand correctly) current RAM usage (in my case 4.88 gig)
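The same counters can also be pulled with a query rather than PerfMon - a sketch against sys.dm_os_performance_counters (note that SQL Server pages are 8 KB, so dividing the page count by 128, rather than 100, converts pages to MB exactly):

```sql
-- Current memory usage in MB from the Buffer Manager counters.
-- Pages are 8 KB each, hence / 128.0 to convert a page count to MB.
SELECT SUM(cntr_value) / 128.0 AS buffer_pool_mb
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Buffer Manager%'
  AND counter_name IN ('Stolen pages', 'Reserved pages');
```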