Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
Sitka
Aged Yak Warrior
571 Posts
Posted - 2003-11-07 : 10:13:49
Hi, I need to capture/audit some metrics. What we have is a FoxPro-based ERP system that evolved to use MSSQL as a datastore. I say datastore in the most explicit sense because it is not a database: there are no primary keys defined in the MSSQL DBMS product on the ERP tables that reside there. All database relationships, rules, and structure live in the FoxPro application. FoxPro accesses the SQL 'store' through about 500 views to the datastore, which is about 2 gigabytes.

Hopefully some of you have a bit of FoxPro experience and understand that it is a flat-file-based DBMS. It also seems that all the programming 'ideas' passed through the FoxPro world were CURSOR based. So in essence the application runs on a backbone of building temporary files which represent the cursors. The cursors are for the most part generated with dynamic strings and pulled down client side, then further processed on the client. If they are not complete for all the processing that needs to be done, the views are accessed again for more data, whether to build more cursors or just direct selects. Bound controls and dynamic-string inserts/updates are everywhere.

Well, I have come to find out that accountancy types really only do any real work on Friday, and running this FoxPro nightmare can just peg the server (80% or greater) with only 5-10 users really hitting it. I am constantly defending myself to those who don't understand such things: the bottleneck the users experience comes from the poor programming/concepts of the FoxPro application and their weak client machines. While some of my more advanced reporting does indeed hit hard at times, that is merely because real time is very important here, and calculating gross margin or variances takes a lot of work when sales orders have year-long build times, 10 thousand hours of labour, and hundreds of purchases. So I need to capture which application uses up all our power.

Advice? I'm running perfmon on my workstation, set up with an hour-long view and a 36-second sample time, but I really need to break this down into per-process info. The FoxPro app operates under a single user. My applications operate under a dedicated login, mainly from a web server, but there are some other client-installed apps. Right now the SQL Server set of performance counters isn't available to me. I'm going to reboot the server tomorrow night to run unlodctr.exe MSSQLServer followed by lodctr.exe <SQL Server path>\binn\sqlctr.ini, and explore how I can log those counters. So, advice? It's been a while since I've looked at Profiler, but I believe I can build traces showing query time by individual user?
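For anyone following along, the counter re-registration sequence mentioned above, spelled out (Windows command prompt; the `<SQL Server path>` placeholder depends on your install directory, so substitute your own):

```shell
rem Unload the existing SQL Server performance counter definitions.
unlodctr.exe MSSQLServer

rem Reload them from the counter ini shipped in the binn directory.
lodctr.exe "<SQL Server path>\binn\sqlctr.ini"
```

After the reload (and the planned reboot), the SQLServer: counter objects should show up again in perfmon's Add Counters dialog.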
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2003-11-07 : 12:24:46
Yes, if you just use the default template in SQL Profiler, then you'll get what you want. I typically add a few events and also filter my results. Send your results to a table. Then:

SELECT Duration, TextData
FROM SomeTable
WHERE Duration > 1000
ORDER BY Duration DESC

The above query will give you the queries that are taking over 1 second. Just modify the WHERE clause to what you need.

Tara
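Since the original question asked for query time by individual user, the same trace table can be rolled up on LoginName. A rough sketch, assuming the trace was saved to a table named SomeTable with the default template's LoginName and Duration columns (on SQL Server 2000, Duration is reported in milliseconds):

```sql
-- Total, average, and count of statements per login, busiest first.
SELECT LoginName,
       COUNT(*)      AS QueryCount,
       SUM(Duration) AS TotalMs,
       AVG(Duration) AS AvgMs
FROM SomeTable
GROUP BY LoginName
ORDER BY TotalMs DESC
```

The login at the top of that list is the one to show management.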
Sitka
Aged Yak Warrior
571 Posts
Posted - 2003-11-07 : 13:12:29
Hmm... "over a second" is going to exclude a lot of load. Remember the cursors here:

SELECT checkno from gldistribution where checkno = '567800'
GO
SELECT checkno from gldistribution where checkno = '567801'
GO
SELECT checkno from gldistribution where checkno = '567802'
GO
SELECT checkno from gldistribution where checkno = '567803'
GO

Profiler will return 35,000 of these statements. This is the kind of stuff that gets generated just to browse a range of checkno's in a GUI.

I'm getting the hang of performance logging in perfmon, so I hope the sqlctr.ini set of counters will be a little more specific. Used together with some nice "send your results to a table" profiles that I can aggregate on, plus a period of observation at the Process Info page during high usage, it should all work in conjunction to prove my point and get some folks off my back.

It all seems clear to me, but I need a thing that says: "FoxPro application uses 80% of server resources during peak time. 95% of that is due to slow client machines and poor implementation." Big red letters and middle finger.
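A per-statement Duration filter will indeed hide this pattern: each SELECT is cheap, but there are tens of thousands of them. One way to surface the cumulative cost is to group near-identical statements by their leading text. A sketch, assuming the trace is in SomeTable and that truncating TextData (an ntext column, hence the CONVERT) to its first 60 characters is enough to collapse these generated statements together:

```sql
-- Group near-identical generated statements by their leading text,
-- so 35,000 cheap single-row SELECTs show up as one expensive line.
SELECT CONVERT(nvarchar(60), TextData) AS QueryPrefix,
       COUNT(*)      AS Executions,
       SUM(Duration) AS TotalMs
FROM SomeTable
GROUP BY CONVERT(nvarchar(60), TextData)
ORDER BY TotalMs DESC
```

With literals embedded in each statement the prefixes will still differ where the checkno appears, so pick a truncation length that ends before the literal, or live with a per-value breakdown and sum it by hand.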
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2003-11-07 : 13:24:58
I exclude queries that take under 1 second because those queries are performing fine, so I don't need to waste my time optimizing them.

Tara
Sitka
Aged Yak Warrior
571 Posts
Posted - 2003-11-07 : 14:27:17
Oh, I see... Profiler actually returns/displays durations of less than a second as 0; I mixed that up with your query in a kind of mixed-metaphors style. Must just be column resolution in the GUI. This is good news. Three more different things to do today and then I can get back to Profiler. It might be late by then, but oh well. Once I get a nice result set I'll have to thank you for giving me the finger so that I may pass it on to others. hee hee.
MichaelP
Jedi Yak
2489 Posts
Posted - 2003-11-07 : 14:47:03
What you could do is log the Profiler trace to a table and see how many queries were executed. That might help you factor in the cursor thing.

Michael

<Yoda>Use the Search page you must. Find the answer you will.</Yoda>
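Counting by time bucket also helps line the query volume up against the perfmon CPU peaks. A sketch, again assuming the trace lives in SomeTable and that the trace captured the StartTime column (it is in the default template):

```sql
-- Statements per hour, to correlate query volume with the Friday peaks.
-- CONVERT style 120 yields 'yyyy-mm-dd hh:mi:ss'; varchar(13) keeps
-- just 'yyyy-mm-dd hh', which buckets the rows by hour.
SELECT CONVERT(varchar(13), StartTime, 120) AS TraceHour,
       COUNT(*) AS Statements
FROM SomeTable
GROUP BY CONVERT(varchar(13), StartTime, 120)
ORDER BY TraceHour
```

If the statement count spikes in the same hours that the server pegs at 80%+, that is the chart for the "big red letters".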