Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
Lady
Starting Member
32 Posts

Posted - 2002-11-27 : 07:50:11

Please help me. I need a job that runs a script at whatever time SQL Server is under the least load. I would define 5 users (for example) as the maximum allowed for this job to run.

Edited by - Lady on 11/27/2002 09:27:27
nr
SQLTeam MVY
12543 Posts

Posted - 2002-11-27 : 09:04:21
You mean you don't know when during the day this will be? How long does the script take - do you stop people logging on while it runs?

You can add a step which just queries sysprocesses to find how many logged-on users there are before continuing. Put it in a loop with a pause.

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
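A minimal sketch of the loop-with-a-pause step nr describes, assuming SQL Server 2000's `master..sysprocesses` and using the 5-user threshold from the question (the threshold and delay are illustrative, not prescribed):

```sql
-- Extra job step: wait until the number of user connections drops to 5
-- or fewer before the cleanup step runs. On SQL Server 2000, spids <= 50
-- are reserved for system processes, so we count only spid > 50.
DECLARE @users int

SELECT @users = COUNT(*)
FROM master..sysprocesses
WHERE spid > 50

WHILE @users > 5
BEGIN
    WAITFOR DELAY '00:05:00'   -- pause 5 minutes, then check again
    SELECT @users = COUNT(*)
    FROM master..sysprocesses
    WHERE spid > 50
END
-- When control reaches here, at most 5 users are connected and the
-- next job step (the cleanup script) can proceed.
```

Note this only checks the load at one instant; as robvolk points out below a user could still start a heavy query right after the check passes.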
Lady
Starting Member
32 Posts

Posted - 2002-11-27 : 09:22:29
- I don't know the time, because the project is used all around the world.
- The script cleans up trash and takes about 5 minutes.

Should I schedule the script every hour and check the number of users? What about the sp_add_jobschedule parameter freq_type = 128 (run when the computer is idle)? Does it run when the computer is logged off?

Edited by - Lady on 11/27/2002 09:23:51
Edited by - Lady on 11/27/2002 09:48:30
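For reference, a sketch of the schedule Lady mentions (the job name here is hypothetical). SQL Server Agent's "idle" condition is defined by a CPU-usage threshold in Agent's properties, not by whether anyone is logged off:

```sql
-- Attach a "run when computer is idle" schedule to an existing job.
-- freq_type = 128 means SQL Server Agent fires the job whenever the
-- CPU idle condition (configured in Agent properties) is met.
-- 'CleanupTrash' is a hypothetical job name for illustration.
EXEC msdb.dbo.sp_add_jobschedule
    @job_name  = N'CleanupTrash',
    @name      = N'RunWhenIdle',
    @enabled   = 1,
    @freq_type = 128   -- run when the computer is idle
```

On a server that is busy around the clock, the idle condition may rarely be met, which is one reason the profiling approach below may work better.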
robvolk
Most Valuable Yak
15732 Posts

Posted - 2002-11-27 : 09:33:35
It would probably be more effective to profile the usage levels on your SQL Server over a 24-hour period and look for consistent times when usage is low. You can't have an immediate process "know" that this moment, right now, is the lowest usage; it has to compare it to something. And there's no way for it to know that someone just kicked off a monster query right after it started.

How about making the trash cleanup a series of smaller operations, instead of one big one? If you have to delete 1 million rows, for example, how about splitting it up into 20 operations of 50,000 rows each? That way it can run once an hour and still get the same amount of work done in a day. Something like that might even run in 10 seconds or less, and when you add them all up it would be faster in total than one big job. (I've done quite a bit of operations this way and it worked out much better than one big job.)
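The batching idea above can be sketched as follows. Table and column names are hypothetical; `SET ROWCOUNT` is the SQL Server 2000 idiom (later versions would use `DELETE TOP (50000)` instead):

```sql
-- Delete in batches of 50,000 rows instead of one huge statement,
-- so each batch holds locks and grows the log only briefly.
-- dbo.TrashTable and IsTrash are placeholder names for illustration.
SET ROWCOUNT 50000          -- cap each DELETE at 50,000 rows

DECLARE @rows int
SET @rows = 1

WHILE @rows > 0
BEGIN
    DELETE FROM dbo.TrashTable
    WHERE IsTrash = 1        -- deletes at most 50,000 matching rows
    SET @rows = @@ROWCOUNT   -- 0 when nothing is left to delete
END

SET ROWCOUNT 0              -- restore the default (no limit)
```

Each small batch commits quickly, so concurrent users are blocked far less than by one million-row delete.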
Lady
Starting Member
32 Posts

Posted - 2002-11-27 : 09:45:37
I can't split the cleanup into small steps - it's a cascading delete of just one rather big object. Thanks a lot for the solution with profiling :)