debug2k (Starting Member, 18 Posts)
Posted - 2004-05-20 : 11:08:20
Hello friends, I need your valuable inputs. I have a development database server in a very bad condition, so I need to identify the databases that have not been accessed in the past 5 months. Is there any way I can do this? Is there any way to find out the last accessed date for a database, or for any table in a database?

Regards,
Deepak Uniyal
|
|
raymondpeacock (Constraint Violating Yak Guru, 367 Posts)
Posted - 2004-05-20 : 11:18:03
No, SQL Server does not store this type of information. The best you could get is the date the databases and their objects were created, but that doesn't tell you when they were modified (unless you had a timestamp column in the tables) or viewed.

If you are looking to delete the unused databases, I suggest you set up a process to check the sysprocesses table in master and start recording the databases in use via the DBID column. Over a period of time this may give you an idea of which databases are not in use.

Raymond
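A minimal sketch of the kind of recording job Raymond describes, for SQL Server 2000 (the table name `DbUsageLog` is made up for illustration):

```sql
-- One-time setup: a log table for observed database usage (name is made up)
CREATE TABLE dbo.DbUsageLog (
    dbid    smallint NOT NULL,
    dbname  sysname  NOT NULL,
    seen_at datetime NOT NULL DEFAULT GETDATE()
)
GO

-- Run this on a schedule (e.g. every few minutes from a SQL Agent job):
-- record every database that currently has a connection in it.
INSERT INTO dbo.DbUsageLog (dbid, dbname, seen_at)
SELECT DISTINCT p.dbid, DB_NAME(p.dbid), GETDATE()
FROM master.dbo.sysprocesses p
WHERE p.dbid > 0
```

After the job has run for a while, databases that never appear in the log are the candidates for removal, e.g. `SELECT name FROM master.dbo.sysdatabases WHERE dbid NOT IN (SELECT dbid FROM dbo.DbUsageLog)`. Note the polling approach can miss short-lived connections that come and go between samples.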

tkizer (Almighty SQL Goddess, 38200 Posts)
Posted - 2004-05-20 : 12:26:32
You could also set up a trace to capture which databases are in use.

Tara

Kristen (Test, 22859 Posts)
Posted - 2004-05-22 : 04:53:17
This is a rather poor solution, but it might work OK for a DEV server. We set our "unused" databases to OFFLINE. In the meantime, if something breaks we can just set the database back to ONLINE with no ill effect. Anything still set to OFFLINE after a couple of weeks we drop.

This will likely start breaking procedures that try to process all databases (possibly including any Wizard-written Maintenance Plans - why can't that stuff NOT try to do Minor Repairs on read-only databases and the like? </RANT>)

I suppose you could DETACH databases thought to be unused - that wouldn't break any all-databases type processes. And if the original files were left in place, they could easily be reinstated with an ATTACH. Again, just delete the detached files after a couple of weeks if they remain not-needed. There is a slight risk that someone would create a new database with the name of a database in the process of being pensioned off.

Kristen
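Both approaches Kristen describes are one-liners in SQL Server 2000; a sketch, with `SuspectDb` and the file paths as placeholder examples:

```sql
-- Take a suspected-unused database offline (easily reversible):
ALTER DATABASE SuspectDb SET OFFLINE
-- ...and back again, if somebody screams:
ALTER DATABASE SuspectDb SET ONLINE

-- Or detach it entirely, leaving the .mdf/.ldf files on disk:
EXEC sp_detach_db 'SuspectDb'
-- Reattach later if it turns out to be needed (paths are examples):
EXEC sp_attach_db 'SuspectDb',
    'C:\Data\SuspectDb.mdf',
    'C:\Data\SuspectDb_log.ldf'
```

OFFLINE keeps the database registered in sysdatabases (so all-databases scripts may still trip over it), while DETACH removes it from the server entirely but keeps the files recoverable.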

derrickleggett (Pointy Haired Yak DBA, 4184 Posts)
Posted - 2004-05-22 : 11:10:45
??? And WHY is this a better solution than setting up a trace? That's a good way to get fired where I work. They really don't care what the DBA's excuse is for something like this. The traces in 2000 don't really have that much overhead. Set the trace up to record on another machine (like your desktop). Trace all user connections on the server and which database, hostname, application name, and client process ID they use.

MeanOldDBA
derrickleggett@hotmail.com
When life gives you a lemon, fire the DBA.

Kristen (Test, 22859 Posts)
Posted - 2004-05-23 : 05:29:36
quote: And WHY is this a better solution than setting up a trace?

It ain't! Which is why I said "This is a rather poor solution".

quote: That's a good way to get fired where I work

Indeedie, but you are a "real DBA" - in the sense that you do this for a living, look after lots of data, and know what you are doing, and MSSQL intimately. We are a small outfit. I only know what I know ;-) and I suspect that applies to debug2k too.

I vaguely understand what you mean by "setting up a trace" - in fact I've just had a trawl around SQL Profiler [assuming I'm not making an ass of myself saying that?!] and indeed I can trace logins; and come to think of it, that could be really handy [monitoring hacking attempts and so on]. However, it is not "obvious" for a newbie to set up.

Also, from here I have a dial-up ISDN connection to the servers I look after (that's rural UK for you!) and continuous-monitoring type stuff is a bit naff. To be sure, I could set this up on the servers in question, store it to a table, and so on, but I'd have to build a package of some sort to survive a server reboot and so on and so forth.

I reckon my idea would work for a Mom and Pop outfit, but I certainly think yours is more "proper".

Kristen

debug2k (Starting Member, 18 Posts)
Posted - 2004-05-23 : 17:39:21
Thanks a lot, folks... I'll apply the trace policy to track down such databases, then make those DBs which are not accessed for a month single-user/read-only. If no developer claims them, I'll send out a communication to the developer community announcing deletion of the database by a certain date, and finally, if there is no response, drop the database.

I really appreciate all your help.

Regards,
Debug
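The staged lockdown Debug describes maps onto a couple of ALTER DATABASE statements in SQL Server 2000 (`QuietDb` is a placeholder name):

```sql
-- Stage 1: make the quiet database single-user and read-only,
-- rolling back any current connections immediately:
ALTER DATABASE QuietDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE
ALTER DATABASE QuietDb SET READ_ONLY

-- Stage 2: if nobody has claimed it by the announced deadline:
-- DROP DATABASE QuietDb
```

WITH ROLLBACK IMMEDIATE kicks out anyone currently connected; omit it and the ALTER will wait indefinitely for existing connections to finish.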

derrickleggett (Pointy Haired Yak DBA, 4184 Posts)
Posted - 2004-05-23 : 19:13:10
quote: Also, from here I have a dial-up ISDN connection to the servers I look after (that's rural UK for you!)

WOW. Do you have to buy used modems? lol :)

MeanOldDBA
derrickleggett@hotmail.com
When life gives you a lemon, fire the DBA.

Kristen (Test, 22859 Posts)
Posted - 2004-05-24 : 01:01:13
quote: Do you have to buy used modems?

Now that would be a bonus, but sadly it's Cisco VPN stuff, so no money saved there :-(

Kristen

Kristen (Test, 22859 Posts)
Posted - 2004-05-24 : 01:09:01
quote: then make those Dbs which are not accessed for a month 'Single User/Read only'

Beware that READ ONLY will muck up the standard maintenance plans, if you use them and have the "Make minor repairs" option checked.

Kristen

eyechart (Master Smack Fu Yak Hacker, 3575 Posts)
Posted - 2004-05-24 : 01:14:46
Well, to my knowledge there is nothing you can look at to determine if a database has been idle for the last 5 months.

However, you could schedule a job that executes sp_who, or sp_who2, or whichever variant is currently floating around. The output of this sproc could be stored in a table. If this were run, say, every 5 minutes, a pretty accurate picture of who accesses which databases could be built up. Of course, you would want to aggregate this information in some way to avoid a table that grows out of control.

Anyway, from this you could determine the following:
1. which accounts are actually being used
2. which databases are being used
3. the activity patterns of all your databases

-ec
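A sketch of capturing sp_who2 into a table via INSERT...EXEC. The column list here is written to match SQL Server 2000's sp_who2 result set; INSERT...EXEC requires an exact column-for-column match, so verify the shape against your build before scheduling it:

```sql
-- Table to hold sp_who2 snapshots (column list assumed from SQL 2000's output)
CREATE TABLE dbo.WhoLog (
    SPID        int,
    Status      varchar(50),
    Login       sysname NULL,
    HostName    sysname NULL,
    BlkBy       varchar(10) NULL,
    DBName      sysname NULL,
    Command     varchar(50) NULL,
    CPUTime     int NULL,
    DiskIO      int NULL,
    LastBatch   varchar(50) NULL,
    ProgramName varchar(200) NULL,
    SPID2       int NULL
)
GO
-- Scheduled (e.g. every 5 minutes): INSERT...EXEC captures the sproc's result set
INSERT INTO dbo.WhoLog
EXEC sp_who2
```

Aggregating by `DBName` over time then gives eyechart's picture of which databases see activity; purge or summarise old rows periodically so the table doesn't grow out of control.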

derrickleggett (Pointy Haired Yak DBA, 4184 Posts)
Posted - 2004-05-24 : 08:18:56
If you go into Profiler, you just need to set up a trace that includes:

Security Audit
> Audit Login
> Audit Login Failed

The trace data columns need to include:
> DatabaseID
> DBUserName
> HostName
> NTDomainName
> ApplicationName
> NTUserName
> LoginName
> ClientProcessID
> StartTime

This will allow you to easily filter by the host, application, and PID they are using to log in with. It will also allow you to easily filter which database they were trying to hit and whether the login failed.

You can save the Profiler traces to a SQL Server table for easy querying. In addition, you can download Bill Graziano's nifty little trace tool and use that to automate the trace.

MeanOldDBA
derrickleggett@hotmail.com
When life gives you a lemon, fire the DBA.
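As an alternative to running Profiler interactively, the same trace can be scripted server-side with the sp_trace_* procedures. A sketch, assuming SQL Server 2000; event IDs 14/20 are Audit Login and Audit Login Failed, the column IDs correspond to the data columns listed above, and the file path is just an example:

```sql
DECLARE @TraceID int, @maxsize bigint, @on bit
SET @maxsize = 50   -- MB per trace file before rollover
SET @on = 1

-- Option 2 = TRACE_FILE_ROLLOVER; the path must NOT include the .trc extension
EXEC sp_trace_create @TraceID OUTPUT, 2, N'C:\Traces\LoginAudit', @maxsize

-- Event 14 = Audit Login, 20 = Audit Login Failed.
-- Columns: 3 = DatabaseID, 8 = HostName, 10 = ApplicationName,
--          11 = LoginName, 14 = StartTime
EXEC sp_trace_setevent @TraceID, 14,  3, @on
EXEC sp_trace_setevent @TraceID, 14,  8, @on
EXEC sp_trace_setevent @TraceID, 14, 10, @on
EXEC sp_trace_setevent @TraceID, 14, 11, @on
EXEC sp_trace_setevent @TraceID, 14, 14, @on
EXEC sp_trace_setevent @TraceID, 20,  3, @on
EXEC sp_trace_setevent @TraceID, 20,  8, @on
EXEC sp_trace_setevent @TraceID, 20, 10, @on
EXEC sp_trace_setevent @TraceID, 20, 11, @on
EXEC sp_trace_setevent @TraceID, 20, 14, @on
-- (repeat sp_trace_setevent for the remaining columns Derrick lists:
--  DBUserName, NTDomainName, NTUserName, ClientProcessID)

EXEC sp_trace_setstatus @TraceID, 1   -- start the trace

-- Later, load the trace file(s) into a table for querying:
-- SELECT * INTO dbo.LoginAudit
-- FROM ::fn_trace_gettable(N'C:\Traces\LoginAudit.trc', default)
```

A server-side trace like this survives Profiler being closed and has lower overhead than a client-side GUI trace, which suits Kristen's dial-up situation; it still needs a startup job if it is to survive a server reboot.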

Kristen (Test, 22859 Posts)
Posted - 2004-05-25 : 03:48:43
quote: You can save the profiler traces to a SQL Server table for easy querying.

I had a look in the Help, expecting to find a CREATE statement, but all I found was under "SQL Profiler, saving trace" and that didn't help - is there a pointer to a table structure or somesuch, please?

Kristen

derrickleggett (Pointy Haired Yak DBA, 4184 Posts)
Posted - 2004-05-25 : 20:25:59
When you set up the Profiler trace, there's a section to select a database to save the trace to. When you select the server in the dropdown, it lets you call the table whatever you want. It creates the table for you, and when you run the trace all the data gets put into that table. You can query the table as soon as it's been created and is loading the trace information.

MeanOldDBA
derrickleggett@hotmail.com
When life gives you a lemon, fire the DBA.
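Once the trace is saving to a table, a summary query like this answers the original question of which databases are actually being touched (the table name is whatever you chose in Profiler; `dbo.MyTraceTable` is a placeholder):

```sql
SELECT DB_NAME(DatabaseID) AS dbname,
       COUNT(*)            AS logins,
       MAX(StartTime)      AS last_seen
FROM dbo.MyTraceTable       -- the table Profiler created for you
WHERE DatabaseID IS NOT NULL
GROUP BY DB_NAME(DatabaseID)
ORDER BY last_seen
```

Databases missing from the output, or with a stale `last_seen`, are the candidates for Debug's read-only-then-drop process.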

Kristen (Test, 22859 Posts)
Posted - 2004-05-26 : 01:30:50
Oh bother, said Pooh! I assumed I had to give it an existing table.

Thanks for that,
Kristen