tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2004-05-04 : 18:23:14

I'm working on a stored procedure to run DBCC INDEXDEFRAG. I won't bore you with the details of the code, but I've run into a problem. Take a look at the code below. Note that the two SELECTs are identical. One SELECT will run in pubs due to the USE pubs, the other will run in master. But both SELECTs have sysindexes and sysobjects qualified to pubs. Why doesn't the second one work in master when I've qualified both tables?

USE pubs
GO
SELECT i.id, i.name
FROM pubs.dbo.sysindexes i
INNER JOIN pubs.dbo.sysobjects o
	ON i.id = o.id
WHERE o.name <> 'dtproperties'
AND o.type = 'U'
AND o.status > 0
AND INDEXPROPERTY(i.id, i.name, 'IsStatistics') = 0

USE master
GO
SELECT i.id, i.name
FROM pubs.dbo.sysindexes i
INNER JOIN pubs.dbo.sysobjects o
	ON i.id = o.id
WHERE o.name <> 'dtproperties'
AND o.type = 'U'
AND o.status > 0
AND INDEXPROPERTY(i.id, i.name, 'IsStatistics') = 0

Now if I change the code to use non-system tables, it works fine (both return result sets):

USE pubs
GO
SELECT type, pub_id, t.title_id
FROM pubs.dbo.titles t
INNER JOIN pubs.dbo.titleauthor ta
	ON t.title_id = ta.title_id

USE master
GO
SELECT type, pub_id, t.title_id
FROM pubs.dbo.titles t
INNER JOIN pubs.dbo.titleauthor ta
	ON t.title_id = ta.title_id

Also, if I remove the JOIN, it works fine too:

USE pubs
GO
SELECT i.id, i.name
FROM pubs.dbo.sysindexes i

USE master
GO
SELECT i.id, i.name
FROM pubs.dbo.sysindexes i

So why doesn't it work for sysindexes and sysobjects with a JOIN? For my stored procedure, I'll be in the master database and using dynamic sql to build the DBCC INDEXDEFRAG statement for indexes in all user databases. That's why I need to qualify the tables with the database name.

Tara
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2004-05-04 : 18:49:00

So the problem is with INDEXPROPERTY. If I comment that part out in the WHERE clause, it works fine. But then how do I get a result set of the indexes, excluding statistics, without using INDEXPROPERTY?

Tara
derrickleggett
Pointy Haired Yak DBA
4184 Posts
Posted - 2004-05-04 : 21:17:37

Can't you just run it in the context of each database, and have the query still use INDEXPROPERTY? When you set up a job, have a job step for each database. This also allows you to perform actions on each step.

Or, can you do this?

DECLARE @sql NVARCHAR(4000)

SELECT @sql = '
	SELECT i.id, i.name
	FROM sysindexes i
	INNER JOIN sysobjects o
		ON i.id = o.id
	WHERE o.name <> ''' + 'dtproperties''' + '
	AND o.type = ''' + 'U''' + '
	AND o.status > 0
	AND INDEXPROPERTY(i.id, i.name, ''' + 'IsStatistics''' + ') = 0'

EXEC pubs..sp_executesql @sql
EXEC master..sp_executesql @sql
EXEC Northwind..sp_executesql @sql

:) Isn't that nice and ugly?

MeanOldDBA
derrickleggett@hotmail.com
When life gives you a lemon, fire the DBA.
MuadDBA
628 Posts
Posted - 2004-05-05 : 09:41:43

Think about what you were passing to INDEXPROPERTY. It's not that it wasn't working; you were just passing it invalid data. You were giving it the id of a table that probably doesn't exist in master, but since you were executing in the master database, that is where it looked for the table whose id you specified. By pure luck, your ids didn't match up and show you something terribly incorrect.

Derrick's suggestion sounds good.
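To make MuadDBA's point concrete, here is a minimal sketch (mine, not from the thread): INDEXPROPERTY resolves the id argument in the *current* database, no matter how the FROM clause is qualified.

```sql
USE master
GO
-- i.id below is an object id from pubs, but INDEXPROPERTY looks that id up
-- in the current database (master). For most ids there is no matching object
-- in master, so the function returns NULL, and a WHERE clause comparing it
-- to 0 filters every row out - which is why the JOINed query in master
-- appears to return nothing.
SELECT i.id, i.name,
       INDEXPROPERTY(i.id, i.name, 'IsStatistics') AS resolved_in_master
FROM pubs.dbo.sysindexes i
```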
nr
SQLTeam MVY
12543 Posts
Posted - 2004-05-05 : 12:26:39

I assume you know why INDEXPROPERTY isn't working - it doesn't take a database context, just the object id.

You can build the statement in a string and execute it in the context of the destination database by:

select @str = 'exec ' + @dbname + '..sp_executesql ''' + replace(@str,'''','''''') + ''''
exec (@str)

Or just execute the INDEXPROPERTY statement using the same method.

If all you want is to find whether the index is a statistic, you can get that from sysindexes by checking the hypothetical bit (I think it's status & 32, but check it).

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
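Spelled out, nr's wrapper might look something like this (a sketch; @dbname, the query text, and the 'pubs' literal are illustrative, not from his post):

```sql
DECLARE @dbname SYSNAME, @str NVARCHAR(4000)

SET @dbname = 'pubs'
SET @str = 'SELECT i.id, i.name
FROM sysindexes i
INNER JOIN sysobjects o ON i.id = o.id
WHERE o.type = ''U''
AND INDEXPROPERTY(i.id, i.name, ''IsStatistics'') = 0'

-- Double up the embedded quotes, then hand the statement to the target
-- database's own copy of sp_executesql so that INDEXPROPERTY (and the
-- unqualified system-table references) run in that database's context.
SELECT @str = 'exec ' + @dbname + '..sp_executesql N''' + replace(@str, '''', '''''') + ''''
EXEC (@str)
```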
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2004-05-05 : 12:33:03

I understood why INDEXPROPERTY wasn't working after I figured out it was INDEXPROPERTY causing my problem. Thanks Derrick and Nigel, I hadn't thought about EXEC @dbname..sp_executesql. That certainly will work for me.

Tara
nr
SQLTeam MVY
12543 Posts
Posted - 2004-05-05 : 13:12:37

Oops - I didn't notice that derrickleggett had already posted that. But you will have to put the sp_executesql call in dynamic sql; otherwise you will have to hard-code the database name.

I would use sysindexes instead - but then I don't like these new-fangled functions.

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2004-05-05 : 13:23:30

Yeah, I realized after I got my coffee that I would need dynamic sql within dynamic sql to get the @dbname..sp_executesql to work. I had thought about the status column last night but still couldn't figure out how to tell if it was a statistic. I'll see if I can figure it out today.

On the same topic, how do you guys do your DBCC DBREINDEX or DBCC INDEXDEFRAG? Do you use dynamic sql and loop through sysindexes for a particular database, or do you have the statements written out? I wanted this to be dynamic so it would catch new indexes, but I'm open to suggestions. I'm not looking to steal your code, just an explanation of what you do. I'm curious as to how you handle these in a production environment.

Here's what I've got for DBCC CHECKDB:

CREATE PROC isp_DBCC_CHECKDB
AS

SET NOCOUNT ON

DECLARE @dbid INT
DECLARE @DBName SYSNAME
DECLARE @rowcnt INT
DECLARE @SQL NVARCHAR(4000)

SET @dbid = 0

SELECT TOP 1 @dbid = dbid, @DBName = name
FROM sysdatabases
WHERE dbid > @dbid
ORDER BY dbid

SET @rowcnt = @@ROWCOUNT

WHILE @rowcnt <> 0
BEGIN
	SET @SQL = 'DBCC CHECKDB(' + @DBName + ')'
	EXEC sp_executesql @statement = @SQL

	SELECT TOP 1 @dbid = dbid, @DBName = name
	FROM sysdatabases
	WHERE dbid > @dbid
	ORDER BY dbid

	SET @rowcnt = @@ROWCOUNT
END

RETURN

I wanted it to be dynamic so that I could use this stored procedure in environments where I didn't always know when new databases were created (we've got two such systems here). Do you guys do something similar for integrity checks or just explicitly have the statements written out in a job?

Tara
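One small hardening worth considering for a loop like Tara's (my suggestion, not part of her posted code): wrap the database name in QUOTENAME so names containing spaces or other odd characters don't break the dynamic statement.

```sql
-- Instead of:  SET @SQL = 'DBCC CHECKDB(' + @DBName + ')'
SET @SQL = 'DBCC CHECKDB(' + QUOTENAME(@DBName) + ')'
-- QUOTENAME adds [brackets], so a database called "my db"
-- produces DBCC CHECKDB([my db]) rather than a syntax error.
```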
nr
SQLTeam MVY
12543 Posts
Posted - 2004-05-05 : 14:01:13

I think the status is 32 ^ 16 or 64 ^ 32 for statistics (I think 32 is hypothetical, 64 is statistics).

If I were to do this, I would do it in dynamic sql in an SP to which I passed the database name. But I don't re-index as a matter of course - only when it's needed - and I try to design so it's not.

I wouldn't run integrity checks on a live database, but do them on a restored backup on another server - that's mainly from the days when nearly any dbcc command had a good chance of corrupting a database.

If I don't know a database is being created, then it doesn't get much attention (and may get deleted). This would only happen on a dev box - I back up everything unless agreed not to, and script all objects for SourceSafe, but that's about it. Backups get test restored (maybe), but any dev database should be recreatable as the data isn't important business-wise.

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
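As a non-authoritative sketch of the sysindexes route nr mentions: on SQL Server 2000, auto-created statistics follow the _WA_Sys naming convention, and hypothetical/statistics rows are flagged by status bits whose exact values should be checked against Books Online before relying on them.

```sql
-- Sketch only: list candidate indexes without using INDEXPROPERTY.
-- indid 0 = heap, 255 = text/image pages; real indexes sit in between.
SELECT i.id, i.name, i.status
FROM sysindexes i
WHERE i.indid BETWEEN 1 AND 250
  AND i.name NOT LIKE '[_]WA[_]Sys[_]%'  -- skip auto-created statistics by name
-- User-created statistics are not caught by the name test; those still need
-- a status-bit check (or INDEXPROPERTY run in the right database context).
```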
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2004-05-05 : 14:09:19

Thanks Nigel.

On the system where I don't know about databases being created, I only help out. I do not get paged in the middle of the night or anything. They contact me for assistance. I set up the backups and other maintenance jobs and then walk away until called for help.

I've considered running integrity checks at our disaster recovery site since it's only 15 minutes behind the live site. I'm right now going through the exercise of analyzing the environments and seeing if anything should be done differently. I'll add this to my list.

How do you design so that reindexing isn't needed? What considerations do you take into account?

Tara
nr
SQLTeam MVY
12543 Posts
Posted - 2004-05-05 : 14:13:48

A bit shorter than yours. I would also put this in an admin database - hence master referenced explicitly.

Also, what are you going to do with errors? If you are running this from a job, it could quickly fill up the buffer and lose any errors.

CREATE PROC isp_DBCC_CHECKDB
AS

SET NOCOUNT ON

DECLARE @dbid INT
DECLARE @DBName SYSNAME
DECLARE @SQL NVARCHAR(4000)

SET @dbid = 0

WHILE @dbid < (select max(dbid) from master..sysdatabases)
BEGIN
	SELECT TOP 1 @dbid = dbid, @DBName = name
	FROM master..sysdatabases
	WHERE dbid > @dbid
	ORDER BY dbid

	SET @SQL = 'DBCC CHECKDB(' + @DBName + ')'
	EXEC sp_executesql @statement = @SQL
END

RETURN

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2004-05-05 : 14:18:06

Yeah, I am planning on putting these in an admin database. I just wrote that one the other day, so I hadn't gotten that far yet. Currently, I've got the DBCC CHECKDB written out for each of the databases.

I hadn't thought about errors except to e-mail me (most likely page me for this one) upon failure of the job. I would then run the stored procedure in Query Analyzer to view the errors. What do you do with errors?

And thanks for the shortened code. I'll certainly use it and also use it as a template for other DBA-related stored procs.

Tara
nr
SQLTeam MVY
12543 Posts
Posted - 2004-05-05 : 14:21:19

For reindexing it's mainly a matter of choosing correct indexes and designing the app so that it doesn't need indexes that are going to fragment.

Try to access everything by a PK which is sequential, don't update anything that is indexed, don't increase the size of rows, never report off an oltp system, keep indexes small and don't add unnecessary ones, archive data that isn't needed, and try to archive sequential pages.

It's usually a lot easier in batch systems than oltp systems, but you can still minimise the amount of maintenance needed. Things like inserting a row and then updating it with the required data will cause problems. Often, not having a clustered index will prevent fragmentation.

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
nr
SQLTeam MVY
12543 Posts
Posted - 2004-05-05 : 14:36:09

Also have a look at http://www.nigelrivett.net/BackupAllDatabases.html

It's just for backups, but the important bit is having a table to define what happens to a database. It creates a new entry with defaults for any new database it finds, and you can stop the action on any database by setting a flag.

I also tend to put in the last datetimes that things are run, and have a history table for everything that happens. I also have a table which holds the physical size of every database file, which gets added to daily so I can see if anything is growing abnormally, and when. I also save job history daily to a table so that I don't lose entries due to truncation in msdb, and have something I can look at easily to get the run times of jobs.

I don't know what you would do with messages from checkdb - maybe osql? But if you're mainly worried about completion status, then it doesn't matter (just as long as the completion status doesn't get lost :)).

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
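A rough sketch of the kind of daily file-size snapshot nr describes (the table name and columns are my invention, not his actual schema; sp_msforeachdb is undocumented but commonly used for this on SQL 2000):

```sql
CREATE TABLE dbo.DbFileSizeHistory
(
	CaptureDate DATETIME NOT NULL DEFAULT GETDATE(),
	DbName      SYSNAME NOT NULL,
	LogicalName SYSNAME NOT NULL,
	SizePages   INT NOT NULL	-- sysfiles.size is in 8 KB pages
)

-- Run daily from a job: capture every file size in every database.
INSERT dbo.DbFileSizeHistory (DbName, LogicalName, SizePages)
EXEC sp_msforeachdb 'SELECT ''?'', name, size FROM [?]..sysfiles'
```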
Seventhnight
Master Smack Fu Yak Hacker
2878 Posts
Posted - 2004-06-22 : 20:24:08

Tara - when I get to work tomorrow I'll post my sproc that rebuilds or defrags all indexes, or just the primary keys, for a given table. I'll also include the one that loops through the tables, but I don't think it's a big deal.

Corey
Seventhnight
Master Smack Fu Yak Hacker
2878 Posts
Posted - 2004-06-23 : 08:50:43

Posted to your weblog...

Corey
Kristen
Test
22859 Posts
Posted - 2004-06-24 : 02:19:31

quote: Originally posted by nr
It's just for backups but the important bit is having a table to define what happens to a database. It creates a new entry with defaults for any new database it finds and you can stop the action on any database by setting a flag.

I have something that does that too. In case of interest:

I support Full/Diff and Trans backups (we tend to do FULL on Sunday and DIFF on weekdays; these are on web servers, so restore from tape held by the ISP might be problematic [i.e. slow to happen], so we want locally stored recovery of TRANS for 2 days, DAILY for a week, and WEEKLY for a couple of weeks).

The time at which the full backup occurs can vary - some applications have different sensitivities as to when this should occur - plus we might have to support different time zones. So I store the time of last backup, and calculate the time of next backup (next anniversary after the current start - so if a 4AM backup doesn't start until 4:01AM, or even 5:12AM, tomorrow's will still be scheduled at 4AM).

The Backup Job is run at the minimum interval required for TRANS backups (one hour for us, could be ten minutes etc.). If Full/Diff backups are overdue, they are done (which could pose a problem after prolonged downtime, as everything would immediately start full backups when the system comes up, rather than waiting until the next 4AM slot).

I keep a separate table of Backup History, and use this to delete backups from disk - i.e. A.N.Other ad hoc backup in the backup folder will not get deleted (but I use my own SProc for ad hoc backups, and that adds to my backup history table anyway).

[The BackupTask SProc is scheduled, looks at the Task Table, and then calls another SProc to do the Backup - with DBName / Backup Type - which logs to the Backup History table; it is therefore easy to call this SProc from QA for an ad hoc backup.]

The Delete deals with different retention times for FULL/DIFF/TRANS - no point hanging on to the DIFF/TRANS files once the FULL backup has been deleted; we're off to tape anyway if we need those.

We have a need to store different types of backups for different periods - so a TEMP database may have a different strategy to an ARCHIVE backup (possibly only backed up weekly after data is purged into it from the LIVE DB).

I haven't got to the point of putting the DBCC stuff in as yet, but it needs doing ...

Also, I found that I need to check for OffLine, ReadOnly, and DBO-only/Single-User type flags so that these don't stop the process completing processing of all the databases.

Also, I put the Date, Type, Server Name and Database Name in the DESCRIPTION attribute of the BACKUP command. That has helped when "helpful" people have renamed a backup and moved it from one server to another!

CREATE TABLE [dbo].[kbm_LOC_BAKS_BackupScheme]
(
	[loc_baks_Created] [datetime] NULL,
	[loc_baks_Updated] [datetime] NULL,
	[loc_baks_Database] [varchar] (128) NOT NULL,	-- PK
	[loc_baks_Status] [tinyint] NULL,
	[loc_baks_FullDailyRetension] [tinyint] NULL,
	[loc_baks_FullWeeklyRetension] [tinyint] NULL,
	[loc_baks_DiffDailyRetension] [tinyint] NULL,
	[loc_baks_TransHourlyRetension] [tinyint] NULL,
	[loc_baks_LastFull] [datetime] NULL,
	[loc_baks_LastDiff] [datetime] NULL,
	[loc_baks_LastTrans] [datetime] NULL,
	[loc_baks_NextFull] [datetime] NULL,
	[loc_baks_NextDiff] [datetime] NULL,
	[loc_baks_NextTrans] [datetime] NULL
)

CREATE TABLE [dbo].[kbm_LOC_BAKH_BackupHistory]
(
	[loc_bakh_Created] [datetime] NULL,
	[loc_bakh_Updated] [datetime] NULL,
	[loc_bakh_ID] [int] IDENTITY (10000, 1) NOT NULL,	-- PK
	[loc_bakh_Database] [varchar] (128) NULL,
	[loc_bakh_ExpiryDt] [datetime] NULL,
	[loc_bakh_File] [varchar] (128) NULL,
	[loc_bakh_Type] [tinyint] NULL,
	[loc_bakh_Status] [tinyint] NULL
)

Kristen
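Kristen's "next anniversary" scheduling can be sketched like this (the variable names and the 04:00 literal are illustrative, not from Kristen's actual code):

```sql
DECLARE @Now DATETIME, @NextFull DATETIME

SET @Now = GETDATE()

-- Today's date combined with the configured time-of-day (04:00 here),
-- pushed to tomorrow if that moment has already passed - so a late
-- 4:01AM (or 5:12AM) start still leaves the next run anchored at 4AM.
SET @NextFull = DATEADD(day, DATEDIFF(day, 0, @Now), CONVERT(DATETIME, '04:00'))
IF @NextFull <= @Now
	SET @NextFull = DATEADD(day, 1, @NextFull)
```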