Author |
Topic |
nishithrn
Yak Posting Veteran
58 Posts |
Posted - 2003-11-04 : 01:39:38
|
The following procedure will display the size of all the user tables in a database.

CREATE proc sp_tablesize
as
if exists (select * from sysobjects where name = 'sp_tablesize')
begin
    goto calculate_tablesize
end
else
begin
    CREATE TABLE #SpaceUsed
    (
        TableName sysname,
        TableRows int,
        TableSize varchar(10),
        DataSpaceUsed varchar(10),
        IndexSpaceUsed varchar(10),
        UnusedSpace varchar(10)
    )
    goto calculate_tablesize
end

calculate_tablesize:
declare @tablename nvarchar(50)
declare @cmd nvarchar(50)

declare c1 cursor for select name from sysobjects where xtype='u'
open c1
fetch c1 into @tablename
while @@fetch_status = 0
begin
    set @cmd='exec sp_spaceused['+@tablename+']'
    insert into #SpaceUsed exec sp_executesql @cmd
    fetch next from c1 into @tablename
end

select * from #SpaceUsed
drop table #SpaceUsed
deallocate c1
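A quick usage sketch, assuming sp_tablesize has been created where SQL Server can find it (Northwind is only an example database):

USE Northwind    -- or whatever database you want to size up
GO
EXEC sp_tablesize
GO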
|
Stoad
Freaky Yak Linguist
1983 Posts |
Posted - 2003-11-04 : 07:03:45
|
Works fine. Only one thing... the underscore in the sp name (the sp_ prefix). It's bad.
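(A name starting with sp_ makes SQL Server check master for the proc first on every call, so a non-prefixed name is the usual advice.) One possible fix, the new name being only a suggestion:

-- sp_rename works on procedures too; or simply drop and recreate under the new name
EXEC sp_rename 'sp_tablesize', 'usp_tablesize'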
|
|
Stoad
Freaky Yak Linguist
1983 Posts |
Posted - 2003-11-04 : 07:38:00
|
... and where did the "close c1" before "deallocate c1" go?
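Put differently, the tail of the proc presumably wants to read like this (just the relevant fragment, not the whole proc):

    fetch next from c1 into @tablename
end

select * from #SpaceUsed
drop table #SpaceUsed

close c1        -- the missing line
deallocate c1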
|
|
nishithrn
Yak Posting Veteran
58 Posts |
Posted - 2003-11-05 : 02:57:54
|
Thanks a lot for your suggestion Stoad... |
|
|
nr
SQLTeam MVY
12543 Posts |
Posted - 2003-11-05 : 04:25:34
|
I always wonder why people create cursors for things like this. Table names should be sysname or at least 128 bytes. You might have a look at the information schema views too.

declare @tablename @nvarchar(128)
    ,@maxtablename @nvarchar(128)
    ,@cmd nvarchar(1000)

select @tablename = '', @maxtablename = max(name) from sysobjects where xtype='u'
while @tablename < @maxtablename
begin
    select @tablename = min((name) from sysobjects where xtype='u' and name > @tablename
    set @cmd='exec sp_spaceused['+@tablename+']'
    insert into #SpaceUsed exec sp_executesql @cmd
end

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
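For the information schema route, the list of user tables can come from the ANSI view instead of sysobjects; a minimal sketch (you may need an extra filter if system tables show up in your version):

select TABLE_NAME
from INFORMATION_SCHEMA.TABLES
where TABLE_TYPE = 'BASE TABLE'
order by TABLE_NAME

The same min(name)-style loop works against it unchanged.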
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-11-05 : 14:32:24
|
quote:
Originally posted by nr

I always wonder why people create cursors for things like this. Table names should be sysname or at least 128 bytes. You might have a look at the information schema views too.

declare @tablename @nvarchar(128)
    ,@maxtablename @nvarchar(128)
    ,@cmd nvarchar(1000)

select @tablename = '', @maxtablename = max(name) from sysobjects where xtype='u'
while @tablename < @maxtablename
begin
    select @tablename = min((name) from sysobjects where xtype='u' and name > @tablename
    set @cmd='exec sp_spaceused['+@tablename+']'
    insert into #SpaceUsed exec sp_executesql @cmd
end

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.

You just blew away any reason I ever had for needing a cursor..... Thanks....

Brett
8-)
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2003-11-05 : 14:40:53
|
I've got a few "admin" scripts that use cursors. I'll have to check to see if they can be reworked so that they use this method. Thanks Nigel. Even though you only run admin scripts every once in a while and they are not part of an application, it is still a good idea not to use cursors, for the sake of good programming and performance.

Tara
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-11-05 : 14:58:39
|
'cept that it doesn't work... Got the cask next to the keyboard?

A little tweaking... and thanks again for the idea.

Question though (and I'll go test it): how many singleton selects have to be performed until they get outweighed by a cursor?

USE Northwind
GO

CREATE TABLE #SpaceUsed
(
    -- The actual output sizes of sp_spaceused
    -- Anything else is a waste
    [name] varchar(20),
    [rows] char(11),
    [reserved] varchar(15),
    [data] varchar(15),
    [index_size] varchar(15),
    [unused] varchar(15)
)
GO

DECLARE @tablename nvarchar(128)
    , @maxtablename nvarchar(128)
    , @cmd nvarchar(1000)

SET NOCOUNT ON

SELECT @tablename = ''
    , @maxtablename = MAX(name)
FROM sysobjects
WHERE xtype = 'u'

WHILE @tablename < @maxtablename
BEGIN
    SELECT @tablename = MIN(name) FROM sysobjects WHERE xtype = 'u' AND name > @tablename
    SET @cmd = 'exec sp_spaceused[' + @tablename + ']'
    INSERT INTO #SpaceUsed EXEC sp_executesql @cmd
END

SELECT * FROM #SpaceUsed

SET NOCOUNT OFF
GO

DROP TABLE #SpaceUsed
GO

Brett
8-)
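Side note: the undocumented sp_MSforeachtable will do the table loop for you as well; being undocumented it comes with no guarantees, but as a sketch:

CREATE TABLE #SpaceUsed
(
    [name] sysname,
    [rows] char(11),
    [reserved] varchar(15),
    [data] varchar(15),
    [index_size] varchar(15),
    [unused] varchar(15)
)

-- the ? placeholder is replaced with each user table's (quoted) name
EXEC sp_MSforeachtable 'INSERT INTO #SpaceUsed EXEC sp_spaceused ''?'''

SELECT * FROM #SpaceUsed
DROP TABLE #SpaceUsed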
|
|
ehorn
Master Smack Fu Yak Hacker
1632 Posts |
Posted - 2003-11-05 : 15:43:38
|
I believe the singleton always outperforms the cursor. However, Beware The Lost Update...

When the selected row must be updated after it is retrieved, using a cursor for update is recommended over a singleton SELECT. If you use a singleton SELECT, the row can be updated by someone else after the singleton SELECT but before the subsequent UPDATE, thereby causing the intermediate modification to be lost.
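For what it's worth, the lost update can also be avoided without a cursor by holding an update lock across the read and the write. A minimal sketch, assuming a hypothetical Accounts table with an id and a balance column:

BEGIN TRAN

    DECLARE @balance money

    -- UPDLOCK/HOLDLOCK keep anyone else from changing the row
    -- between this read and the UPDATE below
    SELECT @balance = balance
    FROM Accounts WITH (UPDLOCK, HOLDLOCK)
    WHERE id = 42

    UPDATE Accounts SET balance = @balance + 10 WHERE id = 42

COMMIT TRAN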
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-11-05 : 15:57:29
|
quote: Originally posted by ehorn I believe the singleton always outperforms the cursor.
What if you have to do a million of them? Or maybe something not that crazy?

The cursor will bring back whole pages...

Brett
8-)
|
|
ehorn
Master Smack Fu Yak Hacker
1632 Posts |
Posted - 2003-11-05 : 15:58:40
|
A CURSOR requires an OPEN, FETCH, and CLOSE to retrieve one row, whereas a singleton SELECT requires only SELECT…INTO.

Makes for an interesting examination...
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-11-05 : 16:09:07
|
quote: Originally posted by ehorn A CURSOR requires an OPEN, FETCH, and CLOSE to retrieve one row
uhhh... no.

A CURSOR requires one OPEN and CLOSE (you forgot DEALLOCATE) to instantiate (and de-instantiate... is that really a word?) a cursor (oops... don't forget the DECLARE).

And a fetch for each and every row, from pages that are in memory. No more rows on the page, fetch another page... (do they have sequential prefetch in SQL Server?)

Brett
8-)
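To spell that lifecycle out as a bare skeleton (the cursor and variable names are just placeholders):

DECLARE @name sysname

DECLARE c CURSOR FOR
    SELECT name FROM sysobjects WHERE xtype = 'u'

OPEN c
FETCH NEXT FROM c INTO @name          -- one FETCH per row
WHILE @@FETCH_STATUS = 0
BEGIN
    -- per-row work goes here
    FETCH NEXT FROM c INTO @name
END

CLOSE c          -- releases the result set
DEALLOCATE c     -- releases the cursor definition itself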
|
|
ehorn
Master Smack Fu Yak Hacker
1632 Posts |
Posted - 2003-11-05 : 16:14:14
|
Sounds like a lot of overhead to me. It seems logical that if there is more overhead in retrieving one row, then there will be more overhead in retrieving pages of data.
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-11-05 : 16:17:15
|
quote:
Originally posted by ehorn

Sounds like a lot of overhead to me. It seems logical that if there is more overhead in retrieving one row, then there will be more overhead in retrieving pages of data.

You don't consider a million singleton selects (ok, 250,000) a lot of overhead?

I gotta test it out... let you know...

Brett
8-)
|
|
ehorn
Master Smack Fu Yak Hacker
1632 Posts |
Posted - 2003-11-05 : 16:22:41
|
quote:
You don't consider a million singleton selects (ok, 250,000) a lot of overhead?

OK.. OK... More overhead than 250,000 cursor fetches?

Time to bust out the showplan.
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-11-05 : 16:28:01
|
quote:
Originally posted by ehorn

quote:
You don't consider a million singleton selects (ok, 250,000) a lot of overhead?

OK.. OK... More overhead than 250,000 cursor fetches?

Time to bust out the showplan.

Remember now... you have to use a WHILE loop 250,000 times... No set-based anything...

Brett
8-)
|
|
Stoad
Freaky Yak Linguist
1983 Posts |
Posted - 2003-11-05 : 16:32:28
|
LOL, ehorn... don't be hypnotized by the cursor theme. One sample for you: suppose you have two tables. Both of these tables store the coordinates (x, y) of points. In the 1st table there are, say, 5 million rows, and in the 2nd, say, about 500 rows.

Now you need to find, for each point in the small table, the nearest point to it in the huge table...

You can't even imagine how many times faster a cursor solution will be than ANY non-cursor one!!

OK. Don't be lazy and just check it out.
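To make the scenario concrete, the two tables might look something like this (table names, keys, and types are just for illustration, not a real schema):

CREATE TABLE BigPoints          -- ~5 million rows
(
    id int IDENTITY PRIMARY KEY,
    x float NOT NULL,
    y float NOT NULL
)

CREATE TABLE SmallPoints        -- ~500 rows
(
    id int IDENTITY PRIMARY KEY,
    x float NOT NULL,
    y float NOT NULL
)

-- task: for each row in SmallPoints, find the row in BigPoints that minimizes
-- (BigPoints.x - SmallPoints.x)^2 + (BigPoints.y - SmallPoints.y)^2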
|
|
X002548
Not Just a Number
15586 Posts |
Posted - 2003-11-05 : 16:39:29
|
quote:
Originally posted by Stoad

LOL, ehorn... don't be hypnotized by the cursor theme. One sample for you: suppose you have two tables. Both of these tables store the coordinates (x, y) of points. In the 1st table there are, say, 5 million rows, and in the 2nd, say, about 500 rows.

Now you need to find, for each point in the small table, the nearest point to it in the huge table...

You can't even imagine how many times faster a cursor solution will be than ANY non-cursor one!!

OK. Don't be lazy and just check it out.

No. You did not read. We are discussing attacking a problem without the aid of set-based processing. Nigel came up with a great solution (a non-cursor-based one) for some admin functions. The question was...

quote:
How many singleton selects have to be performed until they get outweighed (out-performed) by a cursor?

In other words, when is a cursor more efficient than looping through singleton SELECTs?

I know you're a bright guy, but ya gotta read the whole thread...

Brett
8-)
|
|
ehorn
Master Smack Fu Yak Hacker
1632 Posts |
Posted - 2003-11-05 : 16:57:56
|
AND THE WINNER IS!!

Oh, I don't know yet... I am still waiting for the 250,000 fetches on my P3. Gotta get a new PC.

Come on... come on... I'm going to have dinner... maybe it will be finished when I get back.
|
|
robvolk
Most Valuable Yak
15732 Posts |
Posted - 2003-11-05 : 18:56:01
|
quote:
Originally posted by Stoad

LOL, ehorn... don't be hypnotized by the cursor theme. One sample for you: suppose you have two tables. Both of these tables store the coordinates (x, y) of points. In the 1st table there are, say, 5 million rows, and in the 2nd, say, about 500 rows.

Now you need to find, for each point in the small table, the nearest point to it in the huge table...

You can't even imagine how many times faster a cursor solution will be than ANY non-cursor one!!

OK. Don't be lazy and just check it out.

Post some DDL and example output and we'll take (another) crack at it using set-based operations. Betcha we'll beat the cursor (again). You need more faith, Stoad.
|
|
ehorn
Master Smack Fu Yak Hacker
1632 Posts |
Posted - 2003-11-05 : 19:18:47
|
Alright, someone will have to see how this scales to 250,000 recs 'cause my POS ain't gonna do it...

I originally had the data cols at 4000 to push out more pages, but I need more space on my box, so I scaled back to CHAR(400) with 10,000 recs for these numbers. Would like to see numbers with data >= CHAR(4000) and >= 250,000 recs if anyone can run them....

EDIT: Maybe this should be a new post.... We've gone off the topic a bit...

/*----------------------------------------------------------
  Sample Data
----------------------------------------------------------*/
CREATE TABLE TestData
(
    n INT IDENTITY(0,1) PRIMARY KEY,
    data CHAR(400) NOT NULL
)
GO

SET NOCOUNT ON
INSERT INTO TestData (data)
SELECT REPLICATE('BLAH', 100)
FROM Numbers
WHERE n < 10000
GO

CREATE TABLE OutData
(
    n INT,
    data CHAR(400)
)
GO

/*----------------------------------------------------------
  Cursor Insert
----------------------------------------------------------*/
SET NOCOUNT ON
DECLARE @MaxRows int
SET @MaxRows = 10000

DECLARE @n int, @data char(400)

DECLARE number_cursor CURSOR --FAST_FORWARD
FOR
SELECT n FROM Numbers WHERE n <= @MaxRows

OPEN number_cursor
FETCH NEXT FROM number_cursor INTO @n
WHILE @@FETCH_STATUS = 0
BEGIN
    INSERT INTO OutData SELECT n, data FROM TestData WHERE n = @n
    FETCH NEXT FROM number_cursor INTO @n
END
CLOSE number_cursor
DEALLOCATE number_cursor
GO

DELETE OutData
GO

/*----------------------------------------------------------
  Singleton Insert
----------------------------------------------------------*/
SET NOCOUNT ON
DECLARE @MaxRows int
SET @MaxRows = 10000

DECLARE @n INT, @data CHAR(400)
SET @n = 0
WHILE @n <= @MaxRows
BEGIN
    INSERT INTO OutData SELECT n, data FROM TestData WHERE n = @n
    SET @n = @n + 1
END
GO

DROP TABLE TestData
DROP TABLE OutData

/*---------------------------------------------------------------------------------------------------------
             Estimate Rows   Estimate IO    Estimate CPU   AvgRowSize       TTLSubTreeCost  Duration
  Cursor     30003.3010000   0.1787737479   0.1101602000   438.0000000000   0.3121736240    1:03
  Singleton  3.0000000000    0.0167507979   0.0000806000   415.0000000000   0.0400710940    1:01

  Cursor plan:
    |--Dynamic Cursor(number_cursor, Optimistic)
         |--Fetch Query(Fetch Query)
         |    |--Table Scan(OBJECT:([Common].[dbo].[Numbers]), WHERE:([Numbers].[n]<=10000))
         |--Refresh Query(Refresh Query)

    OPEN number_cursor
    FETCH NEXT FROM number_cursor INTO @n
    WHILE @@FETCH_STATUS = 0
    BEGIN
        INSERT INTO OutData SELECT n,data FROM TestData WHERE n = @n
          |--Table Insert(OBJECT:([Common].[dbo].[OutData]), SET:([OutData].[data]=[TestData].[data], [OutData].[n]=[TestData].[n]))
               |--Clustered Index Seek(OBJECT:([Common].[dbo].[TestData].[PK__TestData__50FB042B]), SEEK:([TestData].[n]=[@n]) ORDERED FORWARD)
        FETCH NEXT FROM number_cursor INTO @n
    END
    CLOSE number_cursor
    DEALLOCATE number_cursor

  Singleton plan:
    DECLARE @n INT,@data CHAR(400)
    SET @n = 0
    WHILE @n <= 10000
    BEGIN
        INSERT INTO OutData SELECT n,data FROM TestData WHERE n = @n
          |--Table Insert(OBJECT:([Common].[dbo].[OutData]), SET:([OutData].[data]=[TestData].[data], [OutData].[n]=[TestData].[n]))
               |--Clustered Index Seek(OBJECT:([Common].[dbo].[TestData].[PK__TestData__50FB042B]), SEEK:([TestData].[n]=[@n]) ORDERED FORWARD)
        SET @n = @n+1
    END
---------------------------------------------------------------------------------------------------------*/
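If anyone wants actual elapsed time and I/O counts to go with the estimated plans, wrapping each batch like this is one way to do it (nothing specific to the script above, just the standard session settings):

SET STATISTICS TIME ON   -- prints CPU and elapsed milliseconds per statement
SET STATISTICS IO ON     -- prints logical/physical reads per table

-- run the cursor batch here, then the singleton batch,
-- and compare the figures in the Messages pane

SET STATISTICS TIME OFF
SET STATISTICS IO OFF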
|
|
|