Please start any new threads on our new site at https://forums.sqlteam.com. We've got lots of great SQL Server experts to answer whatever question you can come up with.

Deleting referenced table records in an efficient way

Author  Topic 

manibad
Starting Member

9 Posts

Posted - 2012-09-19 : 16:34:54
Hi,
I have tables named X, Y, Z, M, L and O. They are related as X->Y, Z, M, L, O; Z->M; L->O (here -> indicates that one table references the other). Each table holds about 30,000,000 records. In addition, all of these tables reference two base tables, A and B, which hold mostly static data with far fewer rows.

My problem is that deleting 1,500,000 records from table O takes a day or two. Is there any way to reduce the time the delete takes?

Please suggest a solution.

chadmat
The Chadinator

1974 Posts

Posted - 2012-09-19 : 16:38:24
Try deleting in small chunks (around 5,000 rows at a time) rather than in one huge transaction.
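A minimal sketch of the chunked approach (the table name, filter column, and cutoff value are placeholders, not from your actual schema):

```sql
DECLARE @BatchSize int = 5000;

WHILE 1 = 1
BEGIN
    -- Each iteration commits on its own, so the transaction log can be
    -- reused (SIMPLE recovery) or backed up (FULL recovery) between batches
    -- instead of growing to hold one giant delete.
    DELETE TOP (@BatchSize) FROM dbo.O
    WHERE CreatedDate < '2012-01-01';   -- placeholder delete criteria

    IF @@ROWCOUNT = 0 BREAK;            -- stop when nothing was deleted
END
```

Note that without an index supporting the WHERE clause, every batch re-scans the table to find its 5,000 rows, which can make chunking no faster than a single delete.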

-Chad
Go to Top of Page

manibad
Starting Member

9 Posts

Posted - 2012-09-19 : 17:05:25
Thank you chad.

The thing is, I actually tried deleting in chunks as well as in one huge batch, but neither gave a proper result.
Go to Top of Page

chadmat
The Chadinator

1974 Posts

Posted - 2012-09-19 : 17:11:11
What do you mean by "no proper result"? Was it not any faster, or were the wrong rows deleted?

-Chad
Go to Top of Page

manibad
Starting Member

9 Posts

Posted - 2012-09-19 : 17:13:17
It wasn't faster. The records were deleted without any errors, but the process was no quicker.
Go to Top of Page

robvolk
Most Valuable Yak

15732 Posts

Posted - 2012-09-19 : 17:24:45
The only "fast" way to delete this amount of data is to do partition switching, which is an Enterprise Edition feature, and may not work correctly if you have declared foreign keys (unless you disable or drop them while doing the delete).
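A rough sketch of what partition switching looks like, assuming dbo.O has been partitioned so that the rows to delete live in their own partition, that no enabled foreign keys reference dbo.O, and that a staging table with an identical structure exists on the same filegroup (all names here are placeholders):

```sql
-- Switching a partition out is a metadata-only operation, so it is
-- nearly instant regardless of how many rows the partition holds.
ALTER TABLE dbo.O
    SWITCH PARTITION 2 TO dbo.O_Staging;

-- The unwanted rows now live in the staging table; discard them.
TRUNCATE TABLE dbo.O_Staging;
DROP TABLE dbo.O_Staging;
```

The catch is exactly what robvolk notes: the table must already be partitioned along a boundary that isolates the rows you want to remove, and enabled foreign keys referencing the table block the switch.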
Go to Top of Page

TG
Master Smack Fu Yak Hacker

6065 Posts

Posted - 2012-09-19 : 17:34:21
Deleting that many rows is not going to be fast. The easiest thing to do is to let it run as long as it takes. Chunking will ease the pressure on the transaction log. A couple of ugly options that may speed things up in certain cases:

- Select only the data you want to keep into new tables, then use those instead.
- Put the DB in single-user mode, drop the FK constraints, do all the deletes, then re-apply the FKs.
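A sketch of the first option (table and column names are placeholders). SELECT ... INTO can be minimally logged under the SIMPLE or BULK_LOGGED recovery model, so copying the ~28.5M rows you keep may well beat logging a delete of 1.5M rows:

```sql
-- Copy only the rows to KEEP into a new table.
SELECT *
INTO dbo.O_Keep
FROM dbo.O
WHERE CreatedDate >= '2012-01-01';   -- placeholder "keep" criteria

-- Then, in a maintenance window, after dropping any FKs that reference dbo.O:
DROP TABLE dbo.O;
EXEC sp_rename 'dbo.O_Keep', 'O';
-- Recreate indexes, constraints, and the foreign keys afterwards.
```

This is only practical if you can take an outage, since the table is effectively unavailable between the drop and the rename, and every index and constraint must be rebuilt on the new table.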

Be One with the Optimizer
TG
Go to Top of Page