Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
RoyalSher
Yak Posting Veteran
95 Posts

Posted - 2004-08-26 : 07:29:13
Hi,

I have a tight situation here in my shop. I have a database that sits on an old server in the production environment. Recently, the business manager of this application asked me to dump the data in this database that is older than a year. I managed to find the tables in the database that have no relationships; they are gigabytes in size, all holding data from previous years. We called a meeting and decided to nuke that data off the database (I believe they are planning a new system and only want this old DB to run for one more year)!

I started by deleting data from one of the tables, which was holding 7 GB of data. I chose to delete the data for 2002 and 2003, which should free about 3 GB on the system. Here is the query I am using:

delete x.Connections from x.Connections where year(access_time) = 2002 and month(access_time) = 3 and day(access_time) between 1 and 15

I predicated it as above to avoid long table and page locks while millions of rows are being deleted. Another constraint is that I cannot add or drop any indexes, as it is a third-party application (it's a rule in my shop never to change structures). I need to run the above query with the best possible efficiency. Right now it takes, eeks, 20 minutes to delete half a million records, which is frankly disgusting. Can anyone throw some light on my distressing situation and help get this query running for the two years' worth of data? The production box is an old machine with a 200 MHz CPU and 256 MB of RAM. I am in a real fix!! Thanks in advance.

RoyalSher.
*********
The world is the great gymnasium where we come to make ourselves strong.
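[Editor's note: one detail worth flagging in the query above: wrapping `access_time` in `year()`/`month()`/`day()` prevents SQL Server from seeking on any index that might already exist on that column. Assuming `access_time` is a datetime column, the same half-month window can be written as a plain range predicate, which is sargable. A sketch only, with the same rows targeted as the original:]

```sql
-- Same half-month as year()=2002, month()=3, day() between 1 and 15,
-- expressed as a range so an existing index on access_time
-- (if the third-party schema happens to have one) can be used.
DELETE FROM x.Connections
WHERE access_time >= '20020301'   -- 1 March 2002, inclusive
  AND access_time <  '20020316'   -- up to but not including 16 March
```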
timmy
Master Smack Fu Yak Hacker
1242 Posts

Posted - 2004-08-26 : 08:43:34
I find that using DELETE with SELECT TOP works quite well in our production environment:

DELETE FROM table WHERE tableID IN (SELECT TOP 1000 tableID FROM table WHERE...)

You can put this in a script that keeps running until there is nothing left to delete.
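[Editor's note: a sketch of the looping script timmy describes, using the table from the original post. `ConnectionID` is a hypothetical key column, and the batch size of 1000 is just an example worth tuning on a box this small:]

```sql
-- Loop until no matching rows remain; each pass deletes at most 1000
-- rows, keeping each implicit transaction (and its locks) short.
DECLARE @rows int
SET @rows = 1
WHILE @rows > 0
BEGIN
    DELETE FROM x.Connections
    WHERE ConnectionID IN (
        SELECT TOP 1000 ConnectionID
        FROM x.Connections
        WHERE access_time >= '20020101' AND access_time < '20030101'
    )
    SET @rows = @@ROWCOUNT   -- 0 once the year's data is gone
END
```

On SQL Server 2000 this `IN (SELECT TOP n ...)` form is the usual idiom; later versions can use `DELETE TOP (n)` directly.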
jen
Master Smack Fu Yak Hacker
4110 Posts

Posted - 2004-08-26 : 08:50:32
Hi RoyalSher,

Here's the idea: make a backup of the data you're deleting, then delete it, and do it in batches. Design your own DTS package that 1. copies the records to be deleted to another table (on another server or a repository machine), and 2. only once the copy has completed, performs the delete.

Divide the job (say, per year or per month, depending on how large each data set will be) so that even if you encounter an error, the rollback only affects that batch. Schedule all the packages so you don't have to oversee them.

Hope this helps...
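[Editor's note: jen's copy-then-delete step for a single monthly batch might be sketched as below. `Connections_Archive` is a hypothetical archive table, assumed here to live in the same database rather than on a separate server as jen suggests:]

```sql
BEGIN TRANSACTION
    -- 1. Copy the month's rows to the archive table first.
    INSERT INTO Connections_Archive
    SELECT * FROM x.Connections
    WHERE access_time >= '20020301' AND access_time < '20020401'

    -- 2. Only after the copy succeeds, delete the same rows.
    DELETE FROM x.Connections
    WHERE access_time >= '20020301' AND access_time < '20020401'
COMMIT TRANSACTION
-- On SQL Server 2000, check @@ERROR after each statement and issue a
-- ROLLBACK on failure, so an error only ever loses this one batch.
```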
rockmoose
SQL Natt Alfen
3279 Posts

Posted - 2004-08-26 : 08:54:24
You could bcp out the data that you want to keep, truncate the table, then bcp or BULK INSERT the data back into the table. Please back up before you do this ;-)

quote: rule in my shop not to change structures at any time

If this were not the case, I would use a new table, drop the old one, and rename the new table to the old name (adding the constraints and indexes, of course).

/rockmoose