joebickley
Starting Member
4 Posts

Posted - 2006-01-11 : 06:35:43
Hi,

Sorry for the double post, but I posted in the wrong place. :(

I have a database which holds around 1.5 million rows across 3 tables and is used purely as a data warehouse; it is in simple recovery mode. When I run my SSIS (DTS) package to drop all the data in the tables and import it all again, the log file still wants vast amounts of space to let the package run. Having monitored the package, the majority of this space is taken when three "DELETE FROM tablex" statements run to clear down the tables.

Any ideas of a way round this? A 1.5 GB log file for a DB that has no data edits seems a tad odd!

Joe

PS: I'm starting with an empty log each time.
AndrewMurphy
Master Smack Fu Yak Hacker
2916 Posts

Posted - 2006-01-11 : 08:54:24
BUT you do have daily loads!... and daily DELETEs... which are logged!

You could BCP in non-logged mode to avoid the import overhead. And use TRUNCATE to avoid the delete overhead... or use a loop to delete 1 record at a time to minimise the number of DB actions before each COMMIT TRANSACTION takes place.
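As a rough sketch of the TRUNCATE route, assuming a hypothetical table named dbo.tablex and a purely illustrative file path (Joe's actual table names and load files aren't shown in the thread):

-- TRUNCATE deallocates whole data pages and logs only those
-- deallocations, not every deleted row, so it barely touches the log.
-- It cannot be used on a table referenced by a FOREIGN KEY constraint.
TRUNCATE TABLE dbo.tablex;

-- Under the SIMPLE recovery model a bulk load can be minimally logged
-- when the target table is locked; BULK INSERT with TABLOCK is the
-- T-SQL counterpart of running bcp in its fast mode.
BULK INSERT dbo.tablex
FROM 'C:\loads\tablex.dat'
WITH (TABLOCK);

Note that TRUNCATE TABLE also resets any identity seed on the table, which is usually what you want for a full reload.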
khtan
In (Som, Ni, Yak)
17689 Posts

Posted - 2006-01-11 : 19:59:16
>> delete 1 record at a time
>> 1.5 million rows

This might take forever.

-----------------
'KH'
Only two things are infinite, the universe and human stupidity, and I'm not sure about the former.
AndrewMurphy
Master Smack Fu Yak Hacker
2916 Posts

Posted - 2006-01-12 : 05:16:12
>> delete 1 record at a time
>> 1.5 million rows
>> "This might take forever"

...yea... but the objective of minimising the T-log would be met!!!
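Neither extreme is actually necessary: a common middle ground, not spelled out in the thread, is to delete in fixed-size batches so that each transaction stays small and, under SIMPLE recovery, the log space is reused between batches instead of growing. A minimal sketch, again assuming the hypothetical dbo.tablex:

-- Delete in batches of 10,000 rows; each batch commits on its own,
-- so the log never has to hold more than one batch's worth of work.
WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM dbo.tablex;
    IF @@ROWCOUNT = 0 BREAK;
    CHECKPOINT; -- under SIMPLE recovery this lets the inactive log space be reused
END

DELETE TOP (n) requires SQL Server 2005 or later; on SQL Server 2000 the same batching effect can be had with SET ROWCOUNT 10000 before the DELETE. If no rows need to survive, though, TRUNCATE TABLE remains the cheapest option by far.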