Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
Author: AskSQLTeam
Ask SQLTeam Question
0 Posts
Posted - 2003-02-10 : 07:58:12
VJ writes: "I have a database on SQL 7.0 with about 25 DTS packages scheduled to run daily to import data into it. The tables are first truncated and then the data is imported. The transaction log for this database grows tremendously, as large as 15 GB in 3 days. When this happens I detach the database and do a single-file attach to create a new 1 MB log, and the log again grows to 15 GB over the next 3 days. Can I get help with a script that can clear the log efficiently? I will appreciate the help very much. Thanks"
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2003-02-10 : 12:21:11
Why don't you just leave the transaction log file at 15 GB? You are causing performance problems for your jobs by creating a new log file that is only 1 MB. Your jobs need the space, so let them have it.

Edited by - tduggan on 02/10/2003 12:21:43
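Pre-sizing the log to the space the jobs need, as suggested above, might look something like this. This is only a sketch: the database name ImportDB and the logical file name ImportDB_Log are placeholders, so run sp_helpfile first to get the actual names.

```sql
-- Find the logical name of the log file first
USE ImportDB
GO
EXEC sp_helpfile
GO
-- Grow the log once to the size the daily jobs actually need,
-- instead of letting it autogrow from 1 MB every cycle
ALTER DATABASE ImportDB
MODIFY FILE (NAME = ImportDB_Log, SIZE = 15000MB)
GO
```

Autogrow on SQL 7.0 is expensive while it runs, so a single up-front allocation avoids paying that cost repeatedly during the imports.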
MichaelP
Jedi Yak
2489 Posts
Posted - 2003-02-10 : 12:34:09
What about setting the database to the "Simple" recovery model? Wouldn't that keep the transaction log from getting too large?

Michael

<Yoda>Use the Search page you must. Find the answer you will.</Yoda>
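Changing that setting could be sketched as follows, with ImportDB again a placeholder name. One caveat: SQL Server 7.0, which the poster is running, has no recovery models; the closest equivalent there is the "trunc. log on chkpt." database option, while the recovery model syntax applies to SQL Server 2000 and later.

```sql
-- SQL Server 7.0: truncate the log at each checkpoint
EXEC sp_dboption 'ImportDB', 'trunc. log on chkpt.', 'true'
GO
-- SQL Server 2000 and later: the same behavior via the recovery model
ALTER DATABASE ImportDB SET RECOVERY SIMPLE
GO
```

With either setting in place, the log no longer accumulates committed transactions between backups, at the cost of losing point-in-time recovery for this database.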
Onamuji
Aged Yak Warrior
504 Posts
Posted - 2003-02-10 : 12:36:25
Use BACKUP LOG with TRUNCATE_ONLY. That should clear the used portion of your transaction log and allow the space to be reused.
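A sketch of that suggestion, again assuming a placeholder database named ImportDB whose log file has the logical name ImportDB_Log. Two caveats: truncating the log breaks the log backup chain, so it is only safe when point-in-time recovery is not required, and truncation alone does not release disk space, so a shrink is needed as well.

```sql
-- Mark the inactive portion of the log as reusable
BACKUP LOG ImportDB WITH TRUNCATE_ONLY
GO
-- Truncation does not release disk space; shrink the file too.
-- 100 is the target size in MB; sp_helpfile shows the logical name.
DBCC SHRINKFILE (ImportDB_Log, 100)
GO
```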