mecameron
Starting Member
2 Posts

Posted - 2007-07-12 : 12:02:33
ClearTrace appears to blow up on large trace files. We recently tried to import a 9 GB trace file with 12.5 million rows. ClearTrace processed about 4.2 million rows but slowly consumed more memory in the process, starting from about 50 MB and finally topping out at around 600 MB. Are there any known limitations with large files?
graz
Chief SQLTeam Crack Dealer
4149 Posts

Posted - 2007-07-12 : 15:09:00
Yes. It doesn't work with 9 GB trace files :)

Seriously, I haven't tested it with files much over a few hundred MB. It stores the summary information in memory and then flushes with each file rollover, so I'm not surprised it had issues. I don't think I can get around this one easily.

===============================================
Creating tomorrow's legacy systems today.
One crisis at a time.
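A rough sketch of that pattern, in Python purely for illustration (the helper functions, event fields, and counters here are invented, not ClearTrace's actual code): with one flush per trace file, a single huge file gives the in-memory summary nowhere to go, so it just keeps growing.

from collections import defaultdict

# Illustrative only -- invented helpers and fields, not ClearTrace's
# real code. Summary rows accumulate in memory and are written to
# the database once per trace file.
def process_traces(trace_files, read_events, flush_to_db):
    for trace_file in trace_files:
        summary = defaultdict(lambda: {"count": 0, "cpu": 0, "reads": 0})
        for event in read_events(trace_file):
            row = summary[event["normalized_sql"]]
            row["count"] += 1
            row["cpu"] += event["cpu"]
            row["reads"] += event["reads"]
        # With rollover enabled each file stays small and the dict is
        # discarded here. A single 9 GB file means no flush for 12.5
        # million rows, so memory climbs the whole way through.
        flush_to_db(trace_file, summary)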
mecameron
Starting Member
2 Posts

Posted - 2007-07-13 : 09:34:52
Would it be possible to reset your state every 1 million lines or so? Wouldn't this be equivalent to having many smaller files?
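Sketched out, the suggestion would look something like this (same invented Python helpers as above; the exact threshold doesn't matter):

from collections import defaultdict

FLUSH_EVERY = 1_000_000  # the "every 1 million lines" reset point

def process_large_file(trace_file, read_events, flush_to_db):
    summary = defaultdict(lambda: {"count": 0, "cpu": 0, "reads": 0})
    for rows_seen, event in enumerate(read_events(trace_file), start=1):
        row = summary[event["normalized_sql"]]
        row["count"] += 1
        row["cpu"] += event["cpu"]
        row["reads"] += event["reads"]
        if rows_seen % FLUSH_EVERY == 0:
            # Treat the chunk boundary like a file rollover.
            flush_to_db(trace_file, summary)
            summary.clear()
    flush_to_db(trace_file, summary)  # flush whatever is left over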
graz
Chief SQLTeam Crack Dealer
4149 Posts

Posted - 2007-07-13 : 09:45:24
Not easily. Right now I group by the trace file in the database where I store this. If I add another grouping construct I need to modify the database and the client code. I'd love to do it, but I have other priorities I need to work on.

===============================================
Creating tomorrow's legacy systems today.
One crisis at a time.
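To make the constraint concrete (a hedged sketch with invented names; this is not ClearTrace's actual schema): the stored summaries are keyed by trace file alone, so a mid-file flush would need a new grouping column in the database and in every client query that reads it.

# Invented keys to illustrate the grouping change, not real schema.

# Today: summary rows are identified by trace file (plus the
# normalized query), so flushing the same file twice would
# collide or double-count.
current_key = ("bigtrace.trc", "SELECT * FROM orders WHERE id = @p1")

# Chunked flushing adds a grouping column that the schema and the
# client code would both have to carry:
chunked_key = ("bigtrace.trc", 3, "SELECT * FROM orders WHERE id = @p1")  # 3 = chunk number

# Every per-file report would then have to re-aggregate across
# chunks, roughly SUM(count) grouped by trace file and query.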