
 Memory Used by SQL Server and DTS


AjarnMark
SQL Slashing Gunting Master

3246 Posts

Posted - 2009-05-15 : 13:08:08
We have a nightly SQL Job that sporadically fails (once or twice per week) in the middle of the night. The job consists of a couple of dozen steps that each call different DTS packages which in turn import data from some Pervasive SQL tables into our SQL Server 2000 tables.

The job failure is a nuisance, though we have it designed so that recovery is relatively easy; still, if we could find a way to eliminate the failures entirely, that would be even better. Unfortunately, my prioritized workload is such that I cannot spend a bunch of time analyzing this. I have a note from a former coworker suggesting that she thought the problem was some sort of periodic low-memory issue.

The machine has 8 GB of RAM now (4 instances of SQL 2000 Enterprise SP3, on Windows Server 2003 Enterprise). Does DTS use the same memory block that SQL Server does? What are the chances that if we just slap a couple more gigs of RAM into the machine that DTS will use that vs. the chances the RAM won't be touched? What are the chances that my former coworker was out of her mind?

Or, with four instances running on this one server, might it be more beneficial to tweak memory allocation between the instances to provide more memory to the dominant instance? Would that have any impact on DTS?

Opinions anyone?

--------------------------------------------
Brand yourself at EmeraldCityDomains.com

mcrowley
Aged Yak Warrior

771 Posts

Posted - 2009-05-18 : 10:16:40
I believe (but cannot prove) that DTS will use its own memory for most of the execution, and it will show up under DTSRun.exe in Task Manager. Naturally, reading data from an instance and writing data to an instance will cause some memory pressure on those instances as well. Do you batch the inserts, by any chance? Setting an insert batch size will slow down the package a bit, but it may also reduce the memory footprint. Normally batching the inserts is done to be kind to the transaction log, though.
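In DTS itself the knob is the "Insert batch size" setting on the Transform Data Task's Options tab, if memory serves. To illustrate the underlying idea in plain T-SQL (outside of DTS), a batched load looks roughly like this; the table and column names are made up for the example:

```sql
-- Hypothetical sketch: move rows from a staging table into the target
-- in fixed-size batches so each transaction (and its log usage) stays small.
SET ROWCOUNT 1000          -- SQL 2000 style: limit rows affected per statement

WHILE 1 = 1
BEGIN
    BEGIN TRANSACTION

    -- Each pass picks up the next batch of rows not yet copied over.
    INSERT INTO dbo.TargetTable (ID, SomeValue)
    SELECT s.ID, s.SomeValue
    FROM dbo.StagingTable s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable t WHERE t.ID = s.ID)

    IF @@ROWCOUNT = 0
    BEGIN
        COMMIT TRANSACTION
        BREAK                -- nothing left to copy
    END

    COMMIT TRANSACTION
END

SET ROWCOUNT 0             -- always reset when done
```

Committing every N rows is what keeps the log (and, possibly, the working memory of the copy) from ballooning; a batch size of 0 in DTS means commit everything in a single batch.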

Sorry for the blatant guesses above; take it all with a shaker of salt, and test copiously.

AjarnMark
SQL Slashing Gunting Master

3246 Posts

Posted - 2009-05-18 : 11:56:57
I appreciate the ideas. I had also suspected that DTS uses its own memory allocation, though like you I have not proven it. We are not batching the inserts at this time; I will keep that option in mind.

I am in the process of tweaking the memory allocation between the four instances to see if we get different results.
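In case it helps anyone following along, the setting I mean is the standard per-instance memory cap, run separately against each instance; the values below are only placeholders, not recommendations:

```sql
-- Run against each instance individually. Values are in MB and are
-- example figures only; size them to fit your own instances.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE

-- Cap the instance so the four of them do not fight over the same RAM.
EXEC sp_configure 'max server memory (MB)', 4096
RECONFIGURE

-- Optionally guarantee a floor for the dominant instance as well.
EXEC sp_configure 'min server memory (MB)', 1024
RECONFIGURE
```

One caveat I should note: on 32-bit SQL 2000 Enterprise, an instance can only reach memory beyond the normal user address space with AWE enabled, and with AWE the memory grant behaves as a fixed allocation rather than a dynamic one, so the caps matter even more.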

--------------------------------------------
Brand yourself at EmeraldCityDomains.com
   