aurodenver190
Starting Member
2 Posts
Posted - 2010-10-04 : 20:41:15
Hi,

I have an SSIS package that populates almost all of my dimension tables, around 20 in total. The package doesn't involve much logic: for roughly 80 percent of those 20 tables it simply reads from staging and writes to the destination. All columns in both staging and destination are NVARCHAR and BIGINT. The package has 3 large containers, each holding a smaller container per table; every per-table container truncates the table, inserts a row for the unknown member, and runs a data flow task that reads staging and populates the destination.

Now the problem: we have 6 very large junk dimensions, which we need at this point (I don't want to get into the design discussion). The last big container fills one of the large dimensions, e.g. sales order, which has 40M records; that load runs in a small container. We then have 3 parallel processes that look up against those 40M records, plus another 80M records, to populate the other 4 large dimensions.

I have observed a significant drop in performance once execution reaches that point. The package fails to swap buffers, data transfer becomes very slow, and eventually it fails. Those table loads include Union and Lookup transformations (the Lookup takes almost all of the memory; I've watched memory usage climb very rapidly). That one container takes almost 18 GB for lookups, and we have only 40 GB available.

Is there a way to control this? Right now we have to reboot the server to get the process through, which I'd rather avoid. I have played around with the settings: if I set the buffer limit, it takes a long time to process the rows; if I leave it unrestricted, it processes fast. I can explain more if you have any further questions.

sam
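As a rough sanity check on the ~18 GB figure: a full-cache Lookup in SSIS loads the entire reference set into memory before rows start flowing, so its cache grows linearly with reference row count and row width. A minimal Python sketch of that arithmetic, using the 40M-row count from the post; the 480 bytes-per-row figure is a hypothetical illustration (a BIGINT key plus a few wide NVARCHAR columns at 2 bytes per character), not a measured value:

```python
# Back-of-the-envelope estimate of a full-cache Lookup's memory footprint.
# The per-row width below is an assumption for illustration only.
rows = 40_000_000                 # reference rows in the sales-order dimension (from the post)
assumed_bytes_per_row = 480       # hypothetical: BIGINT key + several NVARCHAR columns

cache_gb = rows * assumed_bytes_per_row / 1024**3
print(f"estimated cache: {cache_gb:.1f} GB")  # ~17.9 GB, in line with the ~18 GB observed
```

The point of the arithmetic is that the cache size is driven by row count times row width, which suggests two levers besides buffer tuning: uncheck every reference column the Lookup doesn't actually need, or switch the Lookup from full cache to partial cache so it no longer pre-loads all 40M rows.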