Carat
Yak Posting Veteran
92 Posts
Posted - 2010-05-19 : 12:30:52
We have more than 20 txt files that come from another system, and we load these 20 txt files into our OLTP SQL Server database. I want to know whether we are doing this using best practices or not. I will describe briefly how we work.

- For each txt file we have a different SSIS package that loads the txt file into an import table in a database called ImportDB. This table already has the correct datatypes for all the fields. This table also has a PK (identity field).
- In the SSIS package we give the right datatype to each field in the txt file using the "Flat File Connection Manager Editor".
- From here we do certain calculations on the import table (using a stored procedure in an "Execute SQL Task").
- When the data in the import table is correct, the data is inserted into our OLTP database.

Can somebody tell me whether this is the best way to do this, or is there a better way?

Some people say that it's better to load the data from the txt file into an import table in an ImportDB using varchar(1000) as the datatype for all the fields. Is this better?

Do we need a staging area (database)? Or do we need this only when loading data into a data warehouse (OLAP)?

What I basically want to know are the best practices for loading data into an OLTP database.

Thanks!
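For comparison, here is a minimal T-SQL sketch of the second approach mentioned above: land the file as-is into an all-varchar staging table, then validate and convert into a typed import table in one set-based step. The table and column names (StageCustomer, ImportCustomer, etc.) are made up for illustration, and TRY_CONVERT requires SQL Server 2012 or later; on older versions you would check with ISDATE/ISNUMERIC instead.

-- Hypothetical raw landing table: every column varchar so the file load itself never fails on bad data
CREATE TABLE dbo.StageCustomer
(
    StageID      int IDENTITY(1,1) PRIMARY KEY,
    CustomerCode varchar(1000) NULL,
    BirthDate    varchar(1000) NULL,
    CreditLimit  varchar(1000) NULL
);

-- Typed import table, matching the real OLTP datatypes
CREATE TABLE dbo.ImportCustomer
(
    ImportID     int IDENTITY(1,1) PRIMARY KEY,
    CustomerCode varchar(20)   NOT NULL,
    BirthDate    date          NULL,
    CreditLimit  decimal(12,2) NULL
);

-- Convert and validate in one pass; values that fail conversion become NULL, which you can report on
INSERT INTO dbo.ImportCustomer (CustomerCode, BirthDate, CreditLimit)
SELECT
    LEFT(LTRIM(RTRIM(CustomerCode)), 20),
    TRY_CONVERT(date, BirthDate, 103),            -- dd/mm/yyyy assumed; adjust the style code to your files
    TRY_CONVERT(decimal(12,2), CreditLimit)
FROM dbo.StageCustomer
WHERE CustomerCode IS NOT NULL;                   -- reject rows with no business key

The trade-off versus typing the columns in the Flat File Connection Manager is where a bad value surfaces: with typed SSIS columns the row fails (or is redirected) inside the data flow, while with an all-varchar stage every row lands and you decide in T-SQL what to reject or fix.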
naveengopinathasari
Yak Posting Veteran
60 Posts
Posted - 2010-05-25 : 05:49:57
Hi,

Let me try to give you a solution, as I understand it from your post.

1. Instead of a separate package for each txt file, you can use one package for all the txt files.
2. You need a staging DB in order to store the data for verification and history purposes if required; that is up to you, depending on your business requirements.

How you load the data essentially depends on your business requirements and the way you need to archive it.

Lets unLearn
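If you do keep a staging database for verification and history, one common pattern is to stamp each staged row with the source file and load time so old batches can be audited or purged later. This is only a sketch that extends the hypothetical StageCustomer table from the earlier example; the file name would typically come from the variable your SSIS Foreach Loop maps for each file.

-- Hypothetical audit columns so each staged row records where and when it came from
ALTER TABLE dbo.StageCustomer ADD
    SourceFile nvarchar(260) NULL,                      -- set from the Foreach Loop's file name variable
    LoadedAt   datetime      NOT NULL DEFAULT (GETDATE());

-- Purge staged history older than the retention your business requires (90 days assumed here)
DELETE FROM dbo.StageCustomer
WHERE LoadedAt < DATEADD(day, -90, GETDATE());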
webfred
Master Smack Fu Yak Hacker
8781 Posts
Posted - 2010-05-25 : 06:02:24
quote: Originally posted by Carat

What I basically want to know are the best practices for loading data into an OLTP database.
I would do it in a similar way. If there is no problem, then IMHO there is no need to change anything. It depends on the quality of your txt files.

No, you're never too old to Yak'n'Roll if you're too young to die.
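A quick way to judge that quality before the final insert is to count how many staged rows would be rejected and why. This is a sketch against the hypothetical StageCustomer table from the earlier example, again assuming TRY_CONVERT (SQL Server 2012 or later):

-- Hypothetical data-quality check on the staged data before loading the OLTP tables
SELECT
    SUM(CASE WHEN CustomerCode IS NULL OR LTRIM(RTRIM(CustomerCode)) = '' THEN 1 ELSE 0 END)                  AS MissingKey,
    SUM(CASE WHEN BirthDate   IS NOT NULL AND TRY_CONVERT(date, BirthDate, 103)          IS NULL THEN 1 ELSE 0 END) AS BadDates,
    SUM(CASE WHEN CreditLimit IS NOT NULL AND TRY_CONVERT(decimal(12,2), CreditLimit)    IS NULL THEN 1 ELSE 0 END) AS BadNumbers,
    COUNT(*)                                                                                                   AS TotalRows
FROM dbo.StageCustomer;

If those counts are always zero, typing the columns directly in the Flat File Connection Manager (as the original setup does) is simpler; if they are not, an all-varchar stage gives you a place to inspect and report the bad rows before they reach the OLTP database.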