Chris Orlando
Starting Member
9 Posts
Posted - 2005-06-02 : 16:36:00
I have a flat file I need to import nightly with 135 million rows in it. I only need about 35 million of the rows, so I have 100 million rows of garbage. Obviously, this is taking a long time and I would like to speed up the process.

The value of Column001 in my file is the same as a key column value in one of my tables. I want to look up that value in the table during transformation and, if it is in there, allow the insert. If it is not, I want to ignore the row.

What is the best and fastest way to implement this? Unfortunately, I am stuck with this file just to get the information I need. I thought you were supposed to use the data driven task, but it is not working for the flat file. Can anyone help me?

Thanks,
Chris
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2005-06-02 : 16:40:15
I would BULK INSERT all of the data into a staging table. The staging table would match the layout of the file. Add an index to Column001 after the import. Then use T-SQL to move the data from the staging table to your table, like this:

INSERT INTO YourTable (ColumnList)
SELECT ColumnList
FROM StagingTable
WHERE Column001 NOT IN (SELECT ColumnName FROM YourTable)

Tara
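Tara's approach can be sketched end to end in T-SQL. All object names here (StagingTable, YourTable, KeyTable, the file path) are placeholders, and the field/row terminators are assumptions about the flat file's format; adjust them to match the real feed.

```sql
-- Load the whole flat file into a staging table that mirrors its layout.
-- File path and terminators are hypothetical.
BULK INSERT dbo.StagingTable
FROM 'D:\feeds\nightly.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

-- Index the lookup column after the load so the filter below can seek
-- instead of scanning 135 million rows.
CREATE INDEX IX_Staging_Column001 ON dbo.StagingTable (Column001);

-- Chris wants to KEEP rows whose Column001 exists in his key table,
-- so the filter here is IN; Tara's NOT IN example shows the opposite
-- filter, for skipping rows already present in the target.
INSERT INTO dbo.YourTable (ColumnList)
SELECT s.ColumnList
FROM dbo.StagingTable s
WHERE s.Column001 IN (SELECT k.KeyColumn FROM dbo.KeyTable k);
```

With roughly 100 million unwanted rows, the set-based filter in step three is where the time is saved: the bulk load is minimally logged and fast, and the index lets SQL Server discard non-matching rows without a row-by-row lookup.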
Chris Orlando
Starting Member
9 Posts
Posted - 2005-06-02 : 16:43:04
OK... Is BULK INSERT really that much faster?
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2005-06-02 : 16:47:04
Yep. It's the fastest method for importing rows from a file. From fastest to slowest, it goes BULK INSERT, bcp, then DTS.

Tara