Please start any new threads on our new site at https://forums.sqlteam.com. We've got lots of great SQL Server experts to answer whatever question you can come up with.

 SQL Server DTS: detect too many columns in source


ajays
Starting Member

1 Post

Posted - 2006-05-31 : 14:33:34
I have a DTS package that imports 50-70 MB of data from a number of flat text files (CSV).
My pre-processing logs an error for missing files, and when a source file has fewer fields than the transformation uses, DTS detects it and we log an error for that too.

What I don't have is a way to detect too many columns in the text file. We need a reliable way to catch bad records with extra comma-separated columns, or a source export whose format changes to add columns in the middle or at the end, whether on purpose or by accident.

For example, the source file has 5 fields, and the transformation maps all 5 into a destination table (the table may have additional columns that aren't used). If the text file has only 4 columns, DTS raises an error because a mapped column is missing. But if it has 6 columns, no error is raised, and we want to detect one.
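One avenue, outside DTS itself, would be to extend the pre-processing to count delimiters per line before the import runs. A minimal sketch of that check, assuming a plain comma-delimited file with no quoted or embedded commas (the expected field count of 5 matches the example above):

```python
# Pre-import check: flag lines whose field count differs from the expected 5.
# Assumes plain comma-delimited data with no quoted/embedded commas.
EXPECTED_FIELDS = 5

def bad_lines(path, expected=EXPECTED_FIELDS):
    """Return (line_number, field_count) for every line with the wrong count."""
    bad = []
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            count = len(line.rstrip("\n").split(","))
            if count != expected:
                bad.append((lineno, count))
    return bad
```

A wrapper around the import could refuse to run (or quarantine the file) whenever `bad_lines` returns a non-empty list, which would catch both the 4-column and the 6-column cases before DTS ever sees the file.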

We tried inspecting dynamic properties from an ActiveX Script task, but that didn't work, and you can't run a SQL query against a text-file source.

Any ideas?

Thanks

khtan
In (Som, Ni, Yak)

17689 Posts

Posted - 2006-05-31 : 21:18:28
You can import the CSV into a single-column staging table and perform the checking in SQL. You can make use of the split function here: http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=55210&SearchTerms=CSV,Splitting%20delimited%20lists,Best%20split%20functions
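The column-count check this staging approach enables boils down to counting delimiters per staged line; in T-SQL the usual idiom is `LEN(line) - LEN(REPLACE(line, ',', ''))`. The same logic, sketched in Python for illustration (the expected width of 5 and the no-quoted-fields assumption carry over from the question):

```python
# The delimiter-count check the staging-table approach implies, sketched in
# Python. In T-SQL the comma count per staged row would be
# LEN(line) - LEN(REPLACE(line, ',', '')).
def comma_count(line):
    """Number of commas in one staged line (no quoted-field handling)."""
    return len(line) - len(line.replace(",", ""))

def rows_with_extra_columns(staged_rows, expected_fields=5):
    """Rows whose field count (commas + 1) exceeds the expected width."""
    return [row for row in staged_rows
            if comma_count(row) + 1 > expected_fields]
```

Rows flagged this way can be logged or diverted before the split function parses the good rows into the real destination table.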


KH
