lnt_sql
Starting Member
6 Posts
Posted - 2010-07-21 : 11:26:25
I have a flat file that I read into my database tables. I want a mechanism to identify duplicate records and move them to an error log table. How do I identify the duplicated records effectively? Please note that I don't want to move all of the repeated records to the error log: every repeated record except the first one should go into it. So I can't use the Aggregate component (it sends both copies) or the Sort component (it removes the duplicates). I might have to use a Script component to group the records by their primary keys, number the records within each group, and move all those with a number > 1. But this solution is fairly complicated, and I would like to know if there is a simpler and more efficient way to do the same thing. Awaiting a reply... thanks in advance...
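If the flat file can be staged into a SQL Server table first, the row-numbering idea described in the question can be done in T-SQL instead of a Script component. A rough sketch, assuming hypothetical names (dbo.StagingOrders for the staged rows, dbo.ErrorLog for the rejects, and OrderID/LineNo as the duplicate key):

-- Number each row within its key group; the first occurrence gets 1.
;WITH Numbered AS
(
    SELECT OrderID, LineNo,
           ROW_NUMBER() OVER (PARTITION BY OrderID, LineNo
                              ORDER BY (SELECT NULL)) AS rn
    FROM dbo.StagingOrders
)
-- Remove every repeat (rn > 1) from the staging table and capture it
-- in the error log table in the same statement.
DELETE FROM Numbered
OUTPUT DELETED.OrderID, DELETED.LineNo
INTO dbo.ErrorLog (OrderID, LineNo)
WHERE rn > 1;

After this runs, the staging table holds exactly one copy of each key and can be loaded into the destination tables.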
RIKIL
Starting Member
20 Posts
Posted - 2010-07-28 : 14:40:18
One thing we do here is load the data into a temporary table and then remove the dupes using an outer join.
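Here is one way that suggestion could look in T-SQL. This is only a sketch of one reading of it (duplicates = staged rows whose key already exists in the destination); the table and column names (#Staging, dbo.Orders, OrderID, LineNo) are placeholders:

-- Load the flat file into #Staging first (BULK INSERT, bcp, or SSIS),
-- then insert only the rows that have no match in the destination.
INSERT INTO dbo.Orders (OrderID, LineNo)
SELECT s.OrderID, s.LineNo
FROM #Staging AS s
LEFT OUTER JOIN dbo.Orders AS o
    ON  o.OrderID = s.OrderID
    AND o.LineNo  = s.LineNo
WHERE o.OrderID IS NULL;   -- NULL on the outer side = no match = not a dupe

-- The rows that did match can be copied to the error log table by
-- turning the same join into an inner join, if they need to be kept.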