JAdauto
Posting Yak Master
160 Posts
Posted - 2012-08-22 : 13:52:17

I have the following bulk insert:

BULK INSERT #Data
FROM ''' + @FilePathName + '''
WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2 )

but I have a file that I'm importing that sometimes has a couple of bad rows of data. For example, it might be missing the price column on row 5 (both the price and the comma). Is there a way to check each row of data to see if it has a valid number of columns?

Thanks!
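For context, a minimal sketch of how that escaped fragment might sit inside a complete dynamic SQL statement; the @SQL variable, the EXEC call, and the sample file path are assumptions and not part of the original post:

-- #Data is assumed to already exist in the session.
DECLARE @FilePathName NVARCHAR(260) = N'C:\imports\prices.csv';  -- hypothetical path
DECLARE @SQL NVARCHAR(MAX);

-- Build the BULK INSERT statement with the file path spliced in,
-- which is why the quotes appear doubled in the fragment above.
SET @SQL = N'BULK INSERT #Data FROM ''' + @FilePathName + '''
    WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2 )';

EXEC (@SQL);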
xhostx
Constraint Violating Yak Guru
277 Posts
Posted - 2012-08-22 : 14:15:40

Do you have to use the bulk insert? The Import/Export Wizard can do the job; it will allow you to specify the TextQualifier.

--------------------------
Joins are what RDBMS's do for a living
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2012-08-22 : 14:29:30

quote: Originally posted by JAdauto

I have the following bulk insert:

BULK INSERT #Data
FROM ''' + @FilePathName + '''
WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2 )

but I have a file that I'm importing that sometimes has a couple of bad rows of data. For example, it might be missing the price column on row 5 (both the price and the comma). Is there a way to check each row of data to see if it has a valid number of columns?

Thanks!

If it's missing the delimiter as well, then it would be difficult. If you have a blank placeholder it will work fine. How is this file generated?

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
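For illustration only (these rows are hypothetical, since the file layout has not been described at this point in the thread): a row that keeps the comma as a blank placeholder, e.g.

WidgetA,,10

still has the expected number of delimiters and loads into the right columns, while a row that drops both the value and its comma, e.g.

WidgetA,10

can no longer be mapped to the correct columns by BULK INSERT.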
JAdauto
Posting Yak Master
160 Posts
Posted - 2012-08-23 : 09:39:31

I'm not sure how the file is generated. We receive it from a 3rd party. It is using BULK INSERT now, but that is how I inherited the code. I'm not completely locked into that, although I'm trying to make the fewest invasive changes possible.
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2012-08-23 : 10:05:52

quote: Originally posted by JAdauto

I'm not sure how the file is generated. We receive it from a 3rd party. It is using BULK INSERT now, but that is how I inherited the code. I'm not completely locked into that, although I'm trying to make the fewest invasive changes possible.

If intermediate columns are missing without placeholders, then it's really difficult to process it.

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
JAdauto
Posting Yak Master
160 Posts
Posted - 2012-10-09 : 16:10:22

I know this issue is old, but I'm back to it. I have the following example in my CSV:

LastName,FirstName,Number,Size
Doe,John,101,Med
Doe,Jane, 102,Med
Williams,Frank,201,Large
Williams,202,Large
Howard,Henry,301,Small

Notice that record #4 is missing the FirstName column completely (the comma is missing as well). I run BulkCopy to import that into a temp table and then evaluate the data and insert it into a permanent table. The columns of my permanent table include the following fields: lastname, firstname, number, size, active.

The real problem I am having is not that record #4 is not inserting as it should (bad data). The issue I am having is that record #5 is jacked up because of record #4, and then a 0 gets put into Active for record #5.

Has anyone seen anything like this before, where the record following a bad row also gets messed up?
visakh16
Very Important crosS Applying yaK Herder
52326 Posts
Posted - 2012-10-09 : 22:58:42

quote: Originally posted by JAdauto

I know this issue is old, but I'm back to it. I have the following example in my CSV:

LastName,FirstName,Number,Size
Doe,John,101,Med
Doe,Jane, 102,Med
Williams,Frank,201,Large
Williams,202,Large
Howard,Henry,301,Small

Notice that record #4 is missing the FirstName column completely (the comma is missing as well). I run BulkCopy to import that into a temp table and then evaluate the data and insert it into a permanent table. The columns of my permanent table include the following fields: lastname, firstname, number, size, active. The real problem I am having is not that record #4 is not inserting as it should (bad data). The issue I am having is that record #5 is jacked up because of record #4, and then a 0 gets put into Active for record #5. Has anyone seen anything like this before, where the record following a bad row also gets messed up?

I think the only way in such cases is to pull the entire row of data into a single-column table and then use parsing logic based on the "," delimiter to get the individual values out.

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
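A minimal sketch of that single-column staging approach, using the four-column sample above; the file path, table name, and the PARSENAME-based split are illustrative assumptions, not code from the thread:

-- Stage every raw line into one wide column so a short row cannot bleed into the next one.
CREATE TABLE #RawLines (RawLine VARCHAR(4000));

BULK INSERT #RawLines
FROM 'C:\imports\people.csv'   -- hypothetical path; the default field terminator (tab) is assumed not to appear in the data
WITH ( ROWTERMINATOR = '\n', FIRSTROW = 2 );

-- Keep only lines with the expected number of delimiters (3 commas = 4 columns),
-- then split them. PARSENAME handles up to 4 dot-separated parts, so it works here
-- as long as the data itself contains no periods; longer layouts need a real splitter.
SELECT PARSENAME(Parts, 4) AS LastName,
       PARSENAME(Parts, 3) AS FirstName,
       PARSENAME(Parts, 2) AS Number,
       PARSENAME(Parts, 1) AS Size
FROM (
    SELECT REPLACE(RawLine, ',', '.') AS Parts
    FROM #RawLines
    WHERE LEN(RawLine) - LEN(REPLACE(RawLine, ',', '')) = 3   -- exactly 3 commas
) AS Valid;

Rows that fail the comma-count check (such as the Williams,202,Large line in the sample) can be routed to an error table for review instead of silently corrupting the row that follows them.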