
Mixed input file having rows and columns into tables


amrita
Starting Member

1 Post

Posted - 2011-05-13 : 18:19:50
I am trying to set up a .fmt file using the bcp utility, or anything else that would let me run a script to import data into the database tables. I have previously been able to create a simple .fmt file, mapped directly to the table the data is loaded into, by generating it from that table:
bcp cmsDatabase.dbo.Table1 format nul -T -c -f Table1Format.fmt

9.0
3
1  SQLCHAR  0  12   "\t"    1  Col1_Table1  ""
2  SQLCHAR  0  100  "\t"    2  Col2_Table1  SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR  0  0    "\r\n"  3  Col3_Table1  SQL_Latin1_General_CP1_CI_AS
Table1, from which the format file was created: Col1_Table1 int (primary key), Col2_Table1 nvarchar(50), Col3_Table1 nvarchar(max)
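That format file works fine for a straightforward single-table load; for example, a script-based import along these lines (the paths below are only placeholders for wherever the data and format files sit):

BULK INSERT cmsDatabase.dbo.Table1
FROM 'C:\Data\Table1Data.txt'                   -- placeholder data file
WITH (FORMATFILE = 'C:\Data\Table1Format.fmt'); -- the format file shown above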

Now I'm facing an issue. I have an input .txt/.csv file (a delimited flat file) whose data cannot be imported directly into one table; it has to go into several tables.

Example input file (note how the fields are enclosed in quotation marks, while the values meant for the other table's rows are separated by commas):

"Col1_Table1x" "Col2_Table1x" "Col3_Table1x" "Col1_Table2x, Col1_Table2y, Col1_Table2z"

In the above file, values have to go into:

Col1_Table1x -> Col1_Table1 column of Table1
Col2_Table1x -> Col2_Table1 column of Table1
Col3_Table1x -> Col3_Table1 column of Table1

so that Table1 contains:

Col1_Table1 Col2_Table1 Col3_Table1
-------------------------------------------
Col1_Table1x Col2_Table1x Col3_Table1x

This is the part I'm finding tricky, especially since the last field contains both quotation marks and commas, and it also represents multiple rows in a single table:
"Col1_Table2x, Col1_Table2y, Col1_Table2z, ....." -> multiple records to be placed in Table2, populating Col1_Table2 with the comma-separated values and with Col1_Table1 as the foreign key.

i.e. Table2 should have

Col1_Table1 Col1_Table2
----------------------------------------------
Col1_Table1x Col1_Table2x
Col1_Table1x Col1_Table2y
Col1_Table1x Col1_Table2z
where the Col1_Table2 values need to go in as separate rows of Table2 (they are comma-separated), and Col1_Table1 is a foreign key for Table2, so it needs to be copied over too.

Is there a way to create a .fmt file that will allow this kind of mixed input to be copied over?

Additional details: each time I need to load the tables from this input file, I can truncate all the old data and re-populate. Any of the columns can contain special characters such as <, ", &, etc., so is there a way to handle that too?

sunitabeck
Master Smack Fu Yak Hacker

5155 Posts

Posted - 2011-05-13 : 18:50:35
I have used bcp only in its basic forms - never with sophisticated format files, so my opinion is biased. With that caveat, if I had to do this, I would first import the data into a staging table. The staging table could even be one with a single column of type varchar(max). Then, write T-SQL queries to parse the data and put it into appropriate tables.
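A rough sketch of that first step (the staging table name and file path are made up, and the options may need adjusting for the real file):

-- One wide column that receives each input line untouched.
CREATE TABLE dbo.StagingRaw
(
    RawLine varchar(max) NULL
);

BULK INSERT dbo.StagingRaw
FROM 'C:\Data\InputFile.txt'   -- placeholder path
WITH
(
    ROWTERMINATOR   = '\n',    -- one record per line; '\n' is treated as CRLF on import
    FIELDTERMINATOR = '\0'     -- a terminator that should never occur, so the whole line lands in one column
);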

If the input data is jagged (i.e., the number of space-delimited, double-quoted fields can vary from row to row), importing it into a staging table with a single column may be your only option. If it is not jagged, you could import it into a table with multiple columns and then parse the data.
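To illustrate the parsing, here is a sketch against that single-column staging table. It assumes exactly four double-quoted fields per line separated by single spaces, matching your example, and sticks to 2008-era T-SQL (no STRING_SPLIT); everything other than Table1/Table2 and their columns is an invented name:

-- 1) Break each staged line into the three Table1 values plus the trailing comma list.
SELECT
    SUBSTRING(s.Body, 1, x.p1 - 1)               AS Col1_Table1,
    SUBSTRING(s.Body, x.p1 + 3, x.p2 - x.p1 - 3) AS Col2_Table1,
    SUBSTRING(s.Body, x.p2 + 3, x.p3 - x.p2 - 3) AS Col3_Table1,
    SUBSTRING(s.Body, x.p3 + 3, LEN(s.Body))     AS Table2List
INTO #Parsed
FROM
(
    -- Strip the outer quotes; the fields inside are separated by the 3-character string '" "'.
    SELECT SUBSTRING(RawLine, 2, LEN(RawLine) - 2) AS Body
    FROM dbo.StagingRaw
    WHERE LEN(RawLine) > 2
) AS s
CROSS APPLY (SELECT CHARINDEX('" "', s.Body) AS p1)                       AS a
CROSS APPLY (SELECT CHARINDEX('" "', s.Body, a.p1 + 3) AS p2)             AS b
CROSS APPLY (SELECT a.p1, b.p2, CHARINDEX('" "', s.Body, b.p2 + 3) AS p3) AS x;

-- 2) One row per input line goes to Table1.
INSERT INTO dbo.Table1 (Col1_Table1, Col2_Table1, Col3_Table1)
SELECT Col1_Table1, Col2_Table1, Col3_Table1
FROM #Parsed;

-- 3) The comma list becomes one row per value in Table2, carrying Col1_Table1
--    along as the foreign key. Recursive split; add OPTION (MAXRECURSION 0)
--    after the INSERT if a single line can hold 100+ values.
;WITH SplitList AS
(
    SELECT Col1_Table1,
           LTRIM(LEFT(Table2List + ',', CHARINDEX(',', Table2List + ',') - 1)) AS Col1_Table2,
           STUFF(Table2List + ',', 1, CHARINDEX(',', Table2List + ','), '')    AS Rest
    FROM #Parsed
    WHERE Table2List <> ''
    UNION ALL
    SELECT Col1_Table1,
           LTRIM(LEFT(Rest, CHARINDEX(',', Rest) - 1)),
           STUFF(Rest, 1, CHARINDEX(',', Rest), '')
    FROM SplitList
    WHERE LEN(Rest) > 0
)
INSERT INTO dbo.Table2 (Col1_Table1, Col1_Table2)
SELECT Col1_Table1, Col1_Table2
FROM SplitList;

One nice side effect of doing it with plain string functions (rather than, say, an XML trick) is that characters like < and & in the data need no special handling, though a literal double quote embedded inside a value would still need extra care.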