Please start any new threads on our new site at https://forums.sqlteam.com. We've got lots of great SQL Server experts to answer whatever question you can come up with.

 Help needed with speeding up insert


Bill_C
Constraint Violating Yak Guru

299 Posts

Posted - 2012-07-03 : 08:10:34
After being unable to accomplish what I tried in another thread,
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=176201
I am now having a measure of success with the following, but I could do with a little help in speeding things up if possible.



I have a table (Table1) with data in a single varchar(max) field, which is a comma-delimited list holding the data of 1,250 separate pieces of info, e.g.


"data for field1","data for field2","data for field3","data for field4","data for field5","data for field6","data for field7","","","","","","data for field1250"

(all the above in one field as a comma-delimited list - so there are about 1 million rows of data, each row as shown above)

To receive this data I have created a table (Table2) with 1,250 columns of type SPARSE VARCHAR(50), which also has a column set (cs) of type XML.
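(For reference, a table along those lines might look like the following sketch - the column names and the 1,250-column range are assumptions based on the description above:)

```sql
-- Sketch of the target table: 1,250 sparse varchar(50) columns
-- plus an XML column set covering all of them.
CREATE TABLE dbo.Table2
(
    field1    VARCHAR(50) SPARSE NULL,
    field2    VARCHAR(50) SPARSE NULL,
    -- ... field3 through field1249 declared the same way ...
    field1250 VARCHAR(50) SPARSE NULL,
    cs        XML COLUMN_SET FOR ALL_SPARSE_COLUMNS
);
```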

At present, I am trying to populate the table in the following way (which is VERY SLOW)

------------------------------------------------------------
DECLARE @1a VARCHAR(MAX),@sql VARCHAR(MAX),@Rows INT, @IdCol INT
DECLARE fld_cursor CURSOR FAST_FORWARD FOR
SELECT field1 FROM [Table1]

OPEN fld_cursor

FETCH NEXT FROM fld_cursor
INTO @1a

WHILE @@FETCH_STATUS = 0
BEGIN
--double any embedded apostrophes first so the dynamic SQL stays valid,
--then replace the double quotes with single quotes
set @1a = REPLACE(REPLACE(@1a,'''',''''''),'"','''')

--create an sql statement for executing

set @sql = '
INSERT [Table2]
(
field1,
field2,
field3,
............
field1250
)
VALUES ('+@1a+')'

EXEC (@sql)

FETCH NEXT FROM fld_cursor
INTO @1a
END
CLOSE fld_cursor
DEALLOCATE fld_cursor

-------------------------------------------------------------------

Is there a faster way of populating Table2?

Can someone help with this?

Thanks

robvolk
Most Valuable Yak

15732 Posts

Posted - 2012-07-03 : 09:57:20
You're better off splitting this into 2 tables with fewer than 1,024 columns each, like you did in the other thread, then using bcp or BULK INSERT with format files to import only the columns specific to those tables. If there's a key column in the file, be sure to include it in both tables and make sure the format files include it for importing. Otherwise use an identity column with the same seed and increment in both tables, and make sure to reseed or truncate both tables before importing.

You can still keep the sparse table for regular use, just write a query to join the 2 import tables and insert into the sparse table.
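(A minimal sketch of that approach - the table names, file paths, format files, and the RowId key column are all placeholders, not anything from the posts above:)

```sql
-- Load each half of the file into its own import table, with each
-- format file mapping only that table's columns from the source file.
BULK INSERT dbo.Import1
FROM 'C:\data\table1_export.csv'
WITH (FORMATFILE = 'C:\data\import1.fmt');

BULK INSERT dbo.Import2
FROM 'C:\data\table1_export.csv'
WITH (FORMATFILE = 'C:\data\import2.fmt');

-- Then join the two halves on the shared key column
-- and load the sparse table in one set-based insert.
INSERT dbo.Table2 (field1, ..., field1250)
SELECT a.field1, ..., b.field1250
FROM dbo.Import1 AS a
JOIN dbo.Import2 AS b ON a.RowId = b.RowId;
```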

Bill_C
Constraint Violating Yak Guru

299 Posts

Posted - 2012-07-03 : 10:23:10
Thanks, I did consider two tables but was hoping there would be a faster way of populating the one table.

I'll bear that in mind whilst trying out some quicker way of writing the strings read out of table1 into table2.