Best method to update a RS with much data

Author  Topic 

hyefive
Starting Member

16 Posts

Posted - 2007-09-20 : 19:43:02
Hi,

I have a delimited text file containing over 10,000 residential property records, with each record containing over 100 fields of data.

My question is what is the most efficient method of loading them into a db table, in terms of SQL Server CPU and Duration? This is a shared SQL Server, and the hosting company has limits...

I've tried everything to streamline the process, but would like to do more. If anyone has any suggestions, I would greatly appreciate it.

My latest effort has been as follows, with much stripped out to make it easier to read:

update.asp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
<%@ Language=VBScript %>
<%
Set MyConnection = Server.CreateObject("ADODB.Connection")
MyConnection.Open(Session("MyConnectionString"))
Set cmdTemp = Server.CreateObject("ADODB.Command")
cmdTemp.CommandType = adCmdText
Set cmdTemp.ActiveConnection = MyConnection

cmdTemp.CommandText = "SELECT * INTO dbo.TABLE2 FROM TABLE"
cmdTemp.Execute

cmdTemp.CommandText = "TRUNCATE TABLE TABLE2"
cmdTemp.Execute

Set rsDataUpload = Server.CreateObject("ADODB.Recordset")

rsDataUpload.Open "SELECT LST_NUMB FROM TABLE ORDER BY LST_NUMB", MyConnection, adOpenForwardOnly, adLockReadOnly, adCmdText
If NOT rsDataUpload.EOF Then
sLST_NUMB = rsDataUpload.GetString()
End If
rsDataUpload.Close

rsDataUpload.CursorLocation = adUseClient
rsDataUpload.Open "up_SEL_TABLE2", MyConnection, adOpenStatic, adLockBatchOptimistic, adCmdStoredProc
'** This sp is "SELECT * FROM TABLE2"
Set rsDataUpload.ActiveConnection = Nothing

Set fso = CreateObject("Scripting.FileSystemObject")
Set f = fso.GetFile (PATH)
Set ts = f.OpenAsTextStream(ForReading, TristateUseDefault)

Do Until ts.AtEndOfStream = True
TextStr = ts.ReadLine

Columns = Split(TextStr, "|")
LST_NUMB = Trim(Columns(0))
CITY_NAME = UCase(Trim(Columns(2)))
etc...
ROOF_MTRL = Trim(Columns(125))

* This section performs extensive string manipulation, i.e.
str = ""
str = Replace(ROOF_MTRL, "BITU", "Bituthane")
str = Replace(str, "CEME", "Cement")
str = Replace(str, "COMP", "Comp Shingle")
str = Replace(str, "SING", "Shingle")
str = Replace(str, "SHAK", "Shake")
str = Replace(str, "SLAT", "Slate")
str = Replace(str, "TARG", "Tar/Gravel")
ROOF_MTRL = SQLFix(str)

etc...

rsDataUpload.AddNew
If INSTR(sLST_NUMB, LST_NUMB) = 0 Then  '** INSTR returns 0 when not found
rsDataUpload("NEWLIST") = "Y"
End If
rsDataUpload("LST_NUMB") = LST_NUMB
rsDataUpload("CITY_NAME") = CITY_NAME
etc...
rsDataUpload("ROOF_MTRL") = ROOF_MTRL
rsDataUpload.Update

Loop

rsDataUpload.ActiveConnection = MyConnection
rsDataUpload.UpdateBatch
rsDataUpload.Close

Set rsDataUpload = Nothing
Set cmdTemp = Nothing
MyConnection.Close
Set MyConnection = Nothing
%>
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<

Kristen
Test

22859 Posts

Posted - 2007-09-21 : 03:57:54
You appear to be walking through TABLE row by row.

Getting a SELECT * FROM TABLE2 for that specific row [I assume]

Then you are checking whether each property number read from your delimited text file already appears in that list, and setting a NEWLIST flag depending on that.

And then inserting the new data from the delimited file.

I think you would be better off doing this "set-based" in SQL. Here's what I would do:

Bulk import the file to a "staging table" in your database. (Create or Truncate the table first)
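On SQL Server 2000 the bulk import step might look something like this (the file path and staging table name are just placeholders, not from your setup):

```
-- Sketch only: adjust path, table name, and terminators to your file
TRUNCATE TABLE MyStagingTable

BULK INSERT MyStagingTable
FROM 'D:\Data\properties.txt'
WITH (
    FIELDTERMINATOR = '|',   -- the file is pipe-delimited
    ROWTERMINATOR   = '\n',
    TABLOCK                  -- allows minimally-logged bulk load
)
```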

Fix up the data:

UPDATE U
SET ROOF_MTRL = REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(
        ROOF_MTRL, 'BITU', 'Bituthane'),
        'CEME', 'Cement'),
        'COMP', 'Comp Shingle'),
        'SING', 'Shingle'),
        'SHAK', 'Shake'),
        'SLAT', 'Slate'),
        'TARG', 'Tar/Gravel')
FROM MyStagingTable AS U

-- Flag records not already present (mirrors the NEWLIST logic in the ASP)
UPDATE U
SET NEWLIST = 'Y'
FROM MyStagingTable AS U
LEFT JOIN TABLE AS T
ON T.LST_NUMB = U.LST_NUMB
WHERE T.LST_NUMB IS NULL

manipulate any other column data in the staging table.

Insert the finalised data into the appropriate table:

INSERT INTO TABLE2(Col1, Col2, Col3, ...)
SELECT LST_NUMB, CITY_NAME, ROOF_MTRL, ...
FROM MyStagingTable

Kristen

hyefive
Starting Member

16 Posts

Posted - 2007-09-21 : 18:08:05
Thanks for the input, Kristen.

You mentioned "Bulk import"ing the file into a temp table, and that's what I'm currently looking into. First I tried disconnecting the rs, updating, then re-connecting it to the db, but that didn't speed things up any...

Jeff Moden
Aged Yak Warrior

652 Posts

Posted - 2007-09-21 : 23:04:38
No, no... not like that... write a stored proc that uses BULK INSERT and do the processing there. It's very fast that way... I can load 5.2 million rows of 20 columns in 60 seconds flat using BULK INSERT. Once you have the data in a "staging" table, you could do the necessary updates, inserts, and deletes to merge the new data...

...or, better yet, if ALL the data that you need is in the text file, then simply rename the temp table to be the new table to use. You can do this most efficiently by altering a view to point at the new table. Next time around, truncate and insert into the old table, making it the new one, and repoint the view at it.
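A minimal sketch of that view swap, assuming two alternating tables and a view name that are placeholders, not from the thread:

```
-- Application code reads from the view, never the tables directly.
-- Each load fills whichever table is currently offline, then repoints
-- the view at it in one quick ALTER (which must be its own batch).
TRUNCATE TABLE dbo.TABLE2_B

INSERT INTO dbo.TABLE2_B (Col1, Col2, Col3)
SELECT LST_NUMB, CITY_NAME, ROOF_MTRL
FROM MyStagingTable
GO
ALTER VIEW dbo.vw_TABLE2
AS
SELECT Col1, Col2, Col3 FROM dbo.TABLE2_B   -- pointed at TABLE2_A last load
```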

--Jeff Moden

Kristen
Test

22859 Posts

Posted - 2007-09-22 : 02:29:50
However ... it does depend on you being able to "see" your data text file from the SQL box, and your hosting company may make that hard for you ...

... but if you can put the file on the InterWebNetThinige you should be able to see it from the SQL box.

Kristen

hyefive
Starting Member

16 Posts

Posted - 2007-09-24 : 13:06:11
Hey thanks again for the replies.

Just as I was getting into the BULK INSERT tactic, I was informed by the hosting co. that it's not allowed on the shared SQL server. They did suggest scheduling the updates for off-hours was a possibility, however...

Kristen
Test

22859 Posts

Posted - 2007-09-24 : 13:45:34
Might you be able to do it via XML, if they don't allow Bulk Import?

You'd have to convert the file to XML first, but you could do something pretty "skinny" in that regard.
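On SQL Server 2000 the XML route would go through OPENXML. A rough sketch, with element and column names invented for illustration:

```
-- Shred an XML document into rows (SQL Server 2000 OPENXML)
DECLARE @hDoc int, @xml varchar(8000)
SET @xml = '<Properties>
  <Property LST_NUMB="12345" CITY_NAME="SEATTLE" ROOF_MTRL="Comp Shingle"/>
</Properties>'

EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

INSERT INTO MyStagingTable (LST_NUMB, CITY_NAME, ROOF_MTRL)
SELECT LST_NUMB, CITY_NAME, ROOF_MTRL
FROM OPENXML(@hDoc, '/Properties/Property', 1)   -- 1 = attribute-centric
WITH (LST_NUMB  varchar(20),
      CITY_NAME varchar(50),
      ROOF_MTRL varchar(50))

EXEC sp_xml_removedocument @hDoc   -- free the parsed document handle
```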

Kristen