
 Advice on updating table on linked server


wooldawg
Starting Member

12 Posts

Posted - 2010-02-11 : 15:27:06
Hello all,

I am wondering if I can get some guidance on a general design question. I have an application that is self-contained on a client system. It is a .NET application with MSDE (SQL 2000) installed locally. At the end of each transaction, data is moved from temporary working tables into the final tables. (All of this happens on the local box, on the local copy of the database.) There are several hundred of these clients deployed, each of them saving data in its own local DB. These systems send back flat files once a day to the home office for consolidation and storage in our data center. The reason for this design is that we have been on dial-up for the past ten years.

This general system works quite nicely for us, but demand for real-time data is increasing. On top of all this, these installations are seasonal and only run for a couple of months. (Thus the need for dial-up connections.) I am looking into moving us to 3G cellular communications with VPN, which in theory will allow each client to be connected to home 24/7.
If we get the broadband connection I will have many more options for pushing (or polling) data throughout the day. I have already done this in my lab using a VERY simple poor man's solution: I plugged the client into the network, added a linked server, and added a very basic insert statement at the beginning of the stored proc that moves data from the working tables to the final tables in the local DB. Example:
INSERT INTO [linkedserver].database.dbo.items SELECT * FROM working_items

This seems to work. I get the details of the items on my linked server database after each transaction is completed. However, I'm certain this must be a very poor method and will be prone to errors, or to locking up the clients if they cannot complete the inserts into the linked server. When I tried enclosing the insert statement inside a BEGIN and COMMIT TRAN I got errors related to MSDTC; it is exactly the error message found at the top of this article: http://support.microsoft.com/kb/839279. I have tried to follow the recommendations of this and similar articles with no luck. That is when I thought of this forum, wanting to investigate the more general question of what is the best way to do this. That is, copy a small amount of data to a remote database during each transaction with minimal design change to the current system. Ideally I would only be dealing with T-SQL procedure changes and not changing the .NET code itself. I know this may sound like I am a hacker looking for advice, but I am the sole survivor in a downsized company with lots of custom applications and not a great deal of coding experience. Thanks!!!

The error that occurs if I try enclosing the insert statement within BEGIN and COMMIT TRAN:
The operation could not be performed because the OLE DB provider 'SQLOLEDB' was unable to begin a distributed transaction. [OLE/DB provider returned message: New transaction cannot enlist in the specified transaction coordinator. ] OLE DB error trace [OLE/DB Provider 'SQLOLEDB' ITransactionJoin::JoinTransaction returned 0x8004d00a].
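For reference, the pattern that produces the error is essentially this - wrapping the linked-server insert in a local transaction, which makes SQL Server try to promote it to a distributed (MSDTC) transaction:

BEGIN TRAN
-- the remote INSERT forces promotion to a distributed transaction
INSERT INTO [linkedserver].database.dbo.items
SELECT * FROM working_items
COMMIT TRAN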

Kristen
Test

22859 Posts

Posted - 2010-02-12 : 03:01:14
We do it similarly to you (but without transaction blocks, for the reason you have encountered). Our way may fit with your current system of exporting to flat files.

You could also look at Merge Replication.

Let's say we have "Remote" clients and a "Local" central server.

The Remote client has a Stored Procedure which gathers new / changed data into Staging tables. We have a date/time LastUpdated column on all our tables, so we can easily work out what has changed, but other applications we pull data from don't have that, so for those we have to do:

SELECT *
INTO #TEMP
FROM OtherDatabase.dbo.SomeTable
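For tables that do have the LastUpdated column the gather step is simpler - a sketch, assuming a hypothetical @LastTransferred variable holding the previous run's cut-off:

SELECT *
INTO #TEMP
FROM dbo.SomeTable
WHERE LastUpdated > @LastTransferred -- only rows changed since the last run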

Whichever method we use, we then compare the data in #TEMP with our Staging tables, using a full WHERE clause to identify any row with one, or more, columns that have changed. (We only do this for columns that we are interested in - the row may have changed on the Remote, but not in a column that we are interested in, and in that case we don't want to transfer it.)

Delete from #TEMP any rows that are identical to the Staging Table (we don't want those!):

DELETE T
FROM #TEMP AS T
JOIN StagingTable AS S
ON S.PK_ID = T.PK_ID
AND (S.Col1 = T.Col1 OR (S.Col1 IS NULL AND T.Col1 IS NULL))
... repeat for all columns of interest ...

(We force Binary Collation for VARCHAR columns so that we catch changes in just capitalisation)
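For example, one such VARCHAR comparison with the collation forced might look like this (Latin1_General_BIN is just an example binary collation):

AND (S.Col1 COLLATE Latin1_General_BIN = T.Col1
OR (S.Col1 IS NULL AND T.Col1 IS NULL))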

Then we pre-delete any existing rows from the Staging Table - to make room to import the newer rows:

DELETE S
FROM StagingTable AS S
JOIN #TEMP T
ON T.PK_ID = S.PK_ID

and then import them:

INSERT INTO StagingTable
SELECT [Action] = 'I', -- Insert
[UpdateDate] = GetDate(),
Col1, Col2, ...
FROM #TEMP

Then the Sproc calls an Sproc on the receiving end to transfer the data (this ensures that the operation is sequential).
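A minimal sketch of that call, assuming a linked server named CentralServer (with RPC Out enabled) and a hypothetical procedure name on the central database:

-- Remote end: hand control to the central server to pull the staged data
EXEC CentralServer.CentralDB.dbo.usp_PullStagingData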

Note: The following is the code for the Sproc on the Local database

That does:

DECLARE @strSQL varchar(8000),
@LatestDate datetime

-- Get the most recent date previously received
SELECT @LatestDate = MAX(UpdateDate) FROM StagingTable

-- Build the query to run against the Remote staging table
SELECT @strSQL = '
SELECT *
FROM RemoteDatabase.dbo.StagingTable
WHERE UpdateDate > ''' + CONVERT(varchar(23), @LatestDate, 126) + '''
'

-- Wrap it in OPENQUERY (doubling the embedded quotes) so the WHERE clause
-- is evaluated at the Remote end and only the newer rows travel
SELECT @strSQL = '
SELECT *
INTO ##TEMP
FROM OPENQUERY(LinkedServer,
'''
+ REPLACE(@strSQL, '''', '''''')
+ '''
)'

EXEC (@strSQL)

That gets the newer staging data to the local server, which is then inserted into the local staging tables (no need to compare columns this time: if it was inserted at the far end, we assume it should be inserted at this end too). First we pre-delete any rows that already exist locally:

DELETE S
FROM dbo.StagingTable AS S
JOIN ##TEMP AS T
ON T.PK_ID = S.PK_ID

Note that it may be necessary to force the collation on any PK columns that are VARCHAR if the database/server they came from has a different collation.
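For example, a sketch of forcing the comparison to the local collation (PK_Code here is a hypothetical VARCHAR key; DATABASE_DEFAULT resolves to the local database's collation):

DELETE S
FROM dbo.StagingTable AS S
JOIN ##TEMP AS T
ON T.PK_Code COLLATE DATABASE_DEFAULT = S.PK_Code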

Then insert rows into the local staging tables:

INSERT INTO dbo.StagingTable
SELECT T.*
FROM ##TEMP AS T

DROP TABLE ##TEMP

Then execute the Sproc (on Local) that will process the data from the staging tables into the real tables. That should store the latest UpdateDate in a Config table so it knows where it got to last time, etc.
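A sketch of that bookkeeping, assuming a hypothetical name/value Config table:

-- Remember how far we got, so the next run only pulls newer rows
UPDATE dbo.Config
SET SettingValue = CONVERT(varchar(23), @LatestDate, 126)
WHERE SettingName = 'StagingLastUpdate'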

The final operation to insert into the local staging table is atomic, so if the connection fails at any time some of the earlier tables being transferred may have completed, but there will be no insert into a single staging table which is only partially complete - the implicit atomic transaction will either complete or roll back. Thus working on "more recent than LastUpdate" is safe.

Also, because the Remote Sproc "loads" its staging tables, then calls the Local Sproc to pull the data and "load" the local staging tables, and that in turn calls a Local Sproc to process the data, the whole process is linear.

It is important to prevent a second instance of that Sproc running concurrently (although a scheduled SQL Server task will not re-run if the earlier instance has not yet completed); you could also use semaphores to prevent that.
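One way to sketch such a semaphore in T-SQL is sp_getapplock (the resource name here is just an example):

DECLARE @ret int

-- Session-owned lock, so no open transaction is required
EXEC @ret = sp_getapplock
@Resource = 'StagingTransfer',
@LockMode = 'Exclusive',
@LockOwner = 'Session',
@LockTimeout = 0
IF @ret < 0
RETURN -- another instance already holds the lock

-- ... do the transfer work ...

EXEC sp_releaseapplock
@Resource = 'StagingTransfer',
@LockOwner = 'Session'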

If you need to transfer Deletions too, then the first step, at the remote end, should update the [Action] column for any records that no longer exist in the remote application's database. This requires a full query of the PKs (and in general we only do this once a day - however, most data we deal with seems to be retained, but marked as obsolete, rather than being physically deleted - your application may be different of course!)
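A sketch of that first step, assuming a hypothetical source table dbo.SomeTable - rows whose PK no longer exists at the source get flagged:

UPDATE S
SET [Action] = 'D', -- Delete
UpdateDate = GETDATE()
FROM StagingTable AS S
WHERE S.[Action] <> 'D' -- don't re-flag rows already marked
AND NOT EXISTS
(
SELECT *
FROM dbo.SomeTable AS X
WHERE X.PK_ID = S.PK_ID
)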

I have some templates of the various Sprocs which I may be able to form into something suitable for publication if you need them.