Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
mattt
Posting Yak Master
194 Posts
Posted - 2011-07-27 : 10:00:08
Hi,

I'm sure this is probably very easy, but I'm not sure how best to go about it.

I need to run a fairly large (10,000,000 rows) regular import/export job driven by a query rather than a direct table-to-table mapping. The trouble is that there are three distinct steps to this process; the second and third steps depend on the data selected in the first step, and the steps run against different servers.

Let me break it down.

Step 1 is to select those ten million rows from a "Customer_Evidence" table on server 1 and transfer them to an "Evidence" table on server 2. This is done via a specific query, as the two tables have different columns and data types.

Step 2 is to select all the data matching particular criteria from a "Customer_Item" table on server 1 and transfer it to an "Item" table on server 2. However, I only need to select rows whose key matches the data transferred in step 1. This is the nub of the problem: my selection for step 2 depends on joining against a table on a different server.

Step 3 is basically the same as step 2, just with different selection criteria. I still need to grab only the rows whose key matches those imported in step 1.

The problem is compounded by the fact that the relationship between the servers is not fixed. One day I might be transferring between server 1 and server 2, and the next between server 1 and server 3. These servers are often in different domains, so setting up a linked server each time is not an easy or realistic option.

Now, I'm well aware that the easy solution would be, as part of step 2, to re-run the query used in step 1 into a temp table and join against that. But for ten million rows I would have thought that's going to be horribly expensive. Or am I wrong?

Or is there a better solution?
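One way to avoid re-running the full step-1 query is to capture just the key column on the destination after step 1 completes, ship that narrow key list back to the source server, and let steps 2 and 3 join against it locally. A ten-million-row table of a single integer key is tens of megabytes, not gigabytes, so moving it with bcp or BULK INSERT is cheap compared with re-running the wide query. A minimal T-SQL sketch follows; the column name EvidenceKey, the staging table StepOneKeys, and the WHERE clause are assumptions for illustration, not names from the original post:

```sql
-- On server 2, after step 1 has populated Evidence:
-- extract only the key column into a narrow staging table.
SELECT EvidenceKey          -- assumed key column name
INTO   StepOneKeys          -- one-column staging table
FROM   Evidence;

-- Ship StepOneKeys back to server 1 (bcp out / bcp in, SSIS,
-- or BULK INSERT on the receiving side) -- no linked server needed.

-- On server 1, step 2 then becomes a purely local join:
SELECT ci.*
FROM   Customer_Item AS ci
       INNER JOIN StepOneKeys AS k
               ON k.EvidenceKey = ci.EvidenceKey
WHERE  ci.ItemType = 'A';   -- placeholder for step 2's criteria
```

Alternatively, if the step-1 query runs on server 1 anyway, the key list could be captured there at the same time (for example with SELECT ... INTO from the same query), which avoids the round trip entirely.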