 Copy data from one server to another


pvccaz
Yak Posting Veteran

87 Posts

Posted - 2011-08-26 : 13:10:31
Hi,
I have to copy around 500,000 records from a table on one server to a different server. I want to use scripts to automate the process instead of the import/export wizard. The table is already present on the destination server with its PK and indexes; just the data needs to be exported/imported. Are there sample scripts I can follow to copy data from one server to another?

Thanks,

russell
Pyro-ma-ni-yak

5072 Posts

Posted - 2011-08-26 : 13:30:34
Use BCP to copy the data out and BULK INSERT to pump it in.

Here's a sample:
bcp "SELECT * FROM DatabaseName..TableName" queryout c:\bcp\fileName.txt -c -Sservername -T


BULK INSERT DatabaseName..TableName FROM 'c:\bcp\fileName.txt'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n'
);
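
Note that BULK INSERT reads the file from the destination server's file system. If the file isn't reachable from there, a sketch of loading it with bcp from the client instead (DestinationServer is a placeholder):

bcp DatabaseName..TableName in c:\bcp\fileName.txt -c -SDestinationServer -T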

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2011-08-27 : 04:24:07
You can copy directly from one server to the other by setting up a linked server between them and then using the four-part naming convention, like:

USE TargetDatabase -- USE takes a database name, not a server name

INSERT INTO TargetTable
SELECT columns... FROM SourceServer.SourceDB.dbo.SourceTable


See the link below on how to create a linked server:

http://sqlserverplanet.com/dba/how-to-add-a-linked-server/
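
As a minimal sketch (assuming Windows authentication, with SourceServer standing in for the remote instance name):

EXEC sp_addlinkedserver
    @server = 'SourceServer',
    @srvproduct = 'SQL Server'

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = 'SourceServer',
    @useself = 'TRUE' -- connect using the caller's own credentials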

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/


Kristen
Test

22859 Posts

Posted - 2011-08-27 : 05:05:02
If using BCP I would use native format, which avoids any problems with embedded characters "spoiling" the delimiters:

bcp "SELECT * FROM DatabaseName..TableName" queryout c:\bcp\fileName.txt -c -Sservername -n

BULK INSERT DatabaseName..TableName FROM 'c:\bcp\fileName.txt'
WITH (
DATAFILETYPE = 'native'
);

If you need to support wide characters, use -N (this parameter is case-sensitive) and 'widenative' respectively.
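
For example, the wide-character variants of the commands above (same placeholders):

bcp "SELECT * FROM DatabaseName..TableName" queryout c:\bcp\fileName.txt -N -Sservername -T

BULK INSERT DatabaseName..TableName FROM 'c:\bcp\fileName.txt'
WITH (
DATAFILETYPE = 'widenative'
);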

I think (not entirely sure though ...) that 500,000 records would be OK for a direct INSERT as a single transaction. It depends a bit on how "wide" your rows are and how many indexes there are. If the columns are narrow and there are few indexes, fine; if there are lots of long VARCHAR columns and/or lots of indexes, I am less confident.

The issue I would consider is whether it will grow the log file such that it is extended, in which case I would then have to consider shrinking it, which raises issues related to fragmentation. Also, in our case, we have hand-optimised the log file to have the minimum number of VLFs, and I would not want to wreck that work with an unexpected log file extension.

None of that may matter to you, in which case ignore me!

You can tell BCP what batch size to use when importing. If you have the SIMPLE recovery model, that's all you need to do to stop the TLog file extending; for the FULL recovery model you'll need a loop running a TLog backup "continuously" (or once a minute, say) for the duration of the import.
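
For example (a sketch; the -b flag sets the bcp batch size, and BATCHSIZE does the same for BULK INSERT):

bcp DatabaseName..TableName in c:\bcp\fileName.txt -n -b 50000 -Sservername -T

BULK INSERT DatabaseName..TableName FROM 'c:\bcp\fileName.txt'
WITH (
DATAFILETYPE = 'native',
BATCHSIZE = 50000 -- commit every 50,000 rows
);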

To reduce the batch size with the direct INSERT (linked server) approach you could do:

USE TargetDatabase

DECLARE @intRowCount int
SET @intRowCount = 1 -- Force first iteration

WHILE @intRowCount >= 1
BEGIN
    SET ROWCOUNT 100000 -- Your batch size
    INSERT INTO TargetTable
    SELECT S.columns...
    FROM SourceServer.SourceDB.dbo.SourceTable AS S
        LEFT OUTER JOIN dbo.TargetTable AS D
            ON D.MyPK = S.MyPK
    WHERE D.MyPK IS NULL -- Destination record does not exist
    SELECT @intRowCount = @@ROWCOUNT
    ... Force a TLog backup here ...
END
SET ROWCOUNT 0 -- Reset so later statements aren't row-limited
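
For the TLog backup step inside the loop, something along these lines (a sketch; the database name and backup path are placeholders):

BACKUP LOG TargetDatabase
TO DISK = 'c:\backups\TargetDatabase_log.trn'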