If using BCP I would use native format - it avoids any problems with embedded characters "spoiling" the delimiters:

bcp "SELECT * FROM DatabaseName..TableName" queryout c:\bcp\fileName.txt -n -S servername

BULK INSERT DatabaseName..TableName FROM 'c:\bcp\fileName.txt'
WITH (DATAFILETYPE = 'native');

If you need to support wide characters use -N (note that bcp switches are case-sensitive) and 'widenative' respectively.
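A minimal sketch of that wide-character variant, using the same placeholder database, file and server names as above:

bcp "SELECT * FROM DatabaseName..TableName" queryout c:\bcp\fileName.txt -N -S servername

BULK INSERT DatabaseName..TableName FROM 'c:\bcp\fileName.txt'
WITH (DATAFILETYPE = 'widenative');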
I think (not entirely sure though ...) that 500,000 records would be OK for a direct INSERT as a single transaction. It depends a bit on how "wide" your columns are and how many indexes there are. If the columns are narrow and there are few indexes, fine; if there are lots of long VARCHAR columns and/or lots of indexes, I am less confident.

The issue I would consider is whether it will grow the log file such that it is extended. I would then have to consider shrinking it, which raises issues related to fragmentation, and also (in my case) the fact that we have hand-optimised the log file to have the minimum number of VLFs, and I would not want to wreck that work with an unexpected log file extension. None of that may matter to you, in which case ignore me!

You can tell BCP what batch size to use for the import (see the sketch at the end of this answer). If you are on the SIMPLE recovery model that is all you need to do to stop the TLog file growing; for the FULL recovery model you will also need a loop running a TLog backup "continuously" (or once a minute, say) for the duration of the import.

To reduce the batch size for a direct INSERT you could do:

USE TargetDatabase
DECLARE @intRowCount int
SET @intRowCount = 1 -- Force first iteration
WHILE @intRowCount >= 1
BEGIN
    SET ROWCOUNT 100000 -- Your batch size
    INSERT INTO dbo.TargetTable
    SELECT S.columns...
    FROM SourceServer.SourceDB.dbo.SourceTable AS S
         LEFT OUTER JOIN dbo.TargetTable AS D
             ON D.MyPK = S.MyPK
    WHERE D.MyPK IS NULL -- Destination record does not exist
    SELECT @intRowCount = @@ROWCOUNT
    -- ... Force a TLog backup here ... (see below)
END
SET ROWCOUNT 0 -- Reset to default
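For the FULL recovery model, the "Force a TLog backup here" placeholder could be as simple as a log backup per iteration; a minimal sketch, assuming a database called TargetDatabase and a local backup folder (both just placeholders):

BACKUP LOG TargetDatabase
TO DISK = 'c:\bcp\TargetDatabase_log.trn'
WITH NOINIT;

In practice you would probably want a unique file name per backup (or just rely on your normal scheduled log backup job) so you keep a clean, restorable log chain.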
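One caveat on the loop above: Microsoft has deprecated SET ROWCOUNT for INSERT/UPDATE/DELETE (it will stop affecting those statements in a future release), so if that matters to you the same batching can be done with TOP instead. A sketch using the same placeholder names as above:

DECLARE @intRowCount int
SET @intRowCount = 1 -- Force first iteration
WHILE @intRowCount >= 1
BEGIN
    INSERT INTO dbo.TargetTable
    SELECT TOP (100000) S.columns... -- Your batch size
    FROM SourceServer.SourceDB.dbo.SourceTable AS S
         LEFT OUTER JOIN dbo.TargetTable AS D
             ON D.MyPK = S.MyPK
    WHERE D.MyPK IS NULL -- Destination record does not exist
    SELECT @intRowCount = @@ROWCOUNT
    -- ... Force a TLog backup here ...
END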
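And if you go down the BCP / BULK INSERT route instead, the batch size mentioned above is just a switch on the import side; a sketch using the same placeholder file and server names as earlier:

bcp DatabaseName..TableName in c:\bcp\fileName.txt -n -b 100000 -S servername

BULK INSERT DatabaseName..TableName FROM 'c:\bcp\fileName.txt'
WITH (DATAFILETYPE = 'native', BATCHSIZE = 100000);

Each batch is committed as a separate transaction, so a failure only rolls back the current batch and, on SIMPLE recovery, the log space can be reused between batches.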