proud_child
Starting Member
6 Posts
Posted - 2011-02-23 : 03:16:02

Hello,

I am trying to insert 17M records into a 32-bit server. The destination table is initially empty, with no indexes and no triggers. The database also has enough space to accommodate the data.

It only took 30 minutes to BCP out the data into a text file. However, when I try to BCP it back in to SQL Server, it is taking forever: about 500 records per second, extremely slow. I have already checked all the performance areas (CPU, disk I/O, memory, etc.) and everything is fine. By the way, the destination SQL Server is never busy, because its only purpose is archiving, so the only thing running now is the BCP.

Can anybody help me with this issue? Thanks.

webfred
Master Smack Fu Yak Hacker
8781 Posts
Posted - 2011-02-23 : 04:39:50

Is the infile on a local disk of the destination server, or is it perhaps on a drive across a slow network?

No, you're never too old to Yak'n'Roll if you're too young to die.

AndrewMurphy
Master Smack Fu Yak Hacker
2916 Posts
Posted - 2011-02-23 : 04:45:30

Is the BCP import running fully logged on the destination?

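One way to check that, as a sketch: the destination database's recovery model decides whether bcp can be minimally logged. This assumes the database is named destinationdb, as in the OP's command later in the thread.

-- FULL means every bulk-copied row is written to the transaction log;
-- SIMPLE or BULK_LOGGED allows bcp to use minimally logged inserts.
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'destinationdb';
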
dinakar
Master Smack Fu Yak Hacker
2507 Posts
Posted - 2011-02-23 : 13:56:13

Have you tried parallelizing the import? Try different batch size settings. I would break it up into multiple ranges (if you have a monotonically increasing column, such as an identity column), e.g. 1-500000, 500001-1000000, etc., and fire off multiple bcp in scripts, each one running its own range of records. Even if one batch fails, you re-run just that batch.

Dinakar Nethi
************************
Life is short. Enjoy it.
************************
http://weblogs.sqlteam.com/dinakar/

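A rough sketch of that approach, assuming an identity column named id; the server, database, table, and file names here are illustrative, not from the thread:

rem Export each id range to its own file using queryout:
bcp "SELECT * FROM sourcedb.dbo.archivetable WHERE id BETWEEN 1 AND 500000" queryout range1.txt -S srcServer -n -T
bcp "SELECT * FROM sourcedb.dbo.archivetable WHERE id BETWEEN 500001 AND 1000000" queryout range2.txt -S srcServer -n -T

rem Then launch the imports in parallel; "start" opens each in its own window,
rem and a failed range can be re-run on its own without touching the others.
start bcp destinationdb.dbo.archivetable in range1.txt -S destServer -n -T -b 10000
start bcp destinationdb.dbo.archivetable in range2.txt -S destServer -n -T -b 10000
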
jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2011-02-23 : 21:19:09

500 rows/sec on import is very slow, but 30 minutes to export 17M rows is also very slow. I'm guessing it's a network issue.

elsasoft.org

proud_child
Starting Member
6 Posts
Posted - 2011-02-24 : 04:46:22

Hello everyone,

Thanks for all your suggestions. Frustrated as I was, I left the BCP to run overnight. It did finish in the morning, but at a rate of about 600 rows/sec. I cannot go on like this, since I am about to archive many more tables with over 17M rows each.

Here is the command I was using (the path is quoted because it contains a space):

bcp destinationdb.dbo.archivetable in "C:\BCP Files\archivetable.txt" -S destServer -n -T -b 10000

@webfred: this is a local disk in the destination server. I don't know if this helps, but the BCP is running on the 64-bit source server, so it has to connect to the destination server to push the data.

@dinakar: I have not tried multiple BCPs, but next time I will surely opt for that option.

@jezemine: I might agree with you. The last time I did an archive of 25M rows with the same source and destination, it was faster than this.

Again, thanks everyone. I have a few things to try next time. I'll let you know what happens.

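If jezemine's network guess is right, one variant worth sketching (not from the thread, same illustrative names): copy the text file to the destination server and run bcp there, so neither the file read nor the SQL connection crosses the network.

rem Run on the destination server itself; with no -S switch,
rem bcp connects to the local default instance.
bcp destinationdb.dbo.archivetable in "C:\BCP Files\archivetable.txt" -n -T -b 10000
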
jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2011-02-24 : 09:21:23

Can you post the DDL for the table you are importing to and exporting from? Is the export a "queryout" or just "out"? Also, you can add -h "TABLOCK" to the import to speed things up a bit.

elsasoft.org

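Applied to the command posted above, that hint might look like this (same names as the OP's command):

rem -h "TABLOCK" takes a table-level lock for the duration of the load;
rem on an empty heap with no indexes it also enables minimal logging
rem when the recovery model is SIMPLE or BULK_LOGGED.
bcp destinationdb.dbo.archivetable in "C:\BCP Files\archivetable.txt" -S destServer -n -T -b 10000 -h "TABLOCK"
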