cidr
Posting Yak Master
207 Posts
Posted - 2011-08-12 : 11:18:01
Hi there,

Hope you're all well. Quick question really. I've imported a text file, added one small column from a query (about three characters per row), and then exported it back to a text file. What was once a 349 MB file is now 2.49 GB; too big to open in most notepad programs.

1. Does anyone know why the massive size increase?
2. Are there any compression techniques in SQL Server when exporting, or any compression software to use after exporting?

Many thanks
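On the second question: any general-purpose compressor will shrink a repetitive text export considerably, since exported rows share a lot of structure. A minimal sketch in Python (Python and the filenames are illustrative only; the thread itself doesn't specify a tool):

```python
import gzip
import os
import shutil

# Create a small sample "export" so the sketch is self-contained;
# in practice `src` would be the file exported from SQL Server.
src, dst = "export.txt", "export.txt.gz"
with open(src, "w") as f:
    f.write("id,code\n" + "1,ABC\n" * 10000)

# Stream the file through gzip without loading it all into memory.
with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
    shutil.copyfileobj(f_in, f_out)

print(os.path.getsize(src), os.path.getsize(dst))
```

Highly repetitive exports like this one typically compress by an order of magnitude or more.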
robvolk
Most Valuable Yak
15732 Posts
Posted - 2011-08-12 : 11:36:21
Was the original file ANSI text, and the exported copy Unicode? That could bloat a file (but only double its size). Is the exported format exactly the same? Did the export add quotes, or save as fixed-width instead of CSV/TSV?

And why are you opening a 349 MB text file in an editor anyway?
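To illustrate the ANSI-vs-Unicode point: a Unicode (UTF-16) export stores two bytes per character where a single-byte ANSI code page stores one, so an otherwise identical file exactly doubles in size. A quick Python sketch (Python is just for illustration here):

```python
# One sample row of export data.
row = "12345,ABC"

ansi_bytes = row.encode("cp1252")      # a single-byte "ANSI" code page
utf16_bytes = row.encode("utf-16-le")  # two bytes per character

print(len(ansi_bytes), len(utf16_bytes))  # 9 vs 18: exactly double
```

So an encoding change alone can't explain a 349 MB file growing to 2.49 GB; something else (added quoting, fixed-width padding) must account for the rest.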
cidr
Posting Yak Master
207 Posts
Posted - 2011-08-12 : 11:51:01
Thanks for the reply, robvolk. I'll have a look at your observations and see if any of that makes a difference.

I'm not sure I understand your question about why I'm opening a large file in an editor: I want to make sure the data is correct by looking at the top rows.

Cheers
robvolk
Most Valuable Yak
15732 Posts
Posted - 2011-08-12 : 13:04:46
Ahhh, I thought you were editing the file manually. If all you need is to check rows, you can BULK INSERT or bcp with the LASTROW or -L option to limit how much is imported. That's easier than opening a huge file.

Or if you want to do some command-line stuff, there's a Unix utility called "head" that's built just for this purpose. There are Windows ports available here: http://unxutils.sourceforge.net/

There's also PowerShell: http://technet.microsoft.com/en-us/library/ee692806.aspx
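For readers without the UnxUtils port, the same "peek at the top rows" trick can be sketched in Python (not mentioned in the thread; the file path and contents are made up for illustration). Like `head`, this streams only the first few lines rather than loading the whole file:

```python
from itertools import islice

# Create a sample file so the sketch is self-contained; in practice
# `path` would be the multi-gigabyte export.
path = "export.txt"
with open(path, "w") as f:
    f.write("".join(f"row {i}\n" for i in range(1000)))

# Roughly equivalent to `head -n 10 export.txt`: read only the
# first 10 lines, never pulling the rest into memory.
with open(path) as f:
    for line in islice(f, 10):
        print(line, end="")
```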