
 alter column depending upon total number of rows?


reoaran
Starting Member

20 Posts

Posted - 2010-07-26 : 09:25:13
Hi,

alter table tblname
alter column [col_name] varchar(30)
alter table tblname
alter column col_name varchar(1000)

When I ran a script resizing around 15 to 20 columns of a table (statements like the above), the log file filled up and the script failed. The database is 70 GB, and I had set the log size limit to 65 GB, yet it still failed with a log-full error. The table has approximately 17,662,000 rows. Does a column resize generate log volume proportional to the data in those rows, and is that what filled the log?

In production, the script failed when the database recovery model was FULL, and succeeded after changing it to SIMPLE, without increasing the log file. But in the test environment it succeeded even with the database in FULL recovery model. Not sure what the difference is?
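One way to see how much log a given ALTER actually consumes is to compare log space before and after the statement. A minimal sketch (DBCC SQLPERF is available in SQL Server 2005; `tblname`/`col_name` are the placeholder names from the post):

```sql
-- Check "Log Space Used (%)" before and after the resize.
DBCC SQLPERF(LOGSPACE);

ALTER TABLE tblname ALTER COLUMN col_name varchar(1000);

DBCC SQLPERF(LOGSPACE);
```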

slimt_slimt
Aged Yak Warrior

746 Posts

Posted - 2010-07-26 : 10:06:32
If you can, set the recovery model to SIMPLE, alter the table, and then return the recovery model to FULL.

Also, if you can, shrink/truncate the log file before altering the table.
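That sequence would look roughly like this (a sketch, assuming a maintenance window; `MyDb`, `MyDb_log`, and the backup path are placeholder names, and the final full backup restarts the log backup chain after switching back to FULL):

```sql
ALTER DATABASE MyDb SET RECOVERY SIMPLE;

-- Optionally shrink the log first (target size in MB).
DBCC SHRINKFILE (MyDb_log, 1024);

ALTER TABLE tblname ALTER COLUMN col_name varchar(1000);

ALTER DATABASE MyDb SET RECOVERY FULL;

-- A full backup is required to restart the log backup chain.
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb_full.bak';
```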

Kristen
Test

22859 Posts

Posted - 2010-07-26 : 10:33:05
Unless this is a development environment, I think twiddling the recovery model on a LIVE database is risky: it will break the backup chain, so at the very least it needs an immediate FULL backup when done, which itself may take an hour or so?

What datatype was the column before? I would not expect increasing a varchar from 20 to 30 to cause much logging. But maybe you are changing something else at the same time?

reoaran
Starting Member

20 Posts

Posted - 2010-07-26 : 11:15:28
OK ... say, suppose I have 10,000 rows. Will changing the column affect all 10,000 existing rows too?

Kristen
Test

22859 Posts

Posted - 2010-07-26 : 11:37:30
What are you changing the column FROM and TO?

Different datatype? NOT NULL to NULL?

Or just increasing a VARCHAR to a larger limit?

Lamprey
Master Smack Fu Yak Hacker

4614 Posts

Posted - 2010-07-26 : 11:51:06
I'd try creating a new table with the desired schema and copying the data from the existing table into the new one. Then drop the existing table and rename the new table (or rename the existing table to keep it as a backup in case it is needed, before dropping it completely).
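Roughly like this (a sketch only; the table and column names are illustrative, and on 2005 the swap is done with sp_rename):

```sql
-- Create the replacement table with the new column sizes.
CREATE TABLE tblname_new (
    id INT NOT NULL PRIMARY KEY,
    col_name varchar(1000) NULL
    -- ... remaining columns with their new definitions
);

-- Copy the data across.
INSERT INTO tblname_new (id, col_name)
SELECT id, col_name FROM tblname;

-- Swap the tables, keeping the original as a backup.
EXEC sp_rename 'tblname', 'tblname_old';
EXEC sp_rename 'tblname_new', 'tblname';

-- DROP TABLE tblname_old;  -- once you are sure it is no longer needed
```

Note that the INSERT ... SELECT itself still generates log, so this mainly helps when it can be batched or run under SIMPLE recovery.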

slimt_slimt
Aged Yak Warrior

746 Posts

Posted - 2010-07-27 : 01:41:11
Yes, changing the recovery model on a production server is risky, I agree. Especially because of replication settings, backup settings, and the value of the data :)
   
