Before going any further, check this code:
IF OBJECT_ID('dbo.TestAlterTable') IS NOT NULL
    DROP TABLE dbo.TestAlterTable;
GO

CREATE TABLE dbo.TestAlterTable
(
    Id int IDENTITY NOT NULL CONSTRAINT PK_TestAlterTable PRIMARY KEY CLUSTERED
    ,col1 char(8000) NOT NULL
    ,col2 char(49) NOT NULL
);
GO

DECLARE @i int = 1;
WHILE @i < 100
BEGIN
    -- Drop the default constraint first, otherwise the column drop fails
    IF OBJECT_ID('DF_Col2') IS NOT NULL
        ALTER TABLE dbo.TestAlterTable DROP CONSTRAINT DF_Col2;

    ALTER TABLE dbo.TestAlterTable DROP COLUMN col2;
    ALTER TABLE dbo.TestAlterTable ADD col2 char(49) NOT NULL CONSTRAINT DF_Col2 DEFAULT ('a');

    SET @i += 1;
END
Depending on the number of columns and the data types you have, at some point you may receive an error similar to this:
.Net SqlClient Data Provider: Msg 1701, Level 16, State 1, Line 9
Creating or altering table 'TestAlterTable' failed because the minimum row size would be 8061, including 8 bytes of internal overhead. This exceeds the maximum allowable table row size of 8060 bytes.
The number of times you need to drop and add columns before the error appears depends entirely on the number of columns and the data types used.
Initially we thought this was due to the cleanup of dropped-column data from the data pages, similar to ghost records (where a row is deleted logically but its data is still there in the data page). I was even prepared to name this behavior "ghost columns", but, sorry, I was wrong. This happens at the metadata level: each dropped column still counts against the table's minimum row size.
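One way to observe this is to check the maximum record size the engine reports for the table; the figure keeps growing with each drop/add cycle even though the visible columns never change. A quick sketch (assuming the TestAlterTable script above has been run in the current database):

SELECT index_id, max_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID('dbo.TestAlterTable'), NULL, NULL, 'DETAILED');

Run it before and after a few loop iterations and compare the values.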
We still haven't figured out the reason behind this, but based on our research it has all the behaviors of a bug. It does not happen the first time.
Interestingly, if you rebuild the primary key, you can continue on the same path for another fixed number of iterations.
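A minimal sketch of that workaround, assuming the primary key is named PK_TestAlterTable as in the script above:

-- Rebuilding the clustered index reclaims the space still
-- reserved for the dropped columns in the row metadata
ALTER INDEX PK_TestAlterTable ON dbo.TestAlterTable REBUILD;

After the rebuild, the drop/add loop runs again for roughly the same number of iterations before Msg 1701 reappears.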
If anyone knows a good reason to say, why this is not a bug, please let me know.
Thanks to all the guys at eCollege who worked on this.