I have a database in production with one table that has grown extremely large (lots of accumulated data).
To improve query performance I used the SQL Server optimizer, which suggested a new index.
So I made a copy of the production database to test against, and the index does improve performance. The problem is that creating the index took about 24 hours, and while it is being created the application is unusable.
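For context, what I'm running is essentially the following (index, table, and column names are placeholders, not the real schema):

    -- Roughly the suggested index; actual names and key columns differ.
    -- ONLINE = ON would keep the table usable during the build, but that
    -- option requires Enterprise Edition (this server is Standard, see below).
    CREATE NONCLUSTERED INDEX IX_BigTable_Suggested
        ON dbo.BigTable (SomeColumn, OtherColumn)
        INCLUDE (LookupColumn)
        WITH (SORT_IN_TEMPDB = ON);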
For this particular application, being down for a few hours is not a problem, but 24 hours of downtime would be, so I am looking for a way to create this index without that much downtime.
I only have a few ideas at the moment.
One idea is to restore a backup to another server, apply the new index and any other changes there, copy the database back to the production server, then take the application down and merge in any data that arrived after the backup was taken.
Of course this has its own set of problems, most notably having to merge the data back together, so I don't like this idea much.
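Sketched out, that process would look something like this (database name, logical file names, and paths are all made up for illustration):

    -- 1. On production: take a full backup without interrupting the application.
    BACKUP DATABASE MyAppDb
        TO DISK = N'\\BackupShare\MyAppDb_full.bak'
        WITH COPY_ONLY, INIT;

    -- 2. On a build server: restore the copy and build the index there.
    RESTORE DATABASE MyAppDb
        FROM DISK = N'\\BackupShare\MyAppDb_full.bak'
        WITH MOVE N'MyAppDb'     TO N'D:\Data\MyAppDb.mdf',
             MOVE N'MyAppDb_log' TO N'L:\Log\MyAppDb.ldf',
             REPLACE;

    CREATE NONCLUSTERED INDEX IX_BigTable_Suggested
        ON dbo.BigTable (SomeColumn, OtherColumn)
        INCLUDE (LookupColumn)
        WITH (SORT_IN_TEMPDB = ON);

    -- 3. Back up the indexed copy, restore it over production during the
    --    downtime window, then merge in the rows written since step 1.

Steps 1 and 2 can run while the application stays up; only step 3 needs the downtime window, plus however long the merge takes. The merge is the part I don't have a good answer for.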
This is SQL Server 2008 Standard Edition.
I normally deploy database changes by script.
UPDATE: Another idea would be to move the archive data out of the main table in chunks over several days, create the index once the table is small enough, and then slowly migrate the data back.
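Something like this batched move is what I have in mind (BigTable, BigTableArchive, the batch size, and the cutoff date are all placeholders):

    -- Move old rows out in small batches so each transaction stays short
    -- and blocking is minimal; run during quiet hours over several days.
    WHILE 1 = 1
    BEGIN
        DELETE TOP (10000)
        FROM dbo.BigTable
        OUTPUT deleted.* INTO dbo.BigTableArchive
        WHERE CreatedDate < '20100101';  -- archive cutoff (placeholder)

        IF @@ROWCOUNT = 0 BREAK;  -- no qualifying rows left
    END;

The small batch size keeps locks and log growth manageable so the application stays responsive while this runs, and the same loop (with the filter reversed) could migrate the data back after the index is built.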