I'm running into an interesting argument with a DBA who insists that the DELETE command should not be used in T-SQL, and that the proper way to remove data is instead to copy the rows you want to keep into a temp table, then drop and re-create the original table. The justification given is that this prevents index fragmentation issues.
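To make the proposal concrete, this is roughly the pattern as I understand it (the table name, key column, and literal key values are made up for illustration):

```sql
-- Roughly the DBA's proposal, with made-up names (dbo.AggregatedData, SomeKey):

-- 1. Copy the rows we want to KEEP into a temp table
SELECT *
INTO   #KeepRows
FROM   dbo.AggregatedData
WHERE  SomeKey NOT IN (1, 2, 3);   -- keys being removed

-- 2. Drop the original table (taking its indexes, constraints, permissions, etc. with it)
DROP TABLE dbo.AggregatedData;

-- 3. Re-create the table and its indexes, then reload the kept rows
-- CREATE TABLE dbo.AggregatedData (...);   -- full definition, indexes, FKs, grants
INSERT INTO dbo.AggregatedData
SELECT * FROM #KeepRows;

DROP TABLE #KeepRows;
```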
Has anyone out there heard of this as a practice, and can anyone suggest reasons why it might be a good idea? This is a fairly complex structure, and we are generally talking about small numbers of selective deletions (figure fewer than 1,000 rows at a time) from tables that are intended to aggregate data indefinitely.
I can't imagine a reason to do this rather than simply deleting the rows and reorganizing/rebuilding the indexes where appropriate, along the lines of the sketch below, but I would be happy to be educated if I'm missing something.
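That is, something like this (again with made-up object names), with index maintenance driven by actual fragmentation rather than performed after every delete:

```sql
-- The alternative I have in mind: a targeted DELETE, then index maintenance
-- only when fragmentation (per sys.dm_db_index_physical_stats) warrants it.
DELETE FROM dbo.AggregatedData
WHERE  SomeKey IN (1, 2, 3);          -- keys being removed

-- Run occasionally as part of maintenance, not after every small delete:
ALTER INDEX ALL ON dbo.AggregatedData REORGANIZE;
-- or, for heavier fragmentation:
-- ALTER INDEX ALL ON dbo.AggregatedData REBUILD;
```

Thanks!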