Hello all,

I am putting together a job in SQL Server 2000 Enterprise Manager to copy and delete records in a couple of database tables. We've run a straight-up mass copy-and-delete stored procedure, but it can end up running against millions of rows, which hangs the server. I'd like to do the work in chunks of roughly 100 records at a time so the server doesn't grind to a halt (this is a live web database). I want this to run once a night, which is why I've put it in an Agent job. Is there any way to loop the calls to the stored procedures that actually do the copy and delete, and "sleep" between each call to give the server time to catch up? I know there is the WAITFOR command, but I'm unsure whether it will hold the processor or let other queries run in the meantime.

Thanks!

A: 

WAITFOR will let other processes 'have a go'. I've used this technique to stop large DELETEs from locking up the machine: create a WHILE loop, delete a block of rows, and then WAITFOR a few seconds (or less, whatever is appropriate).
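For example, a minimal untested sketch of that pattern, assuming a hypothetical table dbo.MyTable with a CreatedDate cutoff column (SET ROWCOUNT limits the batch size on SQL Server 2000):

SET ROWCOUNT 100                  -- delete at most 100 rows per pass

WHILE 1 = 1
BEGIN
    DELETE FROM dbo.MyTable
    WHERE CreatedDate < '2009-01-01'

    IF @@ROWCOUNT = 0 BREAK       -- nothing left to delete

    WAITFOR DELAY '00:00:05'      -- yield for a few seconds between batches
END

SET ROWCOUNT 0                    -- reset the limit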

Mitch Wheat
+1  A: 

Don't worry about waiting between loops; SQL Server should handle the contention between your maintenance job and the regular activity on the server.

What really causes the problem in these types of situations is that the entire delete process happens all at once, inside a single transaction. This blows up the log for the database, and can cause the kinds of problems it sounds like you are experiencing.

Use a loop like this to delete in manageable chunks:

DECLARE @i INT
SET @i = 1

SET ROWCOUNT 10000   -- limit each DELETE to 10,000 rows (SQL 2000 does not allow TOP in DELETE)

WHILE @i > 0
BEGIN
    BEGIN TRAN
        DELETE FROM dbo.SuperBigTable
        WHERE RowDate < '2009-01-01'

        SELECT @i = @@ROWCOUNT   -- capture before COMMIT, which resets @@ROWCOUNT
    COMMIT
END

SET ROWCOUNT 0   -- reset so later statements affect all rows again

You can use similar logic for your copy.
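For instance, a rough sketch of the copy-then-delete version of the same loop, assuming the archive lives in another database on the same server and Id is the primary key (ArchiveDB and Id are placeholder names, and the rows are assumed not to be in the archive already):

DECLARE @i INT
SET @i = 1

SET ROWCOUNT 10000

WHILE @i > 0
BEGIN
    BEGIN TRAN
        -- copy the next batch into the archive database
        INSERT INTO ArchiveDB.dbo.SuperBigTable (Id, RowDate /*, other columns */)
        SELECT Id, RowDate /*, other columns */
        FROM dbo.SuperBigTable
        WHERE RowDate < '2009-01-01'

        SELECT @i = @@ROWCOUNT   -- capture before the DELETE overwrites it

        -- delete only the rows that are now safely in the archive
        DELETE t
        FROM dbo.SuperBigTable t
        JOIN ArchiveDB.dbo.SuperBigTable a ON a.Id = t.Id
    COMMIT
END

SET ROWCOUNT 0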

BradC
Thanks for the feedback. Would it matter if this code was executed in a stored procedure (and called from the agent job) or just in the agent job itself?
Kevin
No, not really. Just personal preference/company policy.
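For example, a sketch of the stored-procedure variant (dbo.PurgeSuperBigTable is a hypothetical name); the Agent job step then contains nothing but the EXEC:

CREATE PROCEDURE dbo.PurgeSuperBigTable
AS
BEGIN
    -- ...the batched delete loop shown above goes here...
    RETURN 0
END
GO

-- T-SQL command for the nightly Agent job step:
EXEC dbo.PurgeSuperBigTable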
BradC
+1  A: 

"Chunkifying" your deletes is the preferred way to delete excessive amounts of data without bloating up transaction log files. BradC's post is a reasonable example of this.

Managing such loops is best done within a single stored procedure. To spread the work out over time, I'd still keep it in the procedure. Inserting a WAITFOR in the loop will put a "pause" between each batch of deletes, if you deem that necessary to deal with possible concurrency issues. Use a SQL Agent job to determine when the procedure starts, and if you need to make sure it stops by a certain time, work that into the loop as well.

My spin on this code would be:

--  NOTE: This is a code sample, I have not tested it
CREATE PROCEDURE ArchiveData

    @StopBy DateTime
    --  Pass in a cutoff time.  If it runs this long, the procedure will stop.
AS

DECLARE @LastBatch  int

SET @LastBatch = 1
--  Initialized to make sure the loop runs at least once


SET ROWCOUNT 1000
--  Or however many rows per pass are desired
--  (SQL 2000 does not allow TOP in DELETE, so SET ROWCOUNT limits the batch)

WHILE @LastBatch > 0
 BEGIN

    WAITFOR DELAY '00:00:02'
    --  Set this to your desired delay factor

    DELETE from SourceTable
    --  Be sure to add a WHERE clause if you don't want to delete everything!

    SET @LastBatch = @@rowcount

    IF getdate() > @StopBy
        SET @LastBatch = 0

 END

SET ROWCOUNT 0
--  Reset so subsequent statements are not limited

RETURN 0

Hmm. Rereading your post, it sounds like you want to copy the data somewhere before deleting it. To do that, I'd set up a temp table, and inside the loop first truncate the temp table, then copy in the primary keys of the TOP N items, insert into the "archive" table via a join to the temp table, and finally delete from the source table, also via a join to the temp table. (Just a bit more complex than a straight delete, isn't it?)
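A rough, untested sketch of that temp-table approach, with placeholder names (SourceTable with primary key Id, and ArchiveDB.dbo.ArchiveTable as the destination):

CREATE TABLE #Batch (Id int PRIMARY KEY)

DECLARE @Rows int
SET @Rows = 1

WHILE @Rows > 0
BEGIN
    TRUNCATE TABLE #Batch

    -- grab the keys of the next batch
    INSERT INTO #Batch (Id)
    SELECT TOP 1000 Id
    FROM SourceTable
    WHERE RowDate < '2009-01-01'

    SET @Rows = @@ROWCOUNT

    IF @Rows > 0
    BEGIN
        BEGIN TRAN

        -- archive the batch via a join to the temp table
        INSERT INTO ArchiveDB.dbo.ArchiveTable (Id, RowDate /*, other columns */)
        SELECT s.Id, s.RowDate /*, other columns */
        FROM SourceTable s
        JOIN #Batch b ON b.Id = s.Id

        -- delete the same batch from the source, also via the temp table
        DELETE s
        FROM SourceTable s
        JOIN #Batch b ON b.Id = s.Id

        COMMIT

        WAITFOR DELAY '00:00:02'   -- breathing room between batches
    END
END

DROP TABLE #Batch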

Philip Kelley
You are correct. In two of the tables, I'm technically "archiving" the data, as in copying it to another database on the same server, then deleting the existing records. In another table, I'm just deleting all records created before a certain date. I thought about using a temp table to assist with the transfer, which I may end up doing. I'll give your suggestion a try and see how it turns out. Thanks!
Kevin