"Chunkifying" your deletes is the preferred way to delete excessive amounts of data without bloating up transaction log files. BradC's post is a reasonable example of this.
Managing such loops is best done within a single stored procedure. To spread the work out over time, I'd still keep it in the procedure. Inserting a WAITFOR in the loop will put a "pause" between each batch of deletes, if you deem that necessary to deal with possible concurrency issues. Use a SQL Agent job to determine when the procedure starts; if you need to make sure it stops by a certain time, work that into the loop as well.
My spin on this code would be:
-- NOTE: This is a code sample, I have not tested it
CREATE PROCEDURE ArchiveData
    @StopBy datetime   -- Pass in a cutoff time. If it runs this long, the procedure will stop.
AS
BEGIN
    DECLARE @LastBatch int
    SET @LastBatch = 1   -- Initialized to make sure the loop runs at least once

    WHILE @LastBatch > 0
    BEGIN
        WAITFOR DELAY '00:00:02'   -- Set this to your desired delay factor

        DELETE TOP (1000)          -- Or however many rows per pass are desired
        FROM SourceTable
        -- Be sure to add a WHERE clause if you don't want to delete everything!

        SET @LastBatch = @@rowcount

        IF getdate() > @StopBy
            SET @LastBatch = 0
    END

    RETURN 0
END
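As for the scheduling, the SQL Agent job step only needs to call the procedure with a cutoff time. Something along these lines would work (the two-hour window here is just an illustration, pick whatever fits your maintenance window):

-- Example job step: let the archive run for at most two hours from now
DECLARE @Stop datetime
SET @Stop = DATEADD(hour, 2, getdate())
EXEC ArchiveData @StopBy = @Stop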
Hmm. Rereading your post, it sounds like you want to copy the data somewhere first before deleting it. To do that, I'd set up a temp table, and inside the loop first truncate the temp table, then copy in the primary keys of the TOP N items, insert into the "archive" table via a join to the temp table, and finally delete from the source table, also via a join to the temp table. (Just a bit more complex than a straight delete, isn't it?)
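A rough sketch of that variation, with the same untested caveat as above; here ArchiveTable, an integer Id key, and the #Batch temp table are placeholders for whatever your actual schema looks like:

-- Before the loop: staging table for the current batch of keys
CREATE TABLE #Batch (Id int PRIMARY KEY)

-- Inside the WHILE loop, in place of the plain DELETE:
TRUNCATE TABLE #Batch

INSERT INTO #Batch (Id)
SELECT TOP (1000) Id
FROM SourceTable
-- WHERE ...  (same filter as the delete-only version)

INSERT INTO ArchiveTable       -- Assumes ArchiveTable matches SourceTable's columns
SELECT s.*
FROM SourceTable s
JOIN #Batch b ON b.Id = s.Id

DELETE s
FROM SourceTable s
JOIN #Batch b ON b.Id = s.Id

SET @LastBatch = @@rowcount    -- Still drives the loop exit

If you're worried about a batch failing partway through, wrap the insert and delete for each batch in a transaction so the archive and the source stay in sync.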