I am not a database person, exactly, and most of my db work has been with MySQL, so forgive me if something in this question is incredibly naive.
I need to delete 5.5 million rows from an Oracle table that has about 100 million rows. I have all the IDs of the rows I need to delete in a temporary table. If it were just a few thousand rows, I'd do this:
DELETE FROM table_name WHERE id IN (SELECT id FROM temp_table);
COMMIT;
Is there anything I need to be aware of, and/or do differently, because it's 5.5 million rows? I thought about doing a loop, something like this:
DECLARE
  vCT NUMBER(38) := 0;
BEGIN
  FOR t IN (SELECT id FROM temp_table) LOOP
    DELETE FROM table_name WHERE id = t.id;
    vCT := vCT + 1;
    -- commit every 200,000 deletes
    IF MOD(vCT, 200000) = 0 THEN
      COMMIT;
    END IF;
  END LOOP;
  COMMIT;
END;
First of all, is this doing what I think it is, namely batching commits every 200,000 deletes? Assuming it is, I'm still not sure whether it's better to issue 5.5 million individual DELETE statements and commit in batches of 200,000, or to run one DELETE statement and commit everything at once.
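For what it's worth, the only other variation I could think of was to bulk-collect the IDs and delete them in chunks with FORALL instead of row by row. This is just a rough, untested sketch using the same table and column names as above:

DECLARE
  TYPE t_ids IS TABLE OF temp_table.id%TYPE;
  vIDs t_ids;
  CURSOR c IS SELECT id FROM temp_table;
BEGIN
  OPEN c;
  LOOP
    -- pull the next 200,000 IDs into a collection
    FETCH c BULK COLLECT INTO vIDs LIMIT 200000;
    EXIT WHEN vIDs.COUNT = 0;
    -- delete the whole chunk with a single bulk-bound statement
    FORALL i IN 1 .. vIDs.COUNT
      DELETE FROM table_name WHERE id = vIDs(i);
    COMMIT;
  END LOOP;
  CLOSE c;
END;

I don't know whether that buys anything over either of the two options above, which is part of what I'm asking.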
Ideas? Best practices?
EDIT: I ran the first option, the single delete statement, and it only took 2 hours to complete in development. Based on that, it's queued to be run in production. Thanks for your input!