Deleting data from a large table

My 2 cents:

If you are using SQL Server 2005 or above, you can consider partitioning your table on the date field, so old records can be dropped by switching out a whole partition instead of locking the table with a huge DELETE.
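If you can change the schema, a minimal sketch of date partitioning might look like this (all object names and boundary dates here are illustrative, not from the original question):

```sql
-- Hypothetical sketch: partition a log table by month so old data
-- can be switched/dropped per partition instead of deleted row by row.
CREATE PARTITION FUNCTION pf_by_month (datetime)
AS RANGE RIGHT FOR VALUES ('2009-01-01', '2009-02-01', '2009-03-01');

CREATE PARTITION SCHEME ps_by_month
AS PARTITION pf_by_month ALL TO ([PRIMARY]);

CREATE TABLE log_table (
    id        int IDENTITY(1,1) NOT NULL,
    logged_at datetime NOT NULL,
    message   varchar(4000) NULL
) ON ps_by_month (logged_at);
```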

Maybe, if you are in a position to make DBA decisions, you can temporarily switch the database to the SIMPLE recovery model. The log will still grow during the delete, but log space is reused after each checkpoint instead of accumulating until the next log backup.
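Something like this (the database name is a placeholder; note that switching recovery models breaks the log backup chain, so take a full backup afterwards if you need point-in-time recovery):

```sql
-- Temporarily reduce log growth for the purge, then switch back.
ALTER DATABASE MyDatabase SET RECOVERY SIMPLE;
-- ... run the batched delete here ...
ALTER DATABASE MyDatabase SET RECOVERY FULL;
```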


Try this

WHILE EXISTS (SELECT * FROM Table WHERE (condition for deleting))
BEGIN
    SET ROWCOUNT 1000
    DELETE FROM Table WHERE (condition for deleting)
    SET ROWCOUNT 0
END

This will delete the rows in batches of 1000, keeping each transaction (and its locks and log usage) small; the loop exits once no matching rows remain.
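Note that Microsoft has deprecated SET ROWCOUNT for DML statements; on SQL Server 2005 and above the same batching can be written with DELETE TOP instead (keeping the same placeholder condition):

```sql
-- Batched delete using DELETE TOP (SQL Server 2005+).
-- "(condition for deleting)" stands in for your actual predicate.
WHILE 1 = 1
BEGIN
    DELETE TOP (1000) FROM Table WHERE (condition for deleting);
    IF @@ROWCOUNT = 0 BREAK;
END
```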


Better is to create a temporary table and insert only the data you want to keep. Then truncate the original table and copy the kept rows back.

Oracle syntax (SQL Server is similar):

create table keep as select * from source where data_is_good = 1;
truncate table source;
insert into source select * from keep;
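A rough SQL Server equivalent of the same keep-truncate-reload idea (table and column names carried over from the Oracle example above) might look like:

```sql
-- SELECT ... INTO creates the keep table from the query result.
SELECT * INTO keep FROM source WHERE data_is_good = 1;

-- TRUNCATE is minimally logged, unlike a row-by-row DELETE.
TRUNCATE TABLE source;

-- Assumes no IDENTITY column; otherwise SET IDENTITY_INSERT is needed.
INSERT INTO source SELECT * FROM keep;
DROP TABLE keep;
```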

You'll need to disable or drop any foreign keys that reference the source table, since TRUNCATE fails when the table is referenced by a foreign key constraint.

In Oracle, index names must be unique across the entire schema, not just per table. In SQL Server, index names only need to be unique per table, so you can optimize this further: create indexes of the same name on "keep", then simply rename "keep" to "source" instead of copying the rows back.
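The rename variant could be sketched like this (SQL Server; the old table is dropped rather than truncated, and sp_rename moves the new table into place):

```sql
SELECT * INTO keep FROM source WHERE data_is_good = 1;
-- Recreate indexes on keep here; SQL Server allows the same index
-- names as on source because index names are scoped per table.
DROP TABLE source;
EXEC sp_rename 'keep', 'source';
```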
