DELETE FROM `table`
WHERE (whatever criteria)
ORDER BY `id`
LIMIT 1000

Wash, rinse, repeat until zero rows affected. Maybe in a script that sleeps for a second or three between iterations.
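If you want to keep the whole loop inside MySQL, here is a minimal sketch as a stored procedure; the table name, id column, and date criterion are hypothetical placeholders for your own schema:

DELIMITER //
CREATE PROCEDURE purge_in_batches()
BEGIN
  DECLARE rows_deleted INT DEFAULT 1;
  REPEAT
    DELETE FROM big_table                 -- hypothetical table
    WHERE created_at < '2015-01-01'       -- hypothetical criteria
    ORDER BY id
    LIMIT 1000;
    SET rows_deleted = ROW_COUNT();       -- rows removed by the DELETE above
    DO SLEEP(1);                          -- breathe between batches
  UNTIL rows_deleted = 0 END REPEAT;
END //
DELIMITER ;

CALL purge_in_batches();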


I had a use case of deleting 1M+ rows from a 25M+ row table in MySQL. I tried different approaches, such as the batch deletes described above.
I found that the fastest way was to copy the required records to a new table:

  1. Create a helper table that holds just the ids:

CREATE TABLE id_temp_table ( temp_id int);

  2. Insert the ids that should be removed:

insert into id_temp_table (temp_id) select.....

  3. Create a new table, table_new, with the same structure as the original (see the sketch below).

  4. Insert all records from table into table_new, skipping the unwanted rows whose ids are in id_temp_table:

insert into table_new .... where table_id NOT IN (select distinct(temp_id) from id_temp_table);

  5. Rename the tables so table_new replaces the original.

The whole process took ~1 hour. In my use case, a simple delete of a batch of 100 records took 10 minutes.
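Steps 3 and 5 have no SQL shown above; here is a minimal sketch of both, assuming the original table is literally named `table` (as in the first answer), keeping the table_new name, and using table_old as a throwaway name for the old copy:

CREATE TABLE table_new LIKE `table`;    -- step 3: same structure and indexes, no rows

-- step 5: swap the names in a single statement, then drop the old copy
-- once you have verified the new one
RENAME TABLE `table` TO table_old,
             table_new TO `table`;
DROP TABLE table_old;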


The following deletes 1,000,000 records, one at a time (1,000 iterations of up to 1,000 ids each).

 for i in `seq 1 1000`; do
     # connection details and the default database are assumed to come from ~/.my.cnf
     mysql -e "select id from table_name where (condition) order by id desc limit 1000" | sed 's/|//g' | awk '{ if (NR > 1) print "delete from table_name where id = " $1 ";" }' | mysql
 done

You could also group them together and do DELETE FROM table_name WHERE id IN (id1, id2, ..., idN), I'm sure, without much difficulty.
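For illustration, the grouped form issues one statement per batch instead of one per row (the ids below are made up):

DELETE FROM table_name WHERE id IN (101, 102, 103, 104);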


I'd also recommend adding some constraints to your table to make sure that this doesn't happen to you again. A million rows, at 1000 per shot, will take 1000 repetitions of a script to complete. If the script runs once every 3.6 seconds you'll be done in an hour. No worries. Your clients are unlikely to notice.