Removing millions of files in a directory

Since my Exim email server wasn't configured for spam detection, it became blacklisted and was storing literally millions of spam emails in the mail queue (which Exim keeps as regular files in the directories `/var/spool/exim4/input` and `/var/spool/exim4/msglog`). There were so many files that I could no longer remove them with `rm ./*`, nor even with `find . -type f -delete`, as some tutorials suggest. The reason is that those commands first collect file names before deleting anything; with `rm ./*` in particular, the shell expands the glob into one enormous argument list. These methods can quickly eat up your RAM, and you can literally wait dozens of hours before they even start deleting files. The solution was to bypass Bash entirely and delete the files with Perl. This is the command that started deleting files after only about one minute of startup time:

cd directory/to/be/emptied
perl -e 'opendir my $d, "." or die $!; while (my $n = readdir $d) { next if $n eq "." or $n eq ".."; print "$n\n"; unlink $n }'
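If you want progress feedback without the cost of printing every name, a variant (an untested sketch; the 10,000 interval is my arbitrary choice) counts successful deletions and reports a running total only occasionally:

cd directory/to/be/emptied
perl -e 'opendir my $d, "." or die $!; my $c = 0; while (my $n = readdir $d) { next if $n eq "." or $n eq ".."; unlink $n or next; print "$c files deleted\n" unless ++$c % 10000 }'

The point is the same as in the one-liner above: `readdir` streams directory entries one at a time, so memory use stays flat no matter how many files there are.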

By my own rough estimate, it deleted about 200 files per second.
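To check how many entries remain while the deletion runs (say, from a second shell in the same directory), the same streaming trick works for counting without ever building the full list in memory. Again a rough sketch; note the count includes the `.` and `..` entries:

perl -e 'opendir my $d, "." or die $!; my $c = 0; $c++ while defined readdir $d; print "$c entries\n"'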

One thought on “Removing millions of files in a directory”

  1. Thank you for the tip. I had the same problem. I suggest not calling `print $n` for every file; the loop runs much faster without it.
