Looking for a quick way to delete old files in Linux? Log files, for example, can build up over a long period of time.
Use case: The default WordPress backup tool doesn’t delete old backup files. So, over time, these files can accumulate and take up valuable disk space. If you run a large and busy site, this can become a problem. So, in order to maintain a healthy file system, we’ll want to keep, say, the last 30 days’ worth of backups and discard anything older than that.
Of course, this technique will work equally well in any directory that contains log files. Here’s the code – I’ll explain what we’re doing below:
```shell
find /path/to/folder -name '*.sql' -mtime +30 -delete
```
- The `find` command is built right in to Linux. We’re using it to locate – then delete – files that match the date criteria.
- `/path/to/folder` – change this to the path of the log file directory you want to delete files from. We restrict this to a particular folder so that we’re not running the command right across the file system – that would be bad!
- `-name '*.sql'` restricts the command to only the backup files that end with .sql. As with the previous point, this is a precautionary measure so that we don’t delete all files.
- The `-mtime` switch allows us to specify files older than a certain age; `+30` matches everything modified more than 30 days ago.
- Finally, the `-delete` switch tells `find` to delete all the files that meet the criteria we specified. If you want to see which files will be deleted, run the command without `-delete` and you’ll be given a list of files instead.
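Here’s a sketch of that dry run in a disposable scratch directory, so nothing real is at risk. The file names are invented for illustration, and `touch -d` assumes GNU coreutils (standard on Ubuntu):

```shell
# Build a scratch directory with one stale and one recent backup,
# then list what find *would* delete (no -delete, so it only prints).
dir=$(mktemp -d)
touch -d '40 days ago' "$dir/site-backup-old.sql"   # older than 30 days
touch "$dir/site-backup-new.sql"                    # modified just now
find "$dir" -name '*.sql' -mtime +30 -print         # lists only the old file
rm -rf "$dir"
```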
Note: This command works perfectly on Ubuntu. I can’t vouch for other Linux distributions as all distros vary slightly. As always, try this out on a sample set of data before doing it on something critical – and use restrictions intelligently to avoid deleting more data than you intended to!
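One way to follow that advice is a trial run against throwaway sample data before pointing the command at anything critical. Again, the names are made up and `touch -d` assumes GNU coreutils:

```shell
# Trial run: delete for real, but only inside a temporary directory.
dir=$(mktemp -d)
touch -d '40 days ago' "$dir/stale.sql"   # should be deleted
touch "$dir/fresh.sql"                    # should survive
find "$dir" -name '*.sql' -mtime +30 -delete
ls "$dir"                                 # only fresh.sql should remain
rm -rf "$dir"
```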