Many people who run Linux file servers or FTP servers have at some point needed to free up disk space. A good way to do this efficiently is to remove old data starting with the largest files first. So how do you generate such a list? One method is a "find -exec du" pipeline:

find /path/to/full/file/system -type f -mtime +10 -exec du -sk {} \; | sort -n > /var/tmp/list_of_files_older_than_10_days_sorted_by_size

Once you have that list, you can selectively delete files from the bottom of it. Note that file sizes tend to be heavily skewed: the bottom 10% of the list will likely account for a huge chunk of the used storage space.
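As a sketch of how to act on that list, here is one possible workflow, assuming GNU find, du, cut, and xargs; the path, the output filename, and the count of 20 are illustrative, not prescriptive:

# Build the list. Using `-exec ... +` batches many files per du
# invocation, which is much faster than spawning one du per file
# with `\;`. Output is "size<TAB>path", one line per file.
find /path/to/full/file/system -type f -mtime +10 -exec du -sk {} + \
    | sort -n > /var/tmp/list_of_files_older_than_10_days_sorted_by_size

# Inspect the biggest offenders at the bottom of the list.
tail -n 20 /var/tmp/list_of_files_older_than_10_days_sorted_by_size

# Once reviewed, remove them: strip the size column, keep the path.
# (Assumes GNU xargs for -d, and no newlines or tabs in filenames;
# `rm --` guards against paths beginning with a dash.)
tail -n 20 /var/tmp/list_of_files_older_than_10_days_sorted_by_size \
    | cut -f2- | xargs -d '\n' rm --

Deleting via the saved list rather than piping find straight into rm gives you a chance to review exactly what will go before anything is removed.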