The question was posed: How do we list the largest files in a directory that are older than one month?
Step One: Use find with the -printf option to list all the files and their sizes:
find [PATH] -printf "%s %p\n"
Step Two: If you try this, you will see a massive list of files scroll across your screen.
This is not very useful; you need to sort the files by size.
To do this, we will use the 'sort' command.
Pipe the output to sort by adding the following to the end of the first command:
| sort -n

(From Step One's command: %s prints the size, %p the filename with path, and the \n prints a newline character.)
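A quick illustration of why sort needs the -n flag here: without it, sizes are compared as text, so "9" sorts after "100". The sizes and filenames below are made-up demo values.

```shell
# Text sort: compares character by character, so the order is 100, 5000, 9.
printf '100 a.dat\n9 b.dat\n5000 c.dat\n' | sort

# Numeric sort: compares the leading field as a number, giving 9, 100, 5000.
num=$(printf '100 a.dat\n9 b.dat\n5000 c.dat\n' | sort -n)
echo "$num"
```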
Step Three: Now this gives you the list of files in order of size, smallest to largest.
But we want to identify some number of the largest files.
We will pipe the output to another program, called tail:
| tail -20

(From Step Two: the '|' pipes the output, sending the output from the previous command on to the following command, and the -n tells sort that it is sorting numbers.)
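Putting Steps One through Three together, here is a runnable sketch of the full pipeline. The throwaway directory and file sizes are made up for the demo; point find at your own path in practice. (Note that -printf is a GNU find feature.)

```shell
# Build a scratch directory with three files of known sizes.
demo=$(mktemp -d)
head -c 100  /dev/zero > "$demo/small.dat"   # 100 bytes
head -c 5000 /dev/zero > "$demo/medium.dat"  # 5000 bytes
head -c 9000 /dev/zero > "$demo/large.dat"   # 9000 bytes

# Size and path for every file, sorted numerically, keeping the 2 largest:
out=$(find "$demo" -type f -printf "%s %p\n" | sort -n | tail -2)
echo "$out"
rm -r "$demo"
```

The two largest files come out last, with the biggest on the final line.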
Step Four: Finally, we wanted to place a time limit on the files.
(From Step Three: tail gives us the last lines of its input. The -20 limits it to 20 lines; you can substitute any number here, 1 for the largest file, 100 for the largest 100 files.)
We will use find's -ctime test to filter based on the age of the file.
There are a number of parameters that find can use; %c, for example, prints the last time the file's status was changed. Use the man page for find for details.
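Here is a runnable sketch of the age filter. One caveat for the demo: a file's ctime cannot be set by hand, so this sketch backdates the modification time with touch and filters with -mtime instead; for the "older than one month" question as posed, you would use -ctime +30 in the same position. The directory and file names are made up for the demo.

```shell
# Scratch directory with one fresh file and one "old" file.
demo=$(mktemp -d)
head -c 100 /dev/zero > "$demo/new.dat"
head -c 900 /dev/zero > "$demo/old.dat"
touch -d "40 days ago" "$demo/old.dat"   # backdate mtime (GNU touch)

# Only files older than ~30 days survive the -mtime +30 test.
# Swap in -ctime +30 to filter on status-change time instead.
out=$(find "$demo" -type f -mtime +30 -printf "%s %p\n" | sort -n | tail -20)
echo "$out"
rm -r "$demo"
```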
One additional trick:
If you want to filter the output (say you have some subdirectories you want to skip over), you can filter using grep:
Pipe the output from find to grep:
| grep -v [PATTERN]
The -v option is 'invert-match': instead of listing lines that match, it blocks lines that match. If you had a subdirectory called .Trash you wanted to ignore, grep -v .Trash would filter out any line with that directory name in the path.
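The grep filter can be sketched end to end like this. The .Trash name comes from the example above; the other file names and sizes are made up for the demo.

```shell
# Scratch directory with a file we want and a .Trash file we don't.
demo=$(mktemp -d)
mkdir "$demo/.Trash"
head -c 100 /dev/zero > "$demo/keep.dat"
head -c 200 /dev/zero > "$demo/.Trash/junk.dat"

# grep -v drops every line whose path mentions .Trash:
out=$(find "$demo" -type f -printf "%s %p\n" | grep -v .Trash | sort -n)
echo "$out"
rm -r "$demo"
```

One detail worth knowing: grep treats its pattern as a regular expression, where '.' matches any character, so a stricter form is grep -v '\.Trash'.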