How To: Linux Find Large Files in a Directory

How do I find out all large files in a directory?

There is no single command that can list all large files on its own. But, with the help of the find command and shell pipes, you can easily list them. This page explains how to find the largest files and directories in Linux using various commands.
Tutorial details
  • Difficulty level: Easy
  • Root privileges: Yes
  • Requirements: find and du commands on Linux or Unix
  • Est. reading time: 4 minutes

Linux List All Large Files

To find all files over 50,000 KB (roughly 50 MB) in size and display their names along with their sizes, use the following syntax:

The syntax and the awk field numbers may vary depending upon your version of GNU find, ls, and your Linux distro. Hence, read the man pages.

Syntax for RedHat / CentOS / Fedora Linux

find {/path/to/directory/} -type f -size +{size-in-kb}k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
To search for big files (50 MB and up) in the current directory, enter:
$ find . -type f -size +50000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
Search in my /var/log directory:
# find /var/log -type f -size +100000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
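As noted above, the awk field number depends on how your version of ls formats its output (hence $9 here and $8 in the Debian example below). If you use GNU findutils, find's -printf action can print the file name and size itself and sidestep ls and awk entirely; a minimal sketch:
# GNU find only: %p is the file name, %k the size in 1K blocks
$ find . -type f -size +50000k -printf '%p: %kK\n'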

Syntax for Debian / Ubuntu Linux

find {/path/to/directory} -type f -size +{file-size-in-kb}k -exec ls -lh {} \; | awk '{ print $8 ": " $5 }'
Search in the current directory:
$ find . -type f -size +10000k -exec ls -lh {} \; | awk '{ print $8 ": " $5 }'
Sample output:

./.kde/share/apps/akregator/Archive/http___blogs.msdn.com_MainFeed.aspx?Type=AllBlogs.mk4: 91M
./out/out.tar.gz: 828M
./.cache/tracker/file-meta.db: 101M
./ubuntu-8.04-desktop-i386.iso: 700M
./vivek/out/mp3/Eric: 230M

The above commands list files that are greater than 10,000 kilobytes in size. To list all files in your home directory tree smaller than 500 bytes in size, type:
$ find $HOME -size -500c
OR
$ find ~ -size -500c
To list all files on the system whose size is exactly 20 512-byte blocks, type:
# find / -size 20
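A quick reference on -size units with GNU find (see find(1)): no suffix or b means 512-byte blocks, c means bytes, k kibibytes, M mebibytes, and G gibibytes. For example:
$ find . -type f -size +500c     # larger than 500 bytes
$ find . -type f -size +50000k   # larger than roughly 50 MB
$ find . -type f -size +2G       # larger than 2 GiB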

Finding large files using the find command

Let us search for files with a size greater than 1000 MB. Run:

find /dir/to/search -xdev -type f -size +1000M
find $HOME -xdev -type f -size +1000M
# for root fs / #
sudo find / -xdev -type f -size +1000M

Here is what I see:

/var/tmp/.guestfs-0/appliance.d/root
/var/lib/libvirt/images/debian10.qcow2
/var/lib/libvirt/images/debian10-1.qcow2
/var/lib/libvirt/images/centos8.qcow2
/var/lib/libvirt/images/archlinux.qcow2
/var/lib/libvirt/images/centos-8-cloud/centos-8-cloud.1598187044
/var/lib/libvirt/images/centos-8-cloud/centos-8-cloud.qcow2
/home/vivek/backups/books/pdfs/other/full.edition.tar
/home/vivek/.local/share/baloo/index
/home/vivek/snap/sosumi/common/BaseSystem/BaseSystem.img
/home/vivek/snap/sosumi/common/os.qcow2
/isoimages/ZeroShell-3.9.3-X86-USB.img
/isoimages/AlmaLinux-8.3-x86_64-dvd.iso
/isoimages/ZeroShell-3.9.3A-RPI.img

Want to print the file size, owner, and other information along with the largest file names? Pass the -ls option as follows:
sudo find / -xdev -type f -size +1000M -ls
# Another syntax to find large files in Linux
sudo find / -xdev -type f -size +1000M -exec ls -lh {} \;
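If you also want the matches ranked by size, one approach (a sketch assuming GNU find plus sort and numfmt from coreutils) is to print each size in bytes, sort numerically, and convert the top entries to human-readable units:
# size in bytes and path, largest first, top 10, sizes converted to K/M/G
$ sudo find / -xdev -type f -size +1000M -printf '%s\t%p\n' | sort -nr | head -10 | numfmt --field=1 --to=iec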

Finding large files in a directory under Linux and Unix (screenshot)

Perl hack: To display large files

Jonathan has contributed the following Perl code. It prints a row of stars for each file and folder, from smallest to largest; the length of the row reflects the disk usage:

 du -k | sort -n | perl -ne 'if ( /^(\d+)\s+(.*$)/){$l=log($1+.1);$m=int($l/log(1024)); printf  ("%6.1f\t%s\t%25s  %s\n",($1/(2**(10*$m))),(("K","M","G","T","P")[$m]),"*"x (1.5*$l),$2);}'

ls command: finding the largest files in a directory

You can also use ls command:
$ ls -lS
$ ls -lS | less
$ ls -lS | head -10
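Add the -h flag for human-readable sizes; note that ls -lS looks only at the current directory and does not recurse. For example:
$ ls -lhS | head -n 11   # first line is the 'total' summary, then the 10 largest files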

ls command: finding the smallest files in a directory

Use ls command as follows:
$ ls -lSr
$ ls -lSr | less
$ ls -lSr | tail -10

Find large files and directories using the du command

You can also use the du command as pointed out by georges in the comments below:

du -xak . | sort -n | tail -50
du -xah . | sort -rh | head -10
56G	.
32G	./snap
30G	./snap/sosumi/common
30G	./snap/sosumi
28G	./snap/sosumi/common/os.qcow2
14G	./backups
7.7G	./backups/books
7.4G	./backups/books/pdfs
5.7G	./backups/books/pdfs/unix
4.8G	./.cache

Where,

  • du -xah . : Find disk space usage in the current working directory indicated by the dot (.), skip directories on different file systems (-x), report all files, not just directories (-a), and show file sizes in a human-readable format (-h).
  • sort -rh : Sort the human-readable size values (-h) and print the result in reverse order, largest first (-r).
  • head -10 : Display only the top ten directories eating disk space.
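If you only care about which subdirectories are the biggest, rather than every individual file, a variation using GNU du's --max-depth option limits the report to one level; a minimal sketch:
# summarize only the immediate subdirectories, largest first
$ du -xh --max-depth=1 . | sort -rh | head -10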

Conclusion

For more find command examples and usage, see the find(1) and du(1) man pages.

49 comments
  • zamaan Oct 16, 2014 @ 11:41

    very useful.

  • Tek Dec 23, 2015 @ 19:47

    When using the following command on Ubuntu 14.04LTS

    find / -type f -size +50000k -exec ls -lh {} \; | awk '{ print $8 ": " $5 }'
    I had to change $8 to $9 for the file path to display.

    • Joshua McGee Nov 12, 2016 @ 15:12

      As did I. It seems the output format of `find` has changed.
