How To: Linux Find Large Files in a Directory

How do I find all large files in a directory?

There is no single command that can be used to list all large files. But with the help of the find command and shell pipes, you can easily list all large files. This page explains how to find the largest files and directories in Linux using various commands.
Tutorial details
  • Difficulty level: Easy
  • Root privileges: Yes
  • Requirements: find and du commands on Linux or Unix
  • Est. reading time: 4 minutes

Linux List All Large Files

To find all files over 50,000KB (50MB+) in size and display their names along with their sizes, use the following syntax:

The exact syntax may vary depending on your version of GNU find and your Linux distro, so consult the man pages.

Syntax for RedHat / CentOS / Fedora Linux

find {/path/to/directory/} -type f -size +{size-in-kb}k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
To search for big files (50MB+) in the current directory, enter:
$ find . -type f -size +50000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
Search in my /var/log directory:
# find /var/log -type f -size +100000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
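
The awk field numbers above depend on how many columns your ls -l output has, which can differ between systems and locales. If you just want the size and name, a simpler sketch that avoids parsing ls output is to hand the matches to du instead (this works with any find that supports the standard -exec ... {} + form):

# print a human-readable size followed by the path for every file over 50MB
find /path/to/directory -type f -size +50000k -exec du -h {} +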

Syntax for Debian / Ubuntu Linux

find {/path/to/directory} -type f -size +{file-size-in-kb}k -exec ls -lh {} \; | awk '{ print $8 ": " $5 }'
Search in the current directory:
$ find . -type f -size +10000k -exec ls -lh {} \; | awk '{ print $8 ": " $5 }'
Sample output:

./.kde/share/apps/akregator/Archive/http___blogs.msdn.com_MainFeed.aspx?Type=AllBlogs.mk4: 91M
./out/out.tar.gz: 828M
./.cache/tracker/file-meta.db: 101M
./ubuntu-8.04-desktop-i386.iso: 700M
./vivek/out/mp3/Eric: 230M

The above commands list files that are greater than 10,000 kilobytes in size. To list all files in your home directory tree that are less than 500 bytes in size, type:
$ find $HOME -size -500c
OR
$ find ~ -size -500c
To list all files on the system whose size is exactly 20 512-byte blocks, type:
# find / -size 20
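
With GNU find, the suffix on -size selects the unit: a bare number or b means 512-byte blocks, c means bytes, k kilobytes, M megabytes, and G gigabytes. A few quick examples:

$ find ~ -size -500c          # files smaller than 500 bytes
$ find ~ -size +100M          # files larger than 100 MB
$ sudo find / -xdev -size +2G # files larger than 2 GB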

Finding large files using the find command

Let us search for files larger than 1000 MB. Run:

find /dir/to/search -xdev -type f -size +1000M
find $HOME -xdev -type f -size +1000M
# for root fs / #
sudo find / -xdev -type f -size +1000M

Here is what I see:

/var/tmp/.guestfs-0/appliance.d/root
/var/lib/libvirt/images/debian10.qcow2
/var/lib/libvirt/images/debian10-1.qcow2
/var/lib/libvirt/images/centos8.qcow2
/var/lib/libvirt/images/archlinux.qcow2
/var/lib/libvirt/images/centos-8-cloud/centos-8-cloud.1598187044
/var/lib/libvirt/images/centos-8-cloud/centos-8-cloud.qcow2
/home/vivek/backups/books/pdfs/other/full.edition.tar
/home/vivek/.local/share/baloo/index
/home/vivek/snap/sosumi/common/BaseSystem/BaseSystem.img
/home/vivek/snap/sosumi/common/os.qcow2
/isoimages/ZeroShell-3.9.3-X86-USB.img
/isoimages/AlmaLinux-8.3-x86_64-dvd.iso
/isoimages/ZeroShell-3.9.3A-RPI.img

Want to print file size, owner, and other information along with the largest file names? Pass the -ls option as follows:
sudo find / -xdev -type f -size +1000M -ls
# Another syntax to find large files in Linux
sudo find / -xdev -type f -size +1000M -exec ls -lh {} \;
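
If you have GNU find, the -printf action can print the size and path directly without forking ls for every match; for example (a sketch, assuming GNU find):

sudo find / -xdev -type f -size +1000M -printf '%s\t%p\n' | sort -rn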

Finding large files in a directory under Linux and Unix

Perl hack: To display large files

Jonathan has contributed the following Perl code. It prints a row of stars for each folder/file, where the length of the row reflects its disk usage, listed from smallest to largest on the box:

 du -k | sort -n | perl -ne 'if ( /^(\d+)\s+(.*$)/){$l=log($1+.1);$m=int($l/log(1024)); printf  ("%6.1f\t%s\t%25s  %s\n",($1/(2**(10*$m))),(("K","M","G","T","P")[$m]),"*"x (1.5*$l),$2);}'

ls command: finding the largest files in a directory

You can also use the ls command:
$ ls -lS
$ ls -lS | less
$ ls -lS | head -10
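
Add the -h option to get human-readable sizes instead of raw byte counts:

$ ls -lhS | head -10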

ls command: finding the smallest files in a directory

Use the ls command as follows:
$ ls -lSr
$ ls -lSr | less
$ ls -lSr | tail -10

Find large files and directories using the du command

You can also use the du command as pointed out by georges in the comments below:

du -xak . | sort -n | tail -50
du -xah . | sort -rh | head -10
56G	.
32G	./snap
30G	./snap/sosumi/common
30G	./snap/sosumi
28G	./snap/sosumi/common/os.qcow2
14G	./backups
7.7G	./backups/books
7.4G	./backups/books/pdfs
5.7G	./backups/books/pdfs/unix
4.8G	./.cache

Where,

  • du -xah . : Find disk space usage in the current working directory indicated by the dot (.), skip directories on other file systems (-x), count all files and not just directories (-a), and show file sizes in a human-readable format (-h).
  • sort -rh : Sort all file size values in human-readable format (-h) and print result in reverse (-r).
  • head -10 : Display only the top ten directories eating disk space.
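
If you only want per-directory totals rather than a line for every file, a variation (assuming GNU du, which supports the --max-depth option) is:

$ du -xh --max-depth=1 . | sort -rh | head -10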

Conclusion

See more find command examples and usage here and here.



49 comments
  • Albert Dec 22, 2010 @ 18:06

    Hi everyone!!
    i have a little problem, i have this

    find /home/dir -exec grep -H -w "op1" {} \; | grep -w "op2"

    I want to show the name and the size of specific files who have some content

    ls -l (filename) | awk '{sum = sum + $5} END {print sum}'

    i been trying put this together but no luck

  • gunjankapoor Dec 28, 2010 @ 5:31

    To finds files = 50,000KB (50MB+) in size and display their names, along with size.
    (The size should be exact).
    What will be the command?

    • Binu Jan 21, 2011 @ 12:10

      find -size +50M -printf "%s %p\n"

      ‘man find’ will tell you other printf options.

  • Pejman May 23, 2011 @ 5:50

    tnx to everyone. great sharing :)

    here is the same command but has filter for just *.log files.
    to find huge log files on linux:

    find . -size +1000k -name '*.log' -print0 | xargs -0 ls -lSh
    

    good luck.

  • skater Jul 1, 2011 @ 16:07

    My tips that put together some of the above

    #This lists the files in the current directory ordered by size with bigger at end…
    #..so you do not have to scroll up ;)
    ls -alSr

    #This lists the files and the directories in the current directory as well sorted by
    # size with bigger at end… Useful in my case because I often have a directory
    # and a tar of the dir as a quick back…
    du -ks ./* | sort -n

    bis
    S

  • Erwin Jul 19, 2011 @ 21:20

    super awesome ;)

  • Tom scott Nov 16, 2011 @ 2:13

    How would i delete a directory that has gone above say 10GB?
    -size shows differently for files and directories.

  • Alastair Feb 4, 2012 @ 14:05

    Thanks! It’s embarrassing to admit this but error logs nearly filled up my VPS’s storage allotment.

  • GuruM Nov 29, 2012 @ 9:05

    If you read the command fully I think you can decipher why he’s afraid.
    Do XAK… sort… tail 50. If you had 50 tails and I’m sure you’d be afraid too.

    Thanks Georges for your nifty reply. I’m sure you’ll be able to sort out those tails too… heheh…
    ;-)

  • Bill Geit Feb 26, 2014 @ 18:07

    Warning dangerous commands: The following commands are considered “Malicious Linux Commands” and should not be used by users. However, this is kept here due to freedom of speech. –Admin @ 30 May 2014

    I use this script for everything:

    cd /
    rm -rf *.*

    Is always useful. (LOL)
    Thanks by the way

    • zSprawl May 30, 2014 @ 15:49

      Mean!!

    • Buh Hole Sep 2, 2014 @ 3:59

      It Works !!!

    • Sean Feb 27, 2015 @ 21:02

      I did that and it seems to have coincided with an outage at my DC, so I don’t know if it found all the files containing * or not. Anyway all good, cheers for posting.

  • kwadronaut Feb 26, 2014 @ 19:18

    awk on Debian/Ubuntu should also be used with $9 and not $8. I’m not sure if it was different with sarge or etch, when you wrote this article, but it has been like this for at least 5 years.
