How To: Linux Find Large Files in a Directory

How do I find all the large files in a directory?

There is no single command that lists all large files. But with the help of the find command and shell pipes, you can easily list them. This page explains how to find the largest files and directories in Linux using various commands.
Tutorial details
Difficulty level Easy
Root privileges Yes
Requirements Find and du commands on Linux or Unix
Est. reading time 4 minutes

Linux List All Large Files

To find all files over 50,000 KB (50 MB+) in size and display their names along with sizes, use the following syntax:

The syntax may vary based upon your version of GNU find and your Linux distro, so read the man pages.

Syntax for RedHat / CentOS / Fedora Linux

find {/path/to/directory/} -type f -size +{size-in-kb}k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
To search for big files (50 MB+) in the current directory, enter:
$ find . -type f -size +50000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
Search in my /var/log directory:
# find /var/log -type f -size +100000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
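If your find is GNU find, you can sidestep the awk field-number differences entirely with find's own -printf directive. Here is a sketch against a throwaway /tmp/finddemo directory (the paths and file names are illustrative, not from the examples above):

```shell
# Create a scratch directory with one large and one small file (illustration only)
mkdir -p /tmp/finddemo
dd if=/dev/zero of=/tmp/finddemo/big.bin bs=1k count=60000 2>/dev/null
printf 'tiny' > /tmp/finddemo/small.txt

# Print "path: size" for files over 50,000 KB -- no awk needed
find /tmp/finddemo -type f -size +50000k -printf '%p: %s bytes\n'
# → /tmp/finddemo/big.bin: 61440000 bytes
```

Because -printf formats the output itself, the same command works regardless of how your ls lays out its columns.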

Syntax for Debian / Ubuntu Linux

find {/path/to/directory} -type f -size +{file-size-in-kb}k -exec ls -lh {} \; | awk '{ print $8 ": " $5 }'
Search in current directory:
$ find . -type f -size +10000k -exec ls -lh {} \; | awk '{ print $8 ": " $5 }'
Note: on current Debian / Ubuntu releases, ls -lh puts the file name in field $9 (as in the Red Hat syntax above); use $9 if this prints the date instead of the path.
Sample output:

./.kde/share/apps/akregator/Archive/http___blogs.msdn.com_MainFeed.aspx?Type=AllBlogs.mk4: 91M
./out/out.tar.gz: 828M
./.cache/tracker/file-meta.db: 101M
./ubuntu-8.04-desktop-i386.iso: 700M
./vivek/out/mp3/Eric: 230M

The above commands list files that are greater than 10,000 kilobytes in size. To list all files in your home directory tree less than 500 bytes in size, type:
$ find $HOME -size -500b
$ find ~ -size -500b
To list all files on the system whose size is exactly 20 512-byte blocks, type:
# find / -size 20
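The -size suffixes are easy to mix up: b means 512-byte blocks (the default when no suffix is given), c bytes, k kibibytes, M mebibytes, and G gibibytes. A quick sketch using a throwaway file (the directory and file names are illustrative):

```shell
mkdir -p /tmp/sizedemo
# Create a file of exactly 20 512-byte blocks (10240 bytes)
dd if=/dev/zero of=/tmp/sizedemo/blocks20 bs=512 count=20 2>/dev/null

find /tmp/sizedemo -size 20       # matches: exactly 20 512-byte blocks
find /tmp/sizedemo -size 10240c   # matches the same file, counted in bytes
```

Both commands print /tmp/sizedemo/blocks20; without a suffix, find rounds file sizes up to whole 512-byte blocks before comparing.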

Finding large files using the find command

Let us search for files larger than 1000 MB:

find /dir/to/search -xdev -type f -size +1000M
find $HOME -xdev -type f -size +1000M
# for root fs / #
sudo find / -xdev -type f -size +1000M

Want to print file size, owner, and other information along with the largest file names? Pass the -ls option as follows:
sudo find / -xdev -type f -size +1000M -ls
# Another syntax to find large files in Linux
sudo find / -xdev -type f -size +1000M -exec ls -lh {} \;
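To see the biggest offenders first, feed find's byte counts through sort. A sketch using GNU find's -printf against a throwaway directory (the paths are illustrative; point the same pipeline at / with sudo for a system-wide scan):

```shell
# Build a small demo tree with files of known sizes (illustration only)
mkdir -p /tmp/bigdemo
dd if=/dev/zero of=/tmp/bigdemo/a.bin bs=1M count=3 2>/dev/null
dd if=/dev/zero of=/tmp/bigdemo/b.bin bs=1M count=1 2>/dev/null

# Size in bytes plus path, numerically sorted, biggest first
find /tmp/bigdemo -xdev -type f -printf '%s %p\n' | sort -nr | head -10
```

The same idea scales to a whole filesystem: sudo find / -xdev -type f -printf '%s %p\n' | sort -nr | head -20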

Finding Large Files in a Directory Under Linux and Unix

Perl hack: To display large files

Jonathan has contributed the following Perl code. It prints out stars, and the length of the stars shows the usage of each folder / file from smallest to largest on the box:

 du -k | sort -n | perl -ne 'if ( /^(\d+)\s+(.*$)/){$l=log($1+.1);$m=int($l/log(1024)); printf  ("%6.1f\t%s\t%25s  %s\n",($1/(2**(10*$m))),(("K","M","G","T","P")[$m]),"*"x (1.5*$l),$2);}'

ls command: finding the largest files in a directory

You can also use ls command:
$ ls -lS
$ ls -lS | less
$ ls -lS | head -10

ls command: finding the smallest files in a directory

Use ls command as follows:
$ ls -lSr
$ ls -lSr | less
$ ls -lSr | tail -10

Find Large Files and Directories Using the du Command

You can also use the du command as pointed out by georges in the comments below:

du -xak . |sort -n | tail -50
du -xah . | sort -rh | head -10
56G	.
32G	./snap
30G	./snap/sosumi/common
30G	./snap/sosumi
28G	./snap/sosumi/common/os.qcow2
14G	./backups
7.7G	./backups/books
7.4G	./backups/books/pdfs
5.7G	./backups/books/pdfs/unix
4.8G	./.cache


  • du -xah . : Find disk space usage in the current working directory indicated by the dot (.), skip directories on different file systems (-x), count all files including directories (-a), and show file sizes in a human-readable format (-h).
  • sort -rh : Sort all file size values in human-readable format (-h) and print result in reverse (-r).
  • head -10 : Display only the top ten directories eating disk space.
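When per-file output is too noisy, GNU du can stop at a given directory depth instead. A sketch against a throwaway directory tree (the --max-depth flag is GNU du; the paths are illustrative):

```shell
# Build a demo tree: one big subdirectory, one small (illustration only)
mkdir -p /tmp/dudemo/big /tmp/dudemo/small
dd if=/dev/zero of=/tmp/dudemo/big/f bs=1k count=2048 2>/dev/null
dd if=/dev/zero of=/tmp/dudemo/small/f bs=1k count=4 2>/dev/null

# Per-directory totals only, one level deep, in KB, biggest first
du -xk --max-depth=1 /tmp/dudemo | sort -nr
```

The first line is the grand total for /tmp/dudemo itself; /tmp/dudemo/big then sorts ahead of /tmp/dudemo/small.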


See the find command man page for more examples and usage.


49 comments… add one
  • Scott Carlson Dec 16, 2008 @ 20:55

    I use a script with this…

    find . -xdev -printf '%s %p\n' | sort -nr | head -20

  • 🐧 nixCraft Dec 16, 2008 @ 21:06

    Excellent scott!

  • georges Dec 16, 2008 @ 21:07

    What I use is much simpler and efficient I’m afraid:
    du -xak .|sort -n|tail -50

    it lists the 50 biggest files or directories sorted by size

    • Jonny Aug 16, 2011 @ 17:57

      Why are you afraid?

      • Warren Sep 23, 2011 @ 1:38

        Jonny, Georges is afraid because he is talking of a tail; had he been talking of a head he might have been otherwise.

    • mauricio May 18, 2014 @ 19:27

      find . -type f -size +50000k | xargs du -sh

      • icodeforlove Jun 13, 2015 @ 21:12

        That worked great, also runs very quickly against the whole system.

        find / -type f -size +100M | xargs du -sh

  • Shatnanu Oak Dec 17, 2008 @ 8:56

    Poor man’s command.
    ls -lhR | grep 'G '
    Not perfect, but it lets me know about big files of more than 1 GB

    • Chirag Oct 2, 2012 @ 22:51

      But its not recursive.

      • zSprawl May 30, 2014 @ 15:49

        It is, but you need to ‘cd /’

    • zSprawl May 30, 2014 @ 15:49

      This worked perfectly from the root folder. Thank you.

    • Gaurav Khurana Feb 27, 2015 @ 6:33

      It will read those files also whose name is like ‘G spaceudzial’ basically ending with having ‘G ‘ in their name.
      can be used as a workaround if requirement is only current directory

  • Topper Dec 17, 2008 @ 12:24

    ls -lhS (shortest ;))
    But different way to achieve same goal (ls for local dir, find for comprehensive search)
    BTW, I thought the syntax of find must be:
    find /var/log -type f -size +100000k -exec ls -lh {} \; <- with "\;" at the end?

  • 🐧 nixCraft Dec 17, 2008 @ 12:37


    Dam html… thanks for the heads up.

  • Jonathan Jiang Dec 17, 2008 @ 15:44

    I prefer this perl script feeding from a du -k :


    It’ll print out stars and the length of the stars show the usage of each folder / file from smallest to largest on the box. Enjoy!

  • Jonathan Jiang Dec 17, 2008 @ 15:49
    du -k | sort -n | perl -ne 'if ( /^(\d+)\s+(.*$)/){$l=log($1+.1);$m=int($l/log(1024)); printf ("%6.1f\t%s\t%25s  %s\n",($1/(2**(10*$m))),(("K","M","G","T","P")[$m]),"*"x (1.5*$l),$2);}'
    • Tony P May 25, 2016 @ 16:40

      Excellent command sir. Use it almost daily.

  • 🐧 nixCraft Dec 17, 2008 @ 17:37


    Thanks for sharing your nifty perl code. The faq has been updated with your code.

  • Chris Giordano Jan 27, 2009 @ 18:23

    If using RedHat 6.0 – RHE4 or CentOS you could use the simple listing command "l", and if you want it to sort by size you add the switch "-S". Make sure it's a capital "S" or it'll list sizes but not in order.

    l -S
    this will return everything in that directory from largest to smallest.

    if you want to do listing in a directory and need to figure out the switch you could also do “l –help” this will bring up the help file for the listing command.

  • ronald kriwelz simanjuntak Jan 14, 2010 @ 10:49

    how bout using this :
    find /var -size +10000k -print0 | xargs -0 ls -lSh

    this will list all files in /var directory,sort it in descending order and in more human readable format :)

    • Dan Keenan Jul 29, 2013 @ 14:45

      This command works well for me. Thanks for posting.

  • Scott Carlson Jan 14, 2010 @ 16:08


    Your find doesn’t work across subdirectories. It’s definitely very clean for a leaf directory though.

  • ronald kriwelz simanjuntak Jan 15, 2010 @ 4:02

    wut do you mean by it doesnt work across subdirectories ? i tried it on my ubuntu box and it show files in the subdirectories.
    -rw-rw---- 1 mysql mysql 412M Jan 15 10:18 /var/lib/mysql/darta/namefile.MYD
    -rw-rw---- 1 mysql mysql 173M Jun 9 2009 /var/lib/mysql/flyingfight/dbacomment.MYD
    -rw-rw---- 1 mysql mysql 165M Jan 15 10:40 /var/lib/mysql/interndba/post.MYI
    -rw-rw---- 1 mysql mysql 159M Jan 15 10:40 /var/lib/mysql/interndba/post.MYD
    -rw------- 1 root root 105M Jan 10 03:31 /var/log/messages.1

    those files are in different subdirectories right?

    • Gaurav Khurana Feb 27, 2015 @ 6:34

      yes they are in different subdirectories /var/lib or /var/log

  • Scott Carlson Jan 15, 2010 @ 14:44


    Interesting. I dug a bit. My use case is find the largest files in a directory and not just those over 10M. So I had removed the size restriction, but the same problem occurs with a smaller size restriction. Even with “-size +100k” find was returning directories as well as files. This messed up the expected results as I previously saw.

    So for me, this one works as expected.
    find . -type f -print0 | xargs -0 ls -lSh | head -20


  • ronald kriwelz simanjuntak Jan 15, 2010 @ 22:11

    owh yes, i forgot to say that it will list all the files bigger than 10MB,since wut i ned is to list biggest files, and yeah ur addition to the command does the thing :)
    or u can add “more” to the command
    the power of command line, the beauty of linux :)

  • Keith White Apr 7, 2010 @ 10:43

    I find the following works rather well…

    du -xak . | sort -n | awk '{size=$1/1024; path=""; for (i=2; i<=NF; i++) { path=path " " $i; } if (size > 50) { printf("%dMb %s\n", size, path); } }'

    It lists all files or directories bigger than 50MB (just change size>50 to alter that) in the current directory (change the “.” to a directory path to specify another one) in a friendly, human-readable way and happily plays with spaces (I used it on an NTFS mount in fact).

  • Deepankar Apr 9, 2010 @ 4:53

    du -h | grep [0-9]G

    This will list all files that are in GB.
    Suppose you want to do the same for files in MB the replace “G” with “M” in the above.

    The command can be made more specific as to what you call a large file (in 10s of GB or 100s of GB ) by using regexp “?” instead of “[0-9]”

  • Deepankar Apr 9, 2010 @ 4:56

    cd (directory path)
    du -h | grep [0-9]G

    This will list all files that are in GB.
    Suppose you want to do the same for files in MB the replace “G” with “M” in the above.

    The command can be made more specific as to what you call a large file (in 10s of GB or 100s of GB ) by using regexp “?” instead of “[0-9]”

    • Gaurav Khurana Feb 27, 2015 @ 6:35

      but as mentioned earlier.. problem is of recursive directories

  • Michael Apr 15, 2010 @ 4:46


    That Perl one-liner is a work of art. Thank you for sharing it with the world!

  • Lesle Boyd Dec 22, 2010 @ 0:17

    You guys are the greatest!
    I sure enjoyed reading this thread and the information is extremely useful in my job.
    Thanks to all who posted. My head is swimming!

  • Albert Dec 22, 2010 @ 18:06

    Hi everyone!!
    i have a litle problem, i have this

    find /home/dir -exec grep -H -w "op1" {} \; | grep -w "op2"

    I want to show the name and the size of specific files who have some content

    ls -l (filename) | awk '{sum = sum + $5} END {print sum}'

    i been trying put this together but no luck

  • gunjankapoor Dec 28, 2010 @ 5:31

    To find files of exactly 50,000 KB (50 MB) in size and display their names, along with size.
    (The size should be exact).
    What will be the command?

    • Binu Jan 21, 2011 @ 12:10

      find -size +50M -printf "%s %p\n"

      ‘man find’ will tell you other printf options.

  • Pejman May 23, 2011 @ 5:50

    tnx to everyone. great sharing :)

    here is the same command but has filter for just *.log files.
    to find huge log files on linux:

    find . -size +1000k -name '*.log' -print0 | xargs -0 ls -lSh

    good luck.

  • skater Jul 1, 2011 @ 16:07

    My tips that put together some of the above

    #This lists the files in the current directory ordered by size with bigger at end… you do not have to scroll up ;)
    ls -alSr

    #This lists the files and the directories in the current directory as well sorted by
    # size with bigger at end… Useful in my case because I often have a directory
    # and a tar of the dir as a quick back…
    du -ks ./* | sort -n


  • Erwin Jul 19, 2011 @ 21:20

    super awesome ;)

  • Tom scott Nov 16, 2011 @ 2:13

    How would i delete a directory that has gone above say 10GB?
    -size shows differently for files and directories.

  • Alastair Feb 4, 2012 @ 14:05

    Thanks! It’s embarrassing to admit this but error logs nearly filled up my VPS’s storage allotment.

  • GuruM Nov 29, 2012 @ 9:05

    If you read the command fully I think you can decipher why he’s afraid.
    Do XAK… sort… tail 50. If you had 50 tails, I'm sure you'd be afraid too.

    Thanks Georges for your nifty reply. I’m sure you’ll be able to sort out those tails too… heheh…

  • Bill Geit Feb 26, 2014 @ 18:07

    Warning, dangerous commands: The following commands are considered "Malicious Linux Commands" and should not be used. However, this is kept here due to freedom of speech. –Admin @ 30 May 2014

    I use this script for everything:

    cd /
    rm -rf *.*

    Is always useful. (LOL)
    Thanks by the way

    • zSprawl May 30, 2014 @ 15:49


    • Buh Hole Sep 2, 2014 @ 3:59

      It Works !!!

    • Sean Feb 27, 2015 @ 21:02

      I did that and it seems to have coincided with an outage at my DC, so I don’t know if it found all the files containing * or not. Anyway all good, cheers for posting.

  • kwadronaut Feb 26, 2014 @ 19:18

    awk on Debian/Ubuntu should also be used with $9 and not $8. I'm not sure if it was different with sarge or etch, when you wrote this article, but it has been like this for at least 5 years.

  • zamaan Oct 16, 2014 @ 11:41

    very useful.

  • Tek Dec 23, 2015 @ 19:47

    When using the following command on Ubuntu 14.04LTS

    find / -type f -size +50000k -exec ls -lh {} \; | awk '{ print $8 ": " $5 }'
    I had to change $8 to $9 for the file path to display.

    • Joshua McGee Nov 12, 2016 @ 15:12

      As did I. It seems the output format of `find` has changed.
