Linux: Download all files from an FTP server recursively

You can use the ncftpget command to download an entire FTP directory and all of its subdirectories from a remote FTP server. Say you would like to download the /www-data directory and every subdirectory inside it from the server. The ncftpget command handles this.

Install ncftp client

The ncftp client software works with FreeBSD, Solaris, and almost all UNIX variants. On Debian or Ubuntu you can install it as follows:
$ sudo apt-get install ncftp
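On RPM-based distributions the package name is typically the same; a sketch, assuming ncftp is available in your configured repositories (on CentOS/RHEL it usually comes from a third-party repository such as EPEL):

```shell
# CentOS / RHEL (package availability depends on enabled repositories)
$ sudo yum install ncftp
```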

FTP get directory recursively

ncftpget is an Internet file transfer program for scripts and advanced usage. Use the command as follows:
$ ncftpget -R -v -u "ftpuser" /home/vivek/backup /www-data

  • -R : Copy all subdirectories and files (recursive)
  • -v : Verbose, i.e. display download activity and progress
  • -u "USERNAME" : FTP server username; if skipped, ncftpget will try the anonymous username
  • : FTP server name
  • /home/vivek/backup : Download everything to this local directory
  • /www-data : Remote FTP directory you wish to copy
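Putting the pieces together: the server name goes right after the options, before the local and remote paths. A sketch, where ftp.example.com is a hypothetical placeholder for your own FTP server (not a name from this article):

```shell
# -R recursive, -v verbose, -u username; ftp.example.com is a placeholder
$ ncftpget -R -v -u "ftpuser" ftp.example.com /home/vivek/backup /www-data
```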

If you get an error which reads as follows:

tar: End of archive volume 1 reached
tar: Sorry, unable to determine archive format.
Could not read directory listing data: Connection reset by peer

Then add the -T option to the ncftpget command:

$ ncftpget -T -R -v -u "ftpuser" /home/vivek/backup /www-data


  • -T : Do not try to use TAR mode with recursive mode
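If recursive ncftpget still gives trouble, several commenters below reach for wget's recursive FTP mode instead. A sketch, with a hypothetical server name and credentials (replace with your own):

```shell
# -r recursive retrieval, -P local target directory; server and credentials are placeholders
$ wget -r -P /home/vivek/backup "ftp://ftpuser:password@ftp.example.com/www-data/"
```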

Hi! 🤠
I'm Vivek Gite, and I write about Linux, macOS, Unix, IT, programming, infosec, and open source. Subscribe to my RSS feed or email newsletter for updates.

23 comments
  • haakondahl Nov 27, 2007 @ 16:30

    This is GREAT! I have been struggling with transferring files from multiple partitions on one machine to multiple partitions on another, so NFS doesn't get it done no matter which is the server or client. I am now using ProFTPd as a server and ncftp as a client, and I am watching the partitions fill up as I expected, rather than as NFS and cp -vurf were getting it done.
    Thank you for posting this fantastic guide!

  • Daniel Aug 26, 2008 @ 13:27

    Great. I’ve been trying to make a full copy of my ftp site for a backup.

  • Chris Mar 6, 2009 @ 16:09

    Thanks! Needed to back up my site, and didn't want to spend too much time writing a script…

  • nasser May 13, 2009 @ 20:58

    very nice guide.

    I installed the ncftp client on CentOS and it works.

  • computerjan Jun 9, 2009 @ 10:04

    Very useful guide !
    I'm using it for my FTP server

  • Ravi Jul 25, 2009 @ 9:25

    Hi Vivek,
    Very nice and encouraging blog you have; it's an inspiration to all of us. It really works in every way, and I hope you feel the same.

    Go opensource……

    Ravi Bhure

  • corky Jan 2, 2010 @ 9:09

    This is great… except when the directory on the FTP server happens to have a [ or ] character in it. A file in a directory you're getting can have those characters, just not the directory you specify in the command 🙁
    ncftpget will return the error “remote directory listing failed.”

    Really unfortunate, this was *almost* the perfect solution for me.

  • Daniel Apr 16, 2010 @ 4:03

    Seems like it dosn’t download all files, I just did this on one of my sites (-T command) and there where 3 directories that wasn’t downloaded.. had to download them one by one.. is there a limit to how many files/directories it will get?

  • daveX99 May 20, 2010 @ 18:25

    Hey – I was initially having trouble. Running the command as written, it would download any files in the remote directory, but not the subdirectories.

    I had to add an asterisk to the line, so it looked like this (I ran it from the directory into which I wanted the files & directories to go):
    $ ncftpget -R -v -u "username" ./ /Remote_Directory/*

    Not sure why, but it did the trick.


    • Anonymous May 20, 2010 @ 23:14

      Actually, I am also missing a few, but not all of the sub directories…

      At this point, I’m now doing this by hand. Any ideas as to why this doesn’t work would be welcome.

  • Brad Griffith Jul 8, 2010 @ 22:48

    A method that has worked much better for me is:

    wget -r*

    • Karl Nov 12, 2010 @ 10:07

      Much better, quicker and easier method Brad, cheers!

    • k4m1 Dec 5, 2010 @ 12:33

      hehe wget rules again.
      So clean and cool solution, thanks a lot 🙂
      -q for quiet mode.

    • jbn Jun 17, 2011 @ 9:51

      Worked great, except it didn't recurse deeply enough.

      Currently trying
      wget -m*
      as it supposedly sets the correct options.

    • Pavel Kalvoda Sep 30, 2012 @ 19:33

      Sweet! Thanks a lot

  • vamsi Jul 11, 2011 @ 10:09

    Thank You Solved my problem

  • Torbjörn Apr 22, 2012 @ 9:18

    Thank you for the nice blogpost. I however have some problems. Please take a look at the following ncftpget issue. I am trying to get all my private photo/video
    files from a folder on an ftp server containing almost 1.3 million files
    of size around 200-300kB (in total 2GB). I first demonstrate that it
    works fine to get one file at a time, when I specify its name. But as
    soon as I use wild-characters or try to recursively copy all files in
    the folder I get all kind of errors, as shown below. Could you tell me
    which parameters to use or which FTP program to use if ncftp does not
    support transfer of so many files?

    Copied from Windows Command Prompt (my real password is replaced by "password"):
    C:\Users\tn\Videos\M>ncftpget -u tnordlin -p password . /home/tnordlin/050101001202.3gp
    050101001202.3gp: 257.58 kB 190.23

    C:\Users\tn\Videos\M>ncftpget -u tnordlin -p password . /home/tnordlin/05010100*
    ncftpget /home/tnordlin/05010100*: remote directory listing failed.

    C:\Users\tn\Videos\M>ncftpget -R -u tnordlin -p password . /home/tnordlin
    Could not read reply from control connection — timed out.
    Passive mode refused.

    C:\Users\tn\Videos\M>ncftpget -E -R -u tnordlin -p password . /home/tnordlin
    Could not accept a data connection (errno = 10060).

    C:\Users\tn\Videos\M>ncftpget -E -F -R -u tnordlin -p password . /home/tnordlin
    Could not read reply from control connection — timed out.
    Passive mode refused.

    C:\Users\tn\Videos\M>ncftpget -T -R -u tnordlin -p password . /home/tnordlin
    Could not read reply from control connection — timed out.
    Passive mode refused.

    I have tried the Windows built-in ftp.exe, FTP in Windows Explorer,
    FireFTP, FileZilla, CuteFTP, and Cyberduck (on Mac), but they all fail to
    LIST the files of the folder or even to get a single file. Ncftp is the
    one that comes closest to working, since with it I can at least get
    files one by one when I specify their names.

    • 🛡️ Vivek Gite (Author and Admin) nixCraft Apr 24, 2012 @ 6:36

      Is there a firewall/proxy in between? If so, you need to fix it either at the server level or disable it (not recommended).

      Also, most FTP clients cannot handle that many file names. I suggest that you use the rsync tool to download all files.

  • Paul Sandel Apr 30, 2012 @ 14:24

    thanks for the contribution! I couldn’t get the directories and ended up using wget.

  • Siranjeevi Jun 15, 2012 @ 18:40

    Really awesome tool, thank you so much. 🙂

  • Rich South Sep 3, 2012 @ 9:32

    Thank you very much for this. I'd been using Dropbox as a local copy of my webpage across a few computers, and I didn't sync before I wiped one of them, losing a few local updates. But thanks to this I could download a copy off the FTP server.

  • Ambiorixg12 Nov 28, 2014 @ 3:21

    This is the syntax that worked for me

     ncftpget -u "" -p "mypass" -R /var/www/hosting /apps/asterisk/

    where /var/www/hosting is the local directory and /apps/asterisk/ is the remote directory

  • openairhoster Feb 20, 2016 @ 22:54

    When ncftpgetting a Joomla installation with many files from a (pretty fast) web server, I noticed it took quite a while (maybe a minute or so) before ncftp actually started getting files. It probably takes some time to calculate the number of files, directory sizes, and so on. So one should not give up too early if ncftp seems to stall in the beginning; just wait until the actual transfer starts.
