Linux: Download all files from an FTP server recursively

April 27, 2005 · last updated September 26, 2007


You can use the ncftpget command to download an entire FTP directory, including all of its subdirectories, from a remote FTP server. Let us say you would like to download the /www-data directory and every subdirectory inside it from the ftp.nixcraft.net server. You need to use the ncftpget command.

Install ncftp client

The ncftp client software can be downloaded from http://www.ncftp.com/ncftp/ and works with FreeBSD, Solaris, and almost all other UNIX variants. On Debian or Ubuntu you can also install ncftp with apt-get as follows:
$ sudo apt-get install ncftp
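
On RPM-based distributions such as CentOS or Fedora the package usually goes by the same name (on CentOS/RHEL it may require the EPEL repository), so something along these lines should work:
$ sudo yum install ncftp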

FTP get directory recursively

ncftpget is an Internet file transfer program for scripts and advanced usage. You need to use the command as follows:
$ ncftpget -R -v -u "ftpuser" ftp.nixcraft.net /home/vivek/backup /www-data
Where,

  • -R : Copy all subdirectories and files (recursive)
  • -v : Verbose, i.e. display download activity and progress
  • -u "USERNAME" : FTP server username; if skipped, ncftpget will try the anonymous username
  • ftp.nixcraft.net : FTP server name
  • /home/vivek/backup : Local directory everything is downloaded into
  • /www-data : Remote FTP directory you wish to copy
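
For unattended use, ncftpget also accepts -p to supply the password on the command line (one of the comments below uses it this way). Keep in mind that the password then ends up in your shell history and is visible to other local users in the process listing, so only do this on a trusted machine. A minimal example, with the password shown as a placeholder:
$ ncftpget -R -v -u "ftpuser" -p "PASSWORD" ftp.nixcraft.net /home/vivek/backup /www-data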

If you get an error that reads as follows:

tar: End of archive volume 1 reached
tar: Sorry, unable to determine archive format.
Could not read directory listing data: Connection reset by peer

Then add the -T option to the ncftpget command:

$ ncftpget -T -R -v -u "ftpuser" ftp.nixcraft.net /home/vivek/backup /www-data

Where,

  • -T : Do not try to use TAR mode with recursive mode (transfer files individually instead)
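
If you run such a download regularly, it is easy to wrap the command in a small script that cron can call. The sketch below is only an illustration: the host, username, and directories are placeholders, and it uses -p for brevity (see the note above about exposing the password on the command line; ncftpget can also read the login details from a file via its -f option):

#!/bin/sh
# Sketch of a recurring FTP mirror with ncftpget; adjust all values below.
FTPHOST="ftp.nixcraft.net"
FTPUSER="ftpuser"
FTPPASS="PASSWORD"              # placeholder; prefer a -f login file in practice
REMOTEDIR="/www-data"
LOCALDIR="/home/vivek/backup"

# -T skips TAR mode, -R recurses into subdirectories, -v shows progress.
if ! ncftpget -T -R -v -u "$FTPUSER" -p "$FTPPASS" "$FTPHOST" "$LOCALDIR" "$REMOTEDIR"
then
    echo "ncftpget failed to mirror $REMOTEDIR from $FTPHOST" >&2
    exit 1
fi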

Comments

1 haakondahl November 27, 2007 at 4:30 pm

This is GREAT! I have been struggling with transferring files from multiple partitions on one machine to multiple partitions on another, so NFS doesn't get it done no matter which is the server or client. I am now using ProFTPd as a server and ncftp as a client, and I am watching the partitions fill up as I expected, rather than as NFS and cp -vurf were getting it done.
Thank you for posting this fantastic guide!

2 Daniel August 26, 2008 at 1:27 pm

Great. I’ve been trying to make a full copy of my ftp site for a backup.

3 Chris March 6, 2009 at 4:09 pm

Thanks! Needed to back up my site, and didn't want to spend too much time writing a script…

4 nasser May 13, 2009 at 8:58 pm

very nice guide.

I installed the ncftp client on CentOS and it works.

5 computerjan June 9, 2009 at 10:04 am

Very useful guide!
I'm using it for my FTP server.

6 Ravi July 25, 2009 at 9:25 am

Hi Vivek,
Very nice and encouraging blog you have; it's an inspiration to all of us. It really works in every way, and I hope you feel the same.

Cheers
Go opensource……

Ravi Bhure

7 corky January 2, 2010 at 9:09 am

This is great… except where the directory on the FTP server happens to have a [ or ] character in it. A file in a directory you're getting can have those characters, just not the directory you specify in the command :(
ncftpget will return the error "remote directory listing failed."

Really unfortunate, this was *almost* the perfect solution for me.

8 Daniel April 16, 2010 at 4:03 am

Seems like it doesn't download all files. I just did this on one of my sites (with the -T option) and there were 3 directories that weren't downloaded… had to download them one by one. Is there a limit to how many files/directories it will get?

9 daveX99 May 20, 2010 at 6:25 pm

Hey – I was initially having trouble. Running the command as written, it would download any files in the remote directory, but not the subdirectories.

I had to add an asterisk to the line, so it looked like this (I ran it from the directory into which I wanted the files & directories to go):
$ ncftpget -R -v -u "username" ftp.remote_server.com ./ /Remote_Directory/*

Not sure why, but it did the trick.

-dave.

10 Anonymous May 20, 2010 at 11:14 pm

Actually, I am also missing a few, but not all, of the subdirectories…

At this point, I’m now doing this by hand. Any ideas as to why this doesn’t work would be welcome.
-dave.

11 Brad Griffith July 8, 2010 at 10:48 pm

A method that has worked much better for me is:

wget -r ftp://username:password@ftp.server.com/*

12 Karl November 12, 2010 at 10:07 am

Much better, quicker and easier method Brad, cheers!

13 k4m1 December 5, 2010 at 12:33 pm

Hehe, wget rules again.
Such a clean and cool solution, thanks a lot :)
Add -q for quiet mode.

14 jbn June 17, 2011 at 9:51 am

Worked great, except for it not doing recursion deep enough.

Currently trying
wget -m ftp://username:password@ftp.server.com/*
as it supposedly sets the correct options.

15 Pavel Kalvoda September 30, 2012 at 7:33 pm

Sweet! Thanks a lot

16 vamsi July 11, 2011 at 10:09 am

Thank you. Solved my problem.

17 Torbjörn April 22, 2012 at 9:18 am

Thank you for the nice blog post. However, I have some problems. Please take a look at the following ncftpget issue. I am trying to get all my private photo/video files from a folder on an FTP server containing almost 1.3 million files of size around 200-300 kB (in total 2GB). I first demonstrate that it works fine to get one file at a time, when I specify its name. But as soon as I use wildcard characters or try to recursively copy all files in the folder, I get all kinds of errors, as shown below. Could you tell me which parameters to use, or which FTP program to use if ncftp does not support transfer of so many files?

Copied from the Windows Command Prompt (my real password is replaced by "password"):
C:\Users\tn\Videos\M>ncftpget -u tnordlin -p password ftp1.storegate.com . /home/tnordlin/050101001202.3gp
050101001202.3gp: 257.58 kB 190.23 kB/s

C:\Users\tn\Videos\M>ncftpget -u tnordlin -p password ftp1.storegate.com . /home/tnordlin/05010100*
ncftpget /home/tnordlin/05010100*: remote directory listing failed.

C:\Users\tn\Videos\M>ncftpget -R -u tnordlin -p password ftp1.storegate.com . /home/tnordlin
Could not read reply from control connection — timed out.
Passive mode refused.

C:\Users\tn\Videos\M>ncftpget -E -R -u tnordlin -p password ftp1.storegate.com . /home/tnordlin
Could not accept a data connection (errno = 10060).

C:\Users\tn\Videos\M>ncftpget -E -F -R -u tnordlin -p password ftp1.storegate.com . /home/tnordlin
Could not read reply from control connection — timed out.
Passive mode refused.

C:\Users\tn\Videos\M>ncftpget -T -R -u tnordlin -p password ftp1.storegate.com . /home/tnordlin
Could not read reply from control connection — timed out.
Passive mode refused.

I have tried the Windows built-in ftp.exe, FTP in Windows Explorer, FireFTP, FileZilla, CuteFTP, and Cyberduck (on Mac), but they all fail to LIST the files of the folder or even to get a single file. Ncftp is the one that comes closest to working, since it is at least possible to get files one by one when I specify their names.

18 nixCraft April 24, 2012 at 6:36 am

Is there a firewall/proxy in between the connection? If so, you need to fix it either at the server level or disable it (not recommended).

Also, most FTP clients cannot handle that many file names. I suggest that you use the rsync tool to download all the files.
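
A minimal rsync sketch, assuming the server also offers SSH access (rsync does not speak plain FTP; the host and paths below are placeholders):
$ rsync -avz --progress user@server.example.com:/remote/photos/ /local/backup/photos/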

19 Paul Sandel April 30, 2012 at 2:24 pm

Thanks for the contribution! I couldn't get the directories and ended up using wget.

20 Siranjeevi June 15, 2012 at 6:40 pm

Really awesome tool, thank you so much. :)

21 Rich South September 3, 2012 at 9:32 am

Thank you very much for this. I'd been using Dropbox as a local copy of my webpage across a few computers and didn't sync before I wiped one of my computers, so I lost a few local updates, but thanks to this I could download a copy off the FTP server.
