You can use the ncftpget command to download an entire FTP directory, including all subdirectories, from a remote FTP server. Say you would like to download the /www-data directory and everything inside it from the ftp.nixcraft.net server. You need to use the ncftpget command.
Install the ncftp client
The ncftp client software can be downloaded from http://www.ncftp.com/ncftp/ and works on FreeBSD, Solaris, and almost all UNIX variants. On Debian or Ubuntu you can also install ncftp as follows:
$ sudo apt-get install ncftp
Get an FTP directory recursively
ncftpget is an Internet file transfer program for scripts and advanced usage. Use the command as follows:
$ ncftpget -R -v -u "ftpuser" ftp.nixcraft.net /home/vivek/backup /www-data
Where,
- -R : Copy all subdirectories and files (recursive)
- -v : Verbose, i.e. display download activity and progress
- -u "USERNAME" : FTP server username; if skipped, ncftpget tries the anonymous username
- ftp.nixcraft.net : FTP server name
- /home/vivek/backup : Local directory to download everything into
- /www-data : Remote FTP directory you wish to copy
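Since ncftpget is designed for scripting, the command above can be wrapped in a small cron-able backup script. This is only a sketch: the host, user, and paths are the placeholders from the article, and the DRY_RUN guard (my addition) prints the command instead of running it so you can verify it first.

```shell
#!/bin/sh
# Sketch of a backup wrapper around ncftpget (hypothetical values).
# With DRY_RUN=1 (the default here) the command is only printed.
FTP_HOST="ftp.nixcraft.net"
FTP_USER="ftpuser"
REMOTE_DIR="/www-data"
# Date-stamped destination so each run lands in its own directory
DEST="/home/vivek/backup/$(date +%Y-%m-%d)"

# -R recurse, -v show progress, -u supply the username
# (ncftpget prompts for the password; -p would expose it in the process list)
CMD="ncftpget -R -v -u $FTP_USER $FTP_HOST $DEST $REMOTE_DIR"

if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "$CMD"
else
    mkdir -p "$DEST" && $CMD
fi
```

Drop the DRY_RUN guard (or set DRY_RUN=0) once the printed command looks right for your server.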
If you get an error that reads as follows:
tar: End of archive volume 1 reached
tar: Sorry, unable to determine archive format.
Could not read directory listing data: Connection reset by peer
Then add the -T option to the ncftpget command:
$ ncftpget -T -R -v -u "ftpuser" ftp.nixcraft.net /home/vivek/backup /www-data
Where,
- -T : Do not try to use TAR mode with recursive mode
When ncftpgetting a Joomla installation with many files from a (pretty fast) web server, I noticed it took quite a while, maybe a minute or so, before ncftp actually started getting files. It probably takes some time to calculate the number of files, directory sizes, and so on. So one should not give up too early if ncftp seems to stall in the beginning. Just wait until the actual transfer starts.
This is the syntax that worked for me, where /var/www/hosting is the local directory and apps is the remote directory.
Thank you very much for this. I'd been using Dropbox as a local copy of my webpage across a few computers, and I didn't sync before I wiped one of my computers, so I lost a few local updates. Thanks to this I could download a copy off the FTP server.
Really awesome tool, thank you so much. 🙂
thanks for the contribution! I couldn’t get the directories and ended up using wget.
Thank you for the nice blog post. I however have some problems. Please take a look at the following ncftpget issue. I am trying to get all my private photo/video files from a folder on an FTP server containing almost 1.3 million files of around 200-300 kB each (in total 2 GB). I first demonstrate that it works fine to get one file at a time when I specify its name. But as soon as I use wildcards or try to recursively copy all files in the folder, I get all kinds of errors, as shown below. Could you tell me which parameters to use, or which FTP program to use if ncftp does not support transferring so many files?
Copied from the Windows Command Prompt (my real password is replaced by "password"):
C:\Users\tn\Videos\M>ncftpget -u tnordlin -p password ftp1.storegate.com . /home/tnordlin/050101001202.3gp
050101001202.3gp: 257.58 kB 190.23 kB/s
C:\Users\tn\Videos\M>ncftpget -u tnordlin -p password ftp1.storegate.com . /home/tnordlin/05010100*
ncftpget /home/tnordlin/05010100*: remote directory listing failed.
C:\Users\tn\Videos\M>ncftpget -R -u tnordlin -p password ftp1.storegate.com . /home/tnordlin
Could not read reply from control connection -- timed out.
Passive mode refused.
C:\Users\tn\Videos\M>ncftpget -E -R -u tnordlin -p password ftp1.storegate.com . /home/tnordlin
Could not accept a data connection (errno = 10060).
C:\Users\tn\Videos\M>ncftpget -E -F -R -u tnordlin -p password ftp1.storegate.com . /home/tnordlin
Could not read reply from control connection -- timed out.
Passive mode refused.
C:\Users\tn\Videos\M>ncftpget -T -R -u tnordlin -p password ftp1.storegate.com . /home/tnordlin
Could not read reply from control connection -- timed out.
Passive mode refused.
I have tried the Windows built-in ftp.exe, FTP in Windows Explorer, FireFTP, FileZilla, CuteFTP, and Cyberduck (on Mac), but they all fail to LIST the files of the folder, or even to get a single file. ncftp is the one that comes closest to working, since with it at least it is possible to get files one by one when I specify their name.
Is there a firewall or proxy in between the connection? If so, you need to fix it at the server level or disable it (not recommended).
Also, most FTP clients cannot handle that many file names. I suggest that you use the rsync tool to download all the files.
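Note that rsync works over SSH (or an rsync daemon), not FTP, so the suggestion above assumes the reader also has shell access to the server. A hedged sketch, with the host, user, and local directory taken as placeholders from the comment above; the echo (my addition) prints the command so it can be checked before running:

```shell
#!/bin/sh
# rsync sketch for mirroring a tree of ~1.3 million small files.
# -a archive mode (recursive, preserves times/permissions),
# -z compress in transit, --partial keep partially transferred files
# so an interrupted run can resume.
SRC="tnordlin@ftp1.storegate.com:/home/tnordlin/"
DST="./photos/"
RSYNC_OPTS="-az --partial --progress"
echo "rsync $RSYNC_OPTS $SRC $DST"   # printed for inspection; drop echo to run
```

Unlike most FTP clients, rsync streams the file list incrementally, which is why it copes better with directories this large.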
Thank you, this solved my problem.
A method that has worked much better for me is:
wget -r ftp://username:password@ftp.server.com/*
Much better, quicker and easier method, Brad, cheers!
hehe wget rules again.
So clean and cool solution, thanks a lot 🙂
-q for quiet mode.
Worked great, except it did not recurse deeply enough.
Currently trying
wget -m ftp://username:password@ftp.server.com/*
as it supposedly sets the correct options.
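For reference, wget's -m (mirror) option implies -r -N -l inf --no-remove-listing, which is why it recurses to unlimited depth where a plain -r stops at the default depth of 5. A sketch with a placeholder host (the URL layout is my assumption); the echo prints the command for inspection rather than transferring anything:

```shell
#!/bin/sh
# wget mirror sketch (hypothetical host/path).
# -m  : mirror mode, implies -r -N -l inf --no-remove-listing
# -nH : do not create a ftp.server.com/ directory locally
# --cut-dirs=1 : strip the leading /www-data path component locally
# Credentials in the URL appear in the process list; a ~/.netrc entry is safer.
HOST="ftp.server.com"
URL="ftp://$HOST/www-data/"
echo wget -m -nH --cut-dirs=1 "$URL"   # printed for inspection; drop echo to run
```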
Sweet! Thanks a lot
Hey, I was initially having trouble. Running the command as written, it would download any files in the remote directory, but not the subdirectories.
I had to add an asterisk to the line, so it looked like this (I ran it from the directory into which I wanted the files & directories to go):
$ ncftpget -R -v -u "username" ftp.remote_server.com ./ /Remote_Directory/*
Not sure why, but it did the trick.
-dave.
Actually, I am also missing a few, but not all of the sub directories…
At this point, I’m now doing this by hand. Any ideas as to why this doesn’t work would be welcome.
-dave.
Seems like it doesn't download all files. I just did this on one of my sites (with the -T option) and there were 3 directories that weren't downloaded; I had to download them one by one. Is there a limit to how many files/directories it will get?
This is great… except where the directory on the FTP server happens to have a [ or ] character in it. A file in a directory you're getting can have those characters, just not the directory you specify in the command 🙁
ncftpget will return the error “remote directory listing failed.”
Really unfortunate, this was *almost* the perfect solution for me.
Hi Vivek,
Very nice and encouraging blog you have; it's an inspiration to all of us. It really works in every way, and I hope you feel the same.
Cheers
Go opensource……
Ravi Bhure
Very useful guide !
I'm using it for my FTP server.
very nice guide.
I installed the ncftp client on CentOS and it works.
Thanks! needed to backup my site, and didn’t want to spend too much time writing a script…
Great. I’ve been trying to make a full copy of my ftp site for a backup.
This is GREAT! I have been struggling with transferring files from multiple partitions on one machine to multiple partitions on another, so NFS doesn't get it done no matter which is the server or client. I am now using ProFTPd as a server and ncftp as a client, and I am watching the partitions fill up as I expected, rather than as NFS and cp -vurf were getting it done.
Thank you for posting this fantastic guide!