Howto: Use wget to Recursively Download All FTP Directories


I would like to copy all of my files and directories from a UNIX server to a Linux workstation. How do I use the wget command to recursively download whole FTP directories stored at /home/tom/ on ftp.example.com to a local directory called /home/tom/backup?

GNU Wget is a free Linux / UNIX utility for non-interactive download of files from the Web and from FTP servers, as well as retrieval through HTTP proxies. Wget has been designed for robustness over slow dial-up internet or unstable network connections. If a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports resuming, wget will instruct the server to continue the download from where it left off.
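As a sketch of this retry-and-resume behavior, the following command (the URL and filename are placeholders) continues a partial download and keeps retrying on failure:

```shell
# -c (--continue) resumes a partially downloaded file instead of starting over.
# -t 0 (--tries=0) retries indefinitely on network errors.
# -w 5 (--wait=5) pauses 5 seconds between retries to avoid hammering the server.
wget -c -t 0 -w 5 ftp://ftp.example.com/pub/largefile.iso
```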

wget Recursive Example

You can use the -r (recursive retrieval) option as follows. You can also pass your FTP username and password to the wget command. First, make a backup directory in your $HOME directory:

mkdir ~/backup/
cd ~/backup/

Now, use wget command as follows:

wget -r ftp://username:password@ftp.example.com/
wget -r ftp://tom:myPassword@ftp.example.com/home/tom/
wget -r ftp://tom:myPassword@ftp.example.com/var/www/
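A password embedded in the URL can end up in your shell history and will break if it contains special characters such as "@" or ":". As an alternative sketch (the hostname and credentials are placeholders), wget's --ftp-user and --ftp-password options pass the same credentials separately:

```shell
# Same recursive fetch as above, but with credentials passed as options
# rather than embedded in the URL. Quoting the password protects any
# special characters from the shell.
wget -r --ftp-user=tom --ftp-password='myPassword' ftp://ftp.example.com/home/tom/
```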

wget Recursive FTP With Mirroring Option

The -m option turns on mirroring, i.e., it turns on recursion and time-stamping, sets infinite recursion depth, and keeps FTP directory listings:

wget -m ftp://username:Password@ftp.example.com/
wget -m ftp://username:Password@ftp.example.com/var/www/html
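For a large tree, a mirror run can take a long time, so it is often convenient to run it in the background with a log file. A sketch (the URL is a placeholder, and mirror.log is an arbitrary filename):

```shell
# -b (--background) detaches wget so it keeps running after you log out.
# -o mirror.log (--output-file) writes progress messages to mirror.log
# instead of the terminal; check the log later with: tail -f mirror.log
wget -m -b -o mirror.log ftp://username:Password@ftp.example.com/var/www/html
```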

Posted by: Vivek Gite

The author is the creator of nixCraft and a seasoned sysadmin, DevOps engineer, and trainer in the Linux operating system and Unix shell scripting.

4 comments

  1. The mirroring option is not retaining the timestamps of directories, only of the files inside them. Can you think of a solution for someone who wants to retain the directory timestamps as well?

  2. I also like to drop the directories that wget will normally create when using this command. So, starting with the examples above:

    wget -r ftp://username:password@ftp.example.com/home/username

    This command would create a local directory structure that looks like this:

    /ftp.example.com/home/username/

    So, I like to add the following command-line options:

    wget -r -nH --cut-dirs=2 ftp://username:password@ftp.example.com/home/username

    -nH removes the ftp.example.com hostname directory from the local copy. --cut-dirs=2 removes the /home/username path components from the local copy. So you end up with everything being saved to the current directory (recursively copying the contents of the remote /home/username folder).
