Linux wget: Your Ultimate Command Line Downloader

It is common practice to manage UNIX/Linux/BSD servers remotely over an ssh session. You may need to download software or other files for installation. There are a few compelling graphical download managers for Linux and UNIX-like operating systems:

  • kget: KGet is a versatile and user-friendly download manager for the KDE desktop.
  • gwget / gwget2: Gwget is a download manager for the GNOME desktop.
  • uget: uGet is an easy-to-use download manager written in GTK+.

When it comes to a command line or shell prompt downloader, wget, the non-interactive downloader, rules. It supports HTTP, HTTPS, FTP, and other protocols, along with authentication and tons of other options. Here are some tips to get the most out of it:

Linux wget command examples

The syntax is:
wget url
wget [options] url

Let us see some common Linux wget command examples, syntax and usage.

How to install wget command on Linux

Use the apt command/apt-get command if you are on Ubuntu/Debian/Mint Linux:
$ sudo apt install wget
Fedora Linux users should type the dnf command:
$ sudo dnf install wget
RHEL/CentOS/Oracle Linux users should type the yum command:
$ sudo yum install wget
SUSE/OpenSUSE Linux users should type the zypper command:
$ sudo zypper install wget
Arch Linux users should type the pacman command:
$ sudo pacman -S wget
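Once installed, you can confirm that wget is available and see which version you have:
$ wget --version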

Download a Single File Using wget

Type any of the following commands:
$ wget http://www.cyberciti.biz/here/lsst.tar.gz
$ wget ftp://ftp.freebsd.org/pub/sys.tar.gz
$ wget -O output.file http://nixcraft.com/some/path/file.name.tar.gz
$ wget http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2

Sample outputs:

Fig.01: wget command in action on my Linux box
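If you want the file saved under a specific directory instead of the current one, wget's -P (--directory-prefix) option can be combined with any of the commands above. A small example using the first sample URL:
$ wget -P /tmp http://www.cyberciti.biz/here/lsst.tar.gz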

How Do I Download Multiple Files Using wget?

Use the following syntax:
$ wget http://www.cyberciti.biz/download/lsst.tar.gz ftp://ftp.freebsd.org/pub/sys.tar.gz ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
You can create a shell variable that holds all URLs and use a Bash for loop to download all the files:

URLS="http://www.cyberciti.biz/download/lsst.tar.gz \
ftp://ftp.freebsd.org/pub/sys.tar.gz \
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm \
http://xyz-url/abc.iso"
for u in $URLS
do
 wget "$u"
done
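If you prefer, the same loop can be written with a Bash array instead of relying on word splitting of a single string. A minimal sketch using two of the URLs from above:

URLS=(
"http://www.cyberciti.biz/download/lsst.tar.gz"
"ftp://ftp.freebsd.org/pub/sys.tar.gz"
)
for u in "${URLS[@]}"
do
 wget "$u"
done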

How Do I Read URLs From a File?

You can put all URLs in a text file and use the -i option to make wget download all the files. First, create a text file:
$ vi /tmp/download.txt
Append a list of URLs:

http://www.cyberciti.biz/download/lsst.tar.gz
ftp://ftp.freebsd.org/pub/sys.tar.gz
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
http://xyz-url/abc.iso

Type the wget command as follows:
$ wget -i /tmp/download.txt
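Alternatively, build the same file non-interactively with a here document and then run the download:

$ cat > /tmp/download.txt <<'EOF'
http://www.cyberciti.biz/download/lsst.tar.gz
ftp://ftp.freebsd.org/pub/sys.tar.gz
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
http://xyz-url/abc.iso
EOF
$ wget -i /tmp/download.txt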

Resume Downloads

You can also force wget to continue getting a partially-downloaded file, i.e., resume downloads. This is useful when you want to finish a download started by a previous instance of wget, or by another program:
$ wget -c http://www.cyberciti.biz/download/lsst.tar.gz
$ wget -c -i /tmp/download.txt

Please note that the -c option only works with FTP/HTTP servers that support the Range header.
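To check whether a given HTTP server advertises byte-range support before relying on -c, print its response headers with the --spider and -S (--server-response) options and look for an Accept-Ranges header (wget logs to stderr, hence the redirect):
$ wget --spider -S http://www.cyberciti.biz/download/lsst.tar.gz 2>&1 | grep -i 'accept-ranges'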

Force wget To Download All Files In Background

The -b option forces wget to go into the background immediately after startup. If no log file is specified via the -o option, output is redirected to the wget-log file:
$ wget -cb -o /tmp/download.log -i /tmp/download.txt
OR
$ nohup wget -c -o /tmp/download.log -i /tmp/download.txt &
nohup runs the given COMMAND (in this example wget) with hangup signals ignored, so that the command can continue running in the background after you log out.
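Because the transfer now runs in the background, you can watch its progress by following the log file:
$ tail -f /tmp/download.log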

How Do I Limit the Download Speed?

You can limit the download speed to a given number of bytes per second. The amount may be expressed in bytes, kilobytes with the k suffix, or megabytes with the m suffix. For example, --limit-rate=100k will limit the retrieval rate to 100KB/s. This is useful when, for whatever reason, you don't want wget to consume the entire available bandwidth, such as when downloading a large file like an ISO image:
$ wget -c -o /tmp/susedvd.log --limit-rate=50k ftp://ftp.novell.com/pub/suse/dvd1.iso
Use the m suffix for megabytes (--limit-rate=1m). The above command will limit the retrieval rate to 50KB/s. It is also possible to specify a disk quota for automatic retrievals to avoid a disk DoS attack. The following retrieval will be aborted once the quota (100MB) is exceeded:
$ wget -cb -o /tmp/download.log -i /tmp/download.txt --quota=100m
From the wget man page:

Please note that Wget implements the limiting by sleeping the appropriate amount of time after a network read that took less time than specified by the rate. Eventually this strategy causes the TCP transfer to slow down to approximately the specified rate. However, it may take some time for this balance to be achieved, so don’t be surprised if limiting the rate doesn’t work well with very small files.
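For example, to cap a single large download at one megabyte per second while keeping it resumable:
$ wget -c --limit-rate=1m http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2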

Use wget With Password Protected Sites

You can supply the HTTP username/password for the server as follows:
$ wget --http-user=vivek --http-password=Secrete http://cyberciti.biz/vivek/csits.tar.gz
Another way to specify username and password is in the URL itself.
$ wget 'http://username:password@cyberciti.biz/file.tar.gz'
Either method reveals your password to anyone who bothers to run the ps command:
$ ps aux
Sample outputs:

vivek     27370  2.3  0.4 216156 51100 ?        S    05:34   0:06 /usr/bin/php-cgi
vivek     27744  0.1  0.0  97444  1588 pts/2    T    05:38   0:00 wget http://test:test@www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2
vivek    27746  0.5  0.0  97420  1240 ?        Ss   05:38   0:00 wget -b http://test:test@www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2

To prevent the passwords from being seen, store them in .wgetrc or .netrc, and make sure to protect those files from other users with chmod. If the passwords are really important, do not leave them lying in those files either; edit the files and delete them after wget has started the download.
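A minimal ~/.netrc entry looks like the following (the host and credentials are the placeholder values from the example above); lock the file down so only you can read it:

machine cyberciti.biz
login vivek
password Secrete

$ chmod 600 ~/.netrc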

Download all mp3 or pdf files from a remote FTP server

Generally, you can use shell special characters, aka wildcards, such as *, ?, and [ ] to specify selection criteria for files. The same can be used with FTP servers while downloading files.
$ wget ftp://somedom-url/pub/downloads/*.pdf

OR
$ wget --glob=on ftp://somedom.com/pub/downloads/*.pdf
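Whichever form you use, it is a good idea to quote the URL so that your local shell does not try to expand the * wildcard itself before wget sees it:
$ wget --glob=on 'ftp://somedom.com/pub/downloads/*.pdf'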

Use lftp when you need multithreaded download

lftp fetches HTTP URLs in a manner similar to wget, but it can segment the retrieval into multiple parts to increase the download speed. It gets the specified file using several connections. This can speed up the transfer, but it loads the network heavily, impacting other users. Use it only if you really have to transfer the file ASAP.
$ lftp -e 'pget -n 5 -c url/file.tar.gz; exit'
The above command will download file.tar.gz using 5 segments/connections.

Conclusion

Please note that the wget command is available on Linux, *BSD, macOS, and other Unix-like operating systems. See the wget(1) man page for more advanced options. Type the following man command:
$ man wget

Comments
  • Srikanth Ravuri Apr 20, 2016 @ 18:33

    I am using the wget command to pull files from an external website. It executes successfully, but I see a file which contains HTML code rather than the file I am expecting.

    Logs from the command execution…

    $ wget --ftp-user=xxxxxxx --ftp-password=yyyyy "http://download.nlm.nih.gov/umls/kss/rxnorm/RxNorm_full_04042016.zip"
    --2016-04-20 11:31:25--  http://download.nlm.nih.gov/umls/kss/rxnorm/RxNorm_full_04042016.zip
    Resolving download.nlm.nih.gov... 130.14.16.113, 2607:f220:41e:1016::113
    Connecting to download.nlm.nih.gov|130.14.16.113|:80... connected.
    HTTP request sent, awaiting response... 302 Found
    Location: https://utslogin.nlm.nih.gov/cas/login?service=https%3a%2f%2fdownload.nlm.nih.gov%2fumls%2fkss%2frxnorm%2fRxNorm_full_04042016.zip [following]
    --2016-04-20 11:31:26--  https://utslogin.nlm.nih.gov/cas/login?service=https%3a%2f%2fdownload.nlm.nih.gov%2fumls%2fkss%2frxnorm%2fRxNorm_full_04042016.zip
    Resolving utslogin.nlm.nih.gov... 130.14.16.164, 2607:f220:41e:1016::164
    Connecting to utslogin.nlm.nih.gov|130.14.16.164|:443... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: unspecified [text/html]
    Saving to: “RxNorm_full_04042016.zip.3”
    
        [                                                                                                                             ] 9,947       --.-K/s   in 0.1s
    
    2016-04-20 11:31:26 (98.5 KB/s) - “RxNorm_full_04042016.zip.3” saved [9947]
    

    Can you please help me understand what I am missing using wget?

    Thank you

  • Tony pezzella Aug 13, 2013 @ 1:19

    How do I upload a file using wget?

  • hitayezu Jun 9, 2013 @ 12:43

    It's a great command, thanks

  • mike Sep 18, 2012 @ 10:57

    I've been with Windows my whole life; when I started computer science in college I started playing around with Linux, and it's a whole new world!

  • deepu Aug 25, 2012 @ 20:03

    hi,
    I have to download a .txt file on a daily basis from one of our servers and save it onto a UNIX system; so far I have done all of this manually. I want to create an automated script that can download and save the file into a UNIX directory.

    Thanks in advance

  • netlord Jul 10, 2012 @ 15:05

    hi
    and how can I (most easily) get files from SourceForge?
    this "automatic closest proxy" selection is awful!

  • Baronsed Jul 1, 2012 @ 12:36

    And what about when you need to get the list of links contained in a web page?
    $ lynx -dump -listonly file > html-list

    ^^

  • mohit Dec 18, 2011 @ 18:24

    How can I download a file from an https website using wget?

  • gbviswanadh Sep 18, 2011 @ 11:46

    How can I extract files that are in the format 111.zip.bz2 and abc.zip.bz2? Is it possible to bunzip2 and unzip these files in one command?

  • Ajay Jul 4, 2011 @ 13:55

    URLS="http://www.cyberciti.biz/download/lsst.tar.gz
    ftp://ftp.freebsd.org/pub/sys.tar.gz
    ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
    http://xyz.com/abc.iso"
    for u in $URLS
    do
    wget "$u"
    done

  • Ajay Jul 4, 2011 @ 13:46

    You can write a small shell script such that for 5 different files you write 5 wget commands inside the shell script, and finally run the shell script.

  • Krishna Jul 4, 2011 @ 12:13

    When I use wget to download 5 files from server using a script it sends 1 GET request and waits for server to respond then sends the 2nd and so on. I want the GET to be sent simultaneously irrespective of the response from the server at the same moment. How to do this? Any insights? Thanks

  • Bipin May 20, 2011 @ 20:23

    Any idea how to supply the password from a prompt with the wget command?

    Thanks,
    Bipin Bahuguna

  • Ajay May 3, 2011 @ 10:09

    Hello,

    how can i use the wget command if the following situation arises

    1) When connected to a particular server, the wget command will download the file. If we set a crontab for this, the download will happen at the mentioned time. But if there is a problem with the server and the connection fails, the wget command will overwrite the existing file with a dummy file, thereby losing the contents.
    Is there any way to prevent this? i.e., when not connected to the server, the wget command should not create or overwrite the file.

  • m3nt4t Apr 17, 2011 @ 11:34

    The easiest way to avoid changing owner/group of download files via wget is to use sudo (run process as other user) or set the download path to a mount point with specific uid/gid (e.g. uid=henry,gid=henry)

  • Dinesh Jadhav Feb 15, 2011 @ 7:59

    I want to change the owner of files downloaded with the wget command; whatever files get downloaded need their owner changed.

  • zack Feb 9, 2011 @ 4:40
    wget --http-user=***** --http-passwd=********** -O test.txt -o LogFile.txt http://ds.data.jma.go.jp/gmd/ship/data/ShipData/dd/Z__C_RJTD_yyyymmddhhnn--_SHIP_ORIGINAL_AN.txt
    

    Please help me. I want to download data from this website – http://ds.data.jma.go.jp/gmd/ship/download/howtoget.html using above wget command. The problem is how to download the whole one day data with minutes and hour is changing such as starting from 0000UTC-8am(Malaysia local time) 1/1/2010 until 1/1/2010-2350UTC. The data is available for every 10 minutes and time is in UTC format(hhmm).

    variable
    yyyy=year, mm=month, dd=day, hh=hour, nn=minute

    so for this example, yyyy=2010, mm=01, dd=01, hh=hour, nn=minute, user=mmd, passwd=bmlofeb2011

    I hope anyone can help me how to download the data.

  • Priya Jan 7, 2011 @ 10:51

    Sir,
    I have used the command given as an example, but the error below comes up.
    wget url
    --21:55:38-- url
    Resolving www.cyberciti.biz... 75.126.153.206, 2607:f0d0:1002:51::4
    Connecting to www.cyberciti.biz|75.126.153.206|:80... failed: Network is unreachable.
    Connecting to www.cyberciti.biz|2607:f0d0:1002:51::4|:80... failed: Network is unreachable.
