It is common practice to manage UNIX/Linux/BSD servers remotely over an ssh session, and you often need to download software or other files for installation. A few compelling graphical download managers exist for Linux and UNIX-like operating systems, but when it comes to the command line or shell prompt, the non-interactive wget downloader rules. The wget command supports HTTP, FTP, HTTPS and other Internet protocols, offers an authentication facility, and provides tons of other options. Here are some tips to get the most out of the wget command on Linux and Unix-like systems.

This page explains how to use the wget command with practical examples and comprehensive explanations of the most common options so that you can become a pro CLI user. Let us get started with wget.


Linux wget command examples

The syntax is as follows for the wget command on Linux, macOS, and Unix systems:
wget url
wget [options] url

The URL where the files are located is always the last parameter. For instance: https://www.cyberciti.biz/files/somefile.tar.gz. Let us see some common Linux wget command examples, syntax, and usage.

How to install wget command on Linux

Use the apt command/apt-get command if you are on Ubuntu/Debian/Mint Linux:
$ sudo apt install wget
Fedora Linux users should type the dnf command:
$ sudo dnf install wget
RHEL/CentOS/Oracle/Rocky and Alma Linux users should type the yum command to install the wget command:
$ sudo yum install wget
SUSE/OpenSUSE Linux users should type the zypper command to install the wget command:
$ zypper install wget
Arch Linux users should type the pacman command to install the GNU wget command:
$ sudo pacman -S wget
Alpine Linux users try the apk command to install the GNU wget command instead of using busybox wget:
# apk add wget

Installing GNU wget on Unix-like systems

FreeBSD users try the pkg command to install the GNU wget command:
$ sudo pkg install wget
macOS users should first install Homebrew to get the brew package manager and then type the following brew command to install the GNU wget command:
$ brew install wget
OpenBSD and NetBSD users try the following pkg_add command to install the wget command:
$ doas pkg_add wget
Now that the GNU wget command is installed on your Linux or Unix-like system, it is time to learn how to use it to download stuff from the LAN, WAN, or Internet. Let us get some hands-on experience to increase our productivity.
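
To verify the installation and display the installed version, run:
$ wget --version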

Downloading a single file using the wget command

Type the following command to grab the latest Linux kernel source code, which you can then use to compile and install the Linux kernel:
$ wget https://cdn.kernel.org/pub/linux/kernel/v5.x/linux-5.15.5.tar.xz

Fig: How to download a file with the wget command on a Linux or Unix machine

The above output clearly shows that the wget command connects to the cdn.kernel.org server to download the file. It also displays the download progress, file name, download speed, and other information. By default, the file is saved in the current working directory. However, you can save the downloaded file to a different directory and under a different file name.
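
For example, after downloading the kernel tarball above, you can confirm that it landed in the current working directory with the ls command:
$ ls -l linux-5.15.5.tar.xz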

More wget command examples

Try the following wget command examples:
$ wget http://www.cyberciti.biz/here/lsst.tar.gz
$ wget -q ftp://ftp.freebsd.org/pub/sys.tar.gz
$ wget -O output.file http://nixcraft.com/some/path/file.name.tar.gz
$ wget --output-document=rhel9.iso 'https://access.cdn.redhat.com/content/origin/23e6decd3160473/rhel-baseos-9.0-beta-0-x86_64-dvd.iso?_auth_=xyz'
$ wget http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2

Fig.01: wget command in action on my Linux box

How to save the downloaded file under a different name and directory using wget

The -O FILE or --output-document=FILE option tells wget to download the file from the given URL and save it under the name stated by this option. Say you are downloading https://domain/foo-bar-xyz-long-file.tzx and you just want to save it as file.tzx; then:
$ wget --output-document=file.tzx 'https://domain/foo-bar-xyz-long-file.tzx?key=xyz'
## OR use the capital '-O' as follows ##
$ wget -O file.tzx 'https://domain/foo-bar-xyz-long-file.tzx?key=xyz'

In other words, wget downloads the file and writes it to the given FILE named file.tzx. To save the downloaded file under a different name inside /tmp/, pass the -O option as follows:
$ wget -O /tmp/file.tzx \
'https://domain/foo-bar-xyz-long-file.tzx?key=xyz'

You can set the directory prefix by passing the -P prefix option. The directory prefix is the directory where all other files and subdirectories will be saved to, i.e. the top of the retrieval tree. The default is . (the current directory):
$ wget -P /tmp/ url
$ wget -P /isoimages/ https://ur1/freebsd.iso
$ wget -P /isoimages/ https://ur2/openbsd.iso
$ wget -P /isoimages/ https://ur3/rhel.iso
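
When re-running such batch downloads, you may want to add the -nc (--no-clobber) option so that files already present in the prefix directory are not downloaded again. For example, reusing the placeholder URL above:
$ wget -nc -P /isoimages/ https://ur1/freebsd.iso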

Logging messages to FILE when using wget

To log messages to FILE, use:
$ wget --output-file=log.txt https://url1/..
## small '-o' ##
$ wget -o download.log.txt https://url2/..

We can combine both options as follows:
$ wget -o download.log \
-O rhel9.iso \
'https://access.cdn.redhat.com/content/origin/23e6decd3160473/rhel-baseos-9.0-beta-0-x86_64-dvd.iso?_auth_=xyz'

To view the log, use the cat command/more command/less command:
$ cat download.log
$ more download.log
## or use the grep command/egrep command ##
$ grep 'error' download.log
$ egrep -iw 'warn|error' download.log
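
Note that the -o option overwrites the log file on every run. To keep one log across multiple wget invocations, use the -a (--append-output) option instead:
$ wget -a download.log https://url3/..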

How do I download multiple files using wget?

Use the following wget syntax when you wish to download stuff from multiple URLs. For example:
$ wget http://www.cyberciti.biz/download/lsst.tar.gz \
ftp://ftp.freebsd.org/pub/sys.tar.gz \
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm

You can create a shell variable that holds all URLs and use a BASH 'for loop' to download all files using wget:

URLS="http://www.cyberciti.biz/download/lsst.tar.gz \
ftp://ftp.freebsd.org/pub/sys.tar.gz \
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm \
http://xyz-url/abc.iso"
for u in $URLS
do
 wget "$u"
done
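
If sequential downloads are too slow, you can feed the same list to the xargs command to run several wget processes in parallel. This is a sketch rather than part of the original example; adjust -P to a polite number of concurrent jobs:
$ printf '%s\n' $URLS | xargs -n 1 -P 4 wget -q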

How do I read a list of URLs from a text file and grab files using wget?

You can put all URLs in a text file and use the -i option to tell wget to download all the files. First, create a text file as follows using a text editor such as nano or vi/vim:
$ vi /tmp/download.txt
Append a list of URLs:

https://www.cyberciti.biz/download/lsst.tar.gz
ftp://ftp.freebsd.org/pub/sys.tar.gz
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
http://xyz-url/abc.iso

Then type the wget command as follows:
$ wget -i /tmp/download.txt
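
The file name - is special: it tells wget to read the URL list from standard input. This lets you filter the list with other tools first. For example, fetch only the ISO images from our list:
$ grep '\.iso$' /tmp/download.txt | wget -i -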

Resuming downloads with wget (a time saving tip)

You can also force wget to continue getting a partially-downloaded file, i.e. resume downloads. This is useful when you want to finish a download started by a previous instance of wget, or by another program. For instance:
$ wget -c http://www.cyberciti.biz/download/lsst.tar.gz
$ wget -c -i /tmp/download.txt

Please note that the -c option only works with FTP/HTTP/HTTPS servers that support the "Range" header.
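
For flaky connections, the -c option combines well with wget's retry options. Here is a hedged example (the URL is a placeholder) that retries up to 10 times and waits progressively longer, up to 30 seconds, between attempts:
$ wget -c --tries=10 --waitretry=30 --retry-connrefused https://url1/large.iso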

Forcing wget to download all files in the background

The -b option is used to force wget to go into the background immediately after startup. If no log file is specified via the -o option, messages are redirected to the wget-log file:
$ wget -cb -o /tmp/download.log -i /tmp/download.txt
OR
$ nohup wget -c -o /tmp/download.log -i /tmp/download.txt &
The nohup command runs the given COMMAND (in this example, wget) with hangup signals ignored, so that the command can continue running in the background after you log out.
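
Since the download keeps running after you log out, you can check on its progress later by following the log file:
$ tail -f /tmp/download.log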

How do I limit the download speed with the wget command?

You can limit the download speed to a given number of bytes per second when using the wget command. The amount may be expressed in bytes, kilobytes with the k suffix, or megabytes with the m suffix. For example, --limit-rate=100k will limit the retrieval rate to 100KB/s for your wget download session. This is useful when, for whatever reason, you don't want wget to consume the entire available bandwidth. Hence, this is useful when you want to download a large file, such as an ISO image, from mirrors:
$ wget -c -o /tmp/susedvd.log --limit-rate=50k ftp://ftp.novell.com/pub/suse/dvd1.iso
The above command will limit the retrieval rate to 50KB/s; use the m suffix for megabytes (--limit-rate=1m). It is also possible to specify a disk quota for automatic retrievals to avoid a disk DoS attack. The following command will be aborted when the 100MB disk quota is exceeded:
$ wget -cb -o /tmp/download.log -i /tmp/download.txt --quota=100m
From the wget man page:

Please note that Wget implements the limiting by sleeping the appropriate amount of time after a network read that took less time than specified by the rate. Eventually this strategy causes the TCP transfer to slow down to approximately the specified rate. However, it may take some time for this balance to be achieved, so don’t be surprised if limiting the rate doesn’t work well with very small files.
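
Rate limiting pairs well with a delay between retrievals when grabbing many files from the same mirror. Here is a polite batch sketch combining the options above with --wait:
$ wget -i /tmp/download.txt --limit-rate=100k --wait=2
The --wait=2 option makes wget pause two seconds between file retrievals.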

Using the wget command with password-protected sites

You can supply the HTTP username/password for the server as follows:
$ wget --http-user=vivek --http-password=Secrete https://cyberciti.com/vivek/csits.tar.gz
Another way to specify the username and password is in the URL itself. For example:
$ wget 'http://username:password@cyberciti.biz/file.tar.gz'
Either method reveals your password to anyone who bothers to run the ps command:
$ ps aux
Sample outputs:

vivek     27370  2.3  0.4 216156 51100 ?        S    05:34   0:06 /usr/bin/php-cgi
vivek     27744  0.1  0.0  97444  1588 pts/2    T    05:38   0:00 wget http://test:test@www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2
vivek    27746  0.5  0.0  97420  1240 ?        Ss   05:38   0:00 wget -b http://test:test@www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2

To prevent the passwords from being seen, store them in .wgetrc or .netrc, and make sure to protect those files from other users with the chmod command. If the passwords are really important, do not leave them lying in those files either; edit the files and delete them after wget has started the download. Under Linux you can hide processes from other users and the ps command. The same can be done with FreeBSD to prevent users from seeing information about processes owned by other users via the ps command or top command/htop command.
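
For example, a minimal ~/.netrc entry looks as follows (the hostname and credentials are placeholders):

machine cyberciti.com
login vivek
password Secrete

Protect the file so that only you can read it:
$ chmod 600 ~/.netrc
Alternatively, pass the username with the --user option and let wget prompt for the password interactively via the --ask-password option, so that the password never shows up in the process list:
$ wget --user=vivek --ask-password https://cyberciti.com/vivek/csits.tar.gz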

Download all mp3 or pdf files from a remote FTP server using wget

Generally, you can use shell special characters aka wildcards such as *, ?, [] to specify selection criteria for files. The same can be used with FTP servers while downloading files. Quote the URL so that the wildcard is passed to wget rather than expanded by your shell. For instance:
$ wget 'ftp://some-dot-com/url/pub/downloads/*.pdf'
$ wget 'ftp://somedom-url/pub/downloads/*.pdf'

OR
$ wget 'ftp://somedom-com/pub/downloads/*.mp4'
FTP globbing is on by default when the URL contains wildcard characters; pass the --no-glob option if you ever need to turn it off. (Older wget releases used -g on to enable it explicitly.)

Downloading a file to stdout and shell pipes: wget command example

In this example, the wget command will write the file to the standard output device (-O -) and pipe it to the shell command of your choice, such as the tar command:
$ wget -q -O - 'https://url1/file.tar.xz' | tar -Jxf - -C /tmp/data/
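
The same trick is handy for verifying a file without saving it. For example, pipe the download to the sha256sum command and compare the printed hash against the one published by the mirror:
$ wget -q -O - 'https://url1/file.tar.xz' | sha256sum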

How to make a mirror of a website using wget

Some websites/ftp servers may throttle requests or ban your IP address for excessive requests. So use this option carefully and do not overload remote servers.

We can create a mirror of a website with wget by passing the -m or --mirror option:
$ wget -m https://url/
$ wget --mirror https://url/
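
The -m option turns on recursive retrieval with time-stamping and infinite recursion depth. For a browsable local copy, it is commonly combined with a few more options. A hedged sketch (tune the delay to respect the remote server):
$ wget --mirror --convert-links --adjust-extension \
--page-requisites --no-parent --wait=1 https://url/
Here, --convert-links rewrites links so the copy works offline, --adjust-extension adds matching file extensions where needed, --page-requisites downloads the images and CSS required to render pages, --no-parent keeps wget below the starting directory, and --wait=1 sleeps one second between requests.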

Changing the wget User-Agent

We can identify ourselves as AGENT (the -U AGENT or --user-agent=AGENT option) instead of the default Wget/VERSION. For example, change the user agent to 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:94.0) Gecko/20100101 Firefox/94.0':
$ wget -U 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:94.0) Gecko/20100101 Firefox/94.0' \
https://www.nixcraft.com/robots.txt

Getting around HTTPS (TLS) certificate errors when using wget

Don’t validate the server’s HTTPS/TLS certificate by passing the --no-check-certificate option:
$ wget --no-check-certificate \
https://www.cyberciti.biz/robots.txt
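
Skipping certificate validation is risky on untrusted networks. If the error is caused by a private or self-signed certificate authority, a safer alternative is to point wget at the CA file instead (the path below is a placeholder):
$ wget --ca-certificate=/path/to/ca.pem https://www.cyberciti.biz/robots.txt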

Tip: Use lftp when you need multi-threaded downloads instead of wget

lftp fetches HTTP URLs in a manner similar to wget, but it segments the retrieval into multiple parts to increase the download speed. It gets the specified file using several connections. This can speed up the transfer, but it loads the network heavily, impacting other users. Use it only if you really have to transfer the file ASAP.
$ lftp -e 'pget -n 5 -c url/file.tar.gz; exit'
The above command will download file.tar.gz using five segments/connections.


Examples of GUI-based file downloaders for Linux and Unix

Not a fan of the wget CLI? Try the following GUI-based tools:

  • KGet is a versatile and user-friendly download manager for the KDE desktop.
  • Gwget is a download manager for the GNOME desktop.
  • uGet is an easy-to-use download manager written in GTK+.

Conclusion

The GNU wget command is a powerful command-line utility that can download files, resume broken partial downloads, mirror HTTP/FTP sites, handle user authentication, throttle download speed, and much more. Please note that the wget command is available on Linux and *BSD/macOS/Unix-like systems. Hence, do check the wget command documentation and source code for more advanced options. Of course, seasoned developers, users, and sysadmins can always type the following man command or help command to read the docs at the Linux or Unix server prompt:
$ man wget
$ wget --help
# get a short help for wget options using the egrep command as filter #
$ wget --help | grep -Ewi -- '-(i|w|e)'

