It is common practice to manage UNIX/Linux/BSD servers remotely over an ssh session. You may need to download software or other files for installation. A few powerful graphical download managers exist for Linux and UNIX-like operating systems:
- d4x: Downloader for X is a user-friendly Linux/Unix program with a nice X interface for downloading files from the Internet. It supports both the FTP and HTTP protocols and can resume interrupted downloads.
- kget: KGet is a versatile and user-friendly download manager for the KDE desktop.
- gwget / gwget2: Gwget is a download manager for the Gnome desktop.
However, when it comes to the command line (shell prompt), wget, the non-interactive downloader, rules. It supports the HTTP, HTTPS, and FTP protocols, along with authentication and tons of other options. Here are some tips to get the most out of it:
Download a Single File Using wget
Type the following command:
$ wget http://www.cyberciti.biz/here/lsst.tar.gz
$ wget ftp://ftp.freebsd.org/pub/sys.tar.gz
$ wget -O output.file http://nixcraft.com/some/path/file.name.tar.gz
$ wget http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2
How Do I Download Multiple Files Using wget?
Use the following syntax:
$ wget http://www.cyberciti.biz/download/lsst.tar.gz ftp://ftp.freebsd.org/pub/sys.tar.gz ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
You can create a shell variable that holds all URLs and use a BASH for loop to download all files:
URLS="http://www.cyberciti.biz/download/lsst.tar.gz
ftp://ftp.freebsd.org/pub/sys.tar.gz
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
http://xyz.com/abc.iso"
for u in $URLS
do
    wget "$u"
done
How Do I Read URLs From a File?
You can put all urls in a text file and use the -i option to wget to download all files. First, create a text file:
$ vi /tmp/download.txt
Append a list of urls:
http://www.cyberciti.biz/download/lsst.tar.gz
ftp://ftp.freebsd.org/pub/sys.tar.gz
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
http://xyz.com/abc.iso
Type the wget command as follows:
$ wget -i /tmp/download.txt
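If you prefer not to open an editor, the same list file can be built non-interactively with a here document (the URLs are the placeholders from the examples above):

```shell
# Build the URL list for wget -i without an editor (placeholder URLs)
cat > /tmp/download.txt <<'EOF'
http://www.cyberciti.biz/download/lsst.tar.gz
ftp://ftp.freebsd.org/pub/sys.tar.gz
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
http://xyz.com/abc.iso
EOF

# Each line is one URL; wget -i reads them in order
wc -l < /tmp/download.txt
```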
You can also force wget to get a partially-downloaded file. This is useful when you want to finish up a download started by a previous instance of wget, or by another program:
$ wget -c http://www.cyberciti.biz/download/lsst.tar.gz
$ wget -c -i /tmp/download.txt
Please note that the -c option only works with FTP / HTTP servers that support the "Range" header.
Force wget To Download All Files In Background
The -b option forces wget to go into the background immediately after startup. If no output file is specified via the -o option, output is redirected to the wget-log file:
$ wget -cb -o /tmp/download.log -i /tmp/download.txt
$ nohup wget -c -o /tmp/download.log -i /tmp/download.txt &
nohup runs the given COMMAND (in this example wget) with hangup signals ignored, so that the command can continue running in the background after you log out.
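The nohup pattern can be tried safely with a stand-in command before pointing it at a real download (sleep and echo substitute for a long wget run here):

```shell
# Run a job immune to hangup signals, capturing its output in a log file
# (sleep/echo stand in for wget in this sketch)
nohup sh -c 'sleep 1; echo finished' > /tmp/nohup-demo.log 2>&1 &
wait $!                    # in practice you would simply log out instead
cat /tmp/nohup-demo.log    # the log contains "finished" when the job is done
```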
How Do I Limit the Download Speed?
You can limit the download speed to a given number of bytes per second. The amount may be expressed in bytes, kilobytes with the k suffix, or megabytes with the m suffix. For example, --limit-rate=100k will limit the retrieval rate to 100KB/s. This is useful when, for whatever reason, you don't want wget to consume all of the available bandwidth, such as when downloading a large file like an ISO image:
$ wget -c -o /tmp/susedvd.log --limit-rate=50k ftp://ftp.novell.com/pub/suse/dvd1.iso
Use the m suffix for megabytes (--limit-rate=1m). The above command will limit the retrieval rate to 50KB/s. It is also possible to specify a disk quota for automatic retrievals to avoid a disk DoS attack. The following command will be aborted when the quota of 100MB is exceeded:
$ wget -cb -o /tmp/download.log -i /tmp/download.txt --quota=100m
From the wget man page:
Please note that Wget implements the limiting by sleeping the appropriate amount of time after a network read that took less time than specified by the rate. Eventually this strategy causes the TCP transfer to slow down to approximately the specified rate. However, it may take some time for this balance to be achieved, so don't be surprised if limiting the rate doesn't work well with very small files.
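As a rough worked example of what a rate cap costs in time (the file size here is illustrative, not from the article's downloads):

```shell
# How long a 100MB file takes at --limit-rate=50k (illustrative numbers)
size_kb=$((100 * 1024))                 # 100 MB expressed in KB
rate_kb=50                              # cap of 50KB/s
echo "$((size_kb / rate_kb)) seconds"   # → 2048 seconds (about 34 minutes)
```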
Use wget With Password Protected Sites
You can supply the HTTP username/password on the command line as follows:
$ wget --http-user=vivek --http-password=Secrete http://cyberciti.biz/vivek/csits.tar.gz
Another way to specify username and password is in the URL itself.
$ wget 'http://username:firstname.lastname@example.org/file.tar.gz'
Either method reveals your password to anyone who bothers to run the ps command:
$ ps aux
vivek 27370 2.3 0.4 216156 51100 ? S 05:34 0:06 /usr/bin/php-cgi
vivek 27744 0.1 0.0 97444 1588 pts/2 T 05:38 0:00 wget http://test:email@example.com/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2
vivek 27746 0.5 0.0 97420 1240 ? Ss 05:38 0:00 wget -b http://test:firstname.lastname@example.org/pub/linux/kernel/v2.6/testing/linux-2.6.36-rc3.tar.bz2
To prevent the passwords from being seen, store them in .wgetrc or .netrc, and make sure to protect those files from other users with chmod.
If the passwords are really important, do not leave them lying in those files either; edit the files and delete them after wget has started the download.
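A minimal ~/.netrc sketch, using the example.org host and placeholder credentials from the examples above:

```shell
# Store credentials in ~/.netrc so they never appear on the command line
# (host, login, and password are placeholders from this article's examples)
cat > ~/.netrc <<'EOF'
machine example.org
login username
password Secrete
EOF
chmod 600 ~/.netrc    # readable and writable only by you
```

With this in place, wget can pick up the credentials for example.org without them being visible in the ps output.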
Download All mp3 or pdf Files From a Remote FTP Server
Generally, you can use shell special characters, aka wildcards, such as * and ? to specify selection criteria for files. The same can be used with FTP servers while downloading files.
$ wget ftp://somedom.com/pub/downloads/*.pdf
OR
$ wget -g on ftp://somedom.com/pub/downloads/*.pdf
Use aget When You Need Multithreaded HTTP Download
aget fetches HTTP URLs in a manner similar to wget, but segments the retrieval into multiple parts to increase download speed. It can be many times as fast as wget in some circumstances (it is just like FlashGet under MS Windows, but with a CLI):
$ aget -n=5 http://download.soft.com/soft1.tar.gz
The above command will download soft1.tar.gz in 5 segments.
Please note that the wget command is available on Linux and UNIX/BSD-like operating systems.
See the wget(1) man page for more advanced options.