I have turned on gzip compression, as modern web browsers support and accept compressed data transfer. However, I'm unable to do so with the wget command. How do I force wget to download a file using gzip encoding?
GNU wget is a free utility, installed by default on most Linux distributions, for non-interactive download of files from the Web. It supports the HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.
You can save the headers sent by the HTTP server to a file, preceding the actual contents, with an empty line as the separator.
The --header option
The syntax is as follows:
wget --header='HEADER-LINE' http://server1.cyberciti.biz/file.tar.gz
wget -option1 --header='HEADER-LINE' http://server1.cyberciti.biz/images.bmp
### compressed speed test ###
wget -O /dev/null --header='HEADER-LINE' http://server1.cyberciti.biz/lib1html5v2.js
### debug on screen ###
wget -O- --header='HEADER-LINE' http://server1.cyberciti.biz/file.tar.gz
You can send HEADER-LINE along with the rest of the headers in each HTTP request. The supplied header is sent as-is, which means it must contain a name and a value separated by a colon, and must not contain newlines. You may define more than one additional header by specifying --header more than once, as follows:
wget --header='Accept-Charset: iso-8859-2' --header='Accept-Language: hr' http://server1.cyberciti.biz/file.css
Example: Testing gzip encoding with wget command
To send gzip encoding request, enter:
$ wget --header='Accept-Encoding: gzip' http://www.cyberciti.biz/hardware/linux-find-and-recover-wasted-disk-space/
--2012-10-28 17:48:06--  http://www.cyberciti.biz/hardware/linux-find-and-recover-wasted-disk-space/
Resolving www.cyberciti.biz... 188.8.131.52
Connecting to www.cyberciti.biz|184.108.40.206|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `index.html.54'

    [ <=>                                   ] 12,657      --.-K/s   in 0.02s

2012-10-28 17:48:07 (583 KB/s) - `index.html.54' saved
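Keep in mind that wget stores the response body exactly as it arrives: when the server honors the Accept-Encoding: gzip header, the saved "HTML" file is actually a raw gzip stream, and you can inspect and unpack it with the standard gzip tools. A minimal local sketch of that check (using a fabricated stand-in file rather than a real download):

```shell
# wget saves a gzip-encoded body as-is, so the downloaded file is a gzip
# stream. Simulate such a saved file locally and inspect it.
printf '<html>hello</html>\n' | gzip > index.html.54.sim  # stand-in for the saved file
gzip -t index.html.54.sim && echo "valid gzip data"       # integrity check
zcat index.html.54.sim                                    # recover the original page
rm -f index.html.54.sim
```

Running this prints "valid gzip data" followed by the original `<html>hello</html>` line.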
Download the sample page without gzip:
$ wget http://www.cyberciti.biz/hardware/linux-find-and-recover-wasted-disk-space/
--2012-10-28 17:48:37--  http://www.cyberciti.biz/hardware/linux-find-and-recover-wasted-disk-space/
Resolving www.cyberciti.biz... 220.127.116.11
Connecting to www.cyberciti.biz|18.104.22.168|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `index.html.55'

    [ <=>                                   ] 45,729      73.7K/s   in 0.6s

2012-10-28 17:48:38 (73.7 KB/s) - `index.html.55' saved
From the above two outputs:
- The gzip-enabled page was downloaded in 0.02 seconds using the wget command.
- Without gzip, the page was downloaded in 0.6 seconds using the wget command.
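The byte counts in the two transcripts (12,657 bytes with gzip vs. 45,729 bytes without) make the saving easy to quantify with plain shell arithmetic:

```shell
# Back-of-the-envelope saving from the two transfers above:
# 12,657 bytes with gzip vs 45,729 bytes without.
gz=12657; plain=45729
echo "gzip sent $(( gz * 100 / plain ))% of the original size"
echo "bandwidth saved: $(( (plain - gz) * 100 / plain ))%"
```

For this page, gzip transferred roughly 27% of the original size, a saving of about 72% of the bytes on the wire.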
Use this option to test:
- HTTP server problems (testing and troubleshooting).
- CDN edge node speed.
- Your origin server speed.
- Web server gzip compatibility.
- Load balancers and reverse proxy servers.
As of wget v1.10, this option can be used to override headers otherwise generated automatically. In this example wget connects to www.cyberciti.biz, but sends 'beta.cyberciti.biz' in the Host header (i.e. it fetches the page served for beta.cyberciti.biz from the same server):
wget --header="Host: beta.cyberciti.biz" http://www.cyberciti.biz/
Finally, to save the headers sent by the HTTP server to a file, run:
$ wget --save-headers http://www.cyberciti.biz
$ vi index.html
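A file written with --save-headers contains the response headers first, then a blank line, then the body. If you only want the body back, you can strip the header block with awk. A sketch using a small fabricated sample file (so it runs without a network connection):

```shell
# --save-headers output layout: headers, blank line, body.
# Build a sample file and strip the header block with awk.
printf 'HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n<html>hi</html>\n' > saved.html
awk 'body { print } /^\r?$/ { body = 1 }' saved.html  # prints only the body
rm -f saved.html
```

The /^\r?$/ pattern also matches the CRLF-terminated blank line HTTP servers send, so this works on real saved pages as well as the sample.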