Wget: Resume Broken Download

Posted in Categories Debian Linux, Linux, Networking, Shell scripting, Suse Linux, Tip of the day. Last updated February 15, 2006.

GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies. Recently, I was downloading an Ubuntu Linux ISO (618 MB) file for testing purposes on my home PC. My Uninterruptible Power Supply (UPS) unit was not working. I started the download with the following wget command:

$ wget http://ftp.ussg.iu.edu/linux/ubuntu-releases/5.10/ubuntu-5.10-install-i386.iso
However, due to the power supply problem, my computer rebooted when the download was at 98%. After the reboot, I typed the same wget command at a shell prompt:
$ wget http://ftp.ussg.iu.edu/linux/ubuntu-releases/5.10/ubuntu-5.10-install-i386.iso
However, wget restarted the download of the ISO image from scratch. I expected wget to resume the partially downloaded ISO file.

wget resume download

After reading wget(1), I found the -c or --continue option to continue getting a partially downloaded file. This is useful when you want to finish a download started by a previous instance of wget, or by another program. The syntax is:

wget -c url
wget --continue url
wget --continue [options] url
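
If the connection keeps dropping, -c also combines well with a small shell loop that simply retries until the whole file has arrived. A minimal sketch (the URL is a placeholder; the loop works because wget exits non-zero on failure):

# keep resuming until wget exits successfully
until wget -c http://example.com/big.iso; do
    sleep 5   # brief pause before the next attempt
done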

So I decided to continue getting the partially downloaded ubuntu-5.10-install-i386.iso file using the following command:
$ wget -c http://ftp.ussg.iu.edu/linux/ubuntu-releases/5.10/ubuntu-5.10-install-i386.iso
OR
$ wget --continue http://ftp.ussg.iu.edu/linux/ubuntu-releases/5.10/ubuntu-5.10-install-i386.iso
Sample session:

[Animated gif 01: wget resumes a failed download]

If there is a file named ubuntu-5.10-install-i386.iso in the current directory, Wget will assume that it is the first portion of the remote file and will ask the server to continue the retrieval from an offset equal to the length of the local file. This saves both time and bandwidth.
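
Under the hood, -c makes wget send an HTTP Range request, and a server that supports resuming answers with "206 Partial Content" instead of "200 OK"; if the server does not support Range requests, the download has to start over from the beginning. If you have curl handy instead of wget, its documented -C - option does the same resume-from-local-size trick:

$ # resume from the current size of the local file; -O keeps the remote file name
$ curl -C - -O http://ftp.ussg.iu.edu/linux/ubuntu-releases/5.10/ubuntu-5.10-install-i386.iso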


Posted by: Vivek Gite

The author is the creator of nixCraft and a seasoned sysadmin and trainer for the Linux operating system and Unix shell scripting. He has worked with global clients and in various industries, including IT, education, defense and space research, and the nonprofit sector. Follow him on Twitter, Facebook, Google+.


75 comments

  1. Thanks a lot!! wget is my favorite command line tool; I never knew about this option! This time it happened that my internet connection suddenly got disconnected while downloading an .iso file.
    The -c option really saved my time! Thanks again 🙂

  2. Thank you. I kept losing my wireless signal today and after 3 attempts at downloading a 719M file (close to 70% on one attempt), I realized that there MUST be a way to resume. I googled “resume” and “wget” and here I am. Good post

  3. You have seriously saved me. I needed the last 4KB of a 1.9GB file – and I’m at a site with 7KB/s – 60KB/s download speeds. I kiss your virtual feet on this one.

  4. Does anyone have experience with resuming the same file from a different server URL? It seems that wget can't use a different URL to resume downloading a file (it shows "file not found", but it can actually start a new download).

    Thanks.

  5. SWEET! For me, this is HEAVEN. I am out in the middle of the forest, and all I get for internet is tethering my cell phone to my computer… at a whopping 3k-10k speed.
    (Good thing I have an unlimited data plan.) It can be hell just downloading a simple 20-30 MB file. At least now I do not have to start all the way at zero again. Huge thanks!

  6. Thanks a ton Vivek.
    I used your info to resume a broken HTTP (browser) download of an 11.04 ISO.
    The command line / URL I eventually figured out for this was:

    wget -c http://ubunturelease.hnsdc.com/11.04/ubuntu-11.04-desktop-i386.iso
    

    Sharing this just in case this contemporary URL info helps some more of us in a similar situation.
    Cheers
    TechNife, 20-Jun-2011.

  7. Hello, thanks for this post. I have used wget before. I would like to ask if anyone knows a way for wget to resume a partially downloaded file, but this time from a different server URL? I am using Torrific and it gives a different URL after a computer shutdown when resuming downloads.

  8. How I wish I had come across this post before I got screwed over by Chrome. I was downloading the image for Ubuntu 12.04, and it so happened that my internet connection went off; Chrome threw me the "INTERRUPTED DOWNLOAD" error and, worst of all, even the downloaded part was deleted… and I was almost getting to the end of it. I really thought there MUST be a way of resuming a download, however much Chrome had let me down when my connection went down. I Googled and landed on this great solution. To me this is really great, because this is the method I am going to use for downloads from now onward.

  9. It's been 3 years that I've used wget extensively, because you can resume any file,
    even when using another mirror link (well, even wrong links for another file).
    At the time I could even resume Megaupload (before it was shut down) and MediaFire
    (which still works).
    I wrote some scripts with wget that can rip anything from any site.
    Read the man page and check the cookies section, which enables downloading otherwise forbidden content (that was intended to be downloaded using the browser only),
    and changing wget's signature (claiming it's a browser and not a download tool). A sketch of these options follows below.
  10. Hehe, I know this is a quite old post, and it might seem funny these days with a 100MB connection, when 700MB takes just 40 seconds to download…

    Does anyone know a way to make wget behave a bit more aggressively if a download stalls? I need to download something from a really bad connection; it stops and waits a long, long time until it retries. If I cancel and resume the download, it goes quite well for another 10 minutes or so (see the sketch below).
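
    One documented combination that may help (a sketch, not a tested recipe): --read-timeout makes wget give up on a connection that has gone silent so the retry logic kicks in sooner, --tries=0 retries indefinitely, and --waitretry caps the pause between attempts; the URL is a placeholder:

    # abort a stalled connection after 10 idle seconds, then resume; retry forever
    $ wget -c --read-timeout=10 --tries=0 --waitretry=5 http://example.com/big.iso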
