curl Command Resume Broken Download

I know wget can resume a failed download. I am on Mac OS X and do not want to install the wget command. How can I resume a failed download using the curl command on Linux or Unix-like systems?

You can continue getting a partially downloaded file using the curl command. Pass the -C or --continue-at <offset> option to resume a previous file transfer at the given offset.
Tutorial details
Difficulty level: Easy
Root privileges: No
Requirements: curl
Est. reading time: 2 minutes

curl resume broken download

The syntax is as follows to automatically find out where and how to resume the transfer using the curl command:

curl -C - url


curl -L -O -C - url


curl -L -o 'filename-here' -C - url
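On flaky connections, the auto-resume form can be wrapped in a retry loop. The following is a minimal sketch, not a definitive recipe: the URL is a placeholder and the 5-second delay is an arbitrary choice.

```shell
#!/bin/sh
# Placeholder URL - substitute your real download link.
url="https://example.com/big-file.iso"

# Keep retrying until curl exits 0. Each attempt passes -C - so curl
# resumes from whatever bytes earlier attempts already saved to disk.
until curl -L -O -C - "$url"; do
    echo "Transfer interrupted; retrying in 5 seconds..." >&2
    sleep 5
done
```

Because -C - re-inspects the local file on every attempt, the loop never re-downloads bytes it already has.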

In this example, finish a download started by a previous instance of the curl command, where url is the same URL used by the earlier run:

curl -L -O -C - url

Animated gif 01: Resume a broken download

If there is a file named CentOS-6.5-x86_64-bin-DVD1.iso in the current directory, curl will assume that it is the first portion of the remote file, and will ask the server to continue the retrieval from an offset equal to the length of the local file. Thus, it will result in saving both time and bandwidth.

Another example

You can continue a previous file transfer at the given offset. The given offset is the exact number of bytes that will be skipped, counting from the beginning of the source file before it is transferred to the destination. The syntax is:

curl -C offset url
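If you supply the offset yourself instead of using -C -, it is simply the size in bytes of the partial file already on disk. A local demonstration of computing it (the curl line is commented out and uses a placeholder URL):

```shell
#!/bin/sh
# Stand-in for a partially downloaded file (5 bytes).
printf 'hello' > partial.bin

# The resume offset is the byte count of the partial file.
offset=$(( $(wc -c < partial.bin) ))
echo "offset is $offset bytes"

# Resume the transfer from that exact byte (placeholder URL):
# curl -o partial.bin -C "$offset" "https://example.com/file.bin"
```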

In this example, retrieve the ifdata-welcome-0.png file using the following commands, where url is the file's download URL:

## Get the first 20001 bytes (bytes 0 through 20000) ##
curl -o file.png --header "Range: bytes=0-20000" url
## Resume at offset 20001 and download the rest of the file ##
curl -o file.png -C 20001 url
## Check the file size and view the image with a local viewer ##
ls -lh file.png
open file.png

Sample outputs from ls command:

-rw-r--r--@ 1 vivek  wheel    30K Feb 28 23:24 file.png
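The two-step transfer above can be simulated locally to check that the byte ranges line up, with head and tail standing in for the ranged and resumed requests (no network needed; the file names are illustrative):

```shell
#!/bin/sh
# A 10-byte file stands in for the remote ifdata-welcome-0.png.
printf 'ABCDEFGHIJ' > source.bin

# "First request": grab bytes 0-3 (4 bytes), like Range: bytes=0-3.
head -c 4 source.bin > file.bin

# "Resume": skip the first 4 bytes and append the rest,
# as curl -C 4 would.
tail -c +5 source.bin >> file.bin

# The stitched file must be identical to the original.
cmp source.bin file.bin && echo "files match"
```

Note the off-by-one convention: a range of bytes 0-3 delivers 4 bytes, so the resume offset is 4, just as bytes 0-20000 deliver 20001 bytes and the resume offset is 20001.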

Understanding options

  • -L – Follow redirects if an HTTP 3xx status code is returned, e.g. when the URL redirects elsewhere.
  • -O – Write output to a local file named like the remote file. Only the file part of the remote URL is used; the path is cut off.
  • -C - – Continue/resume a previous file transfer, letting curl figure out the offset automatically.
  • -C <offset> (long form: --continue-at <offset>) – Continue/resume a previous file transfer at the given offset.
  • -o 'filename' – Write/save output to 'filename' instead of stdout/screen.


nixCraft Tux Pixel Penguin
Hi! 🤠
I'm Vivek Gite, and I write about Linux, macOS, Unix, IT, programming, infosec, and open source. Subscribe to my RSS feed or email newsletter for updates.

5 comments
  • Rahul Jawale Apr 24, 2015 @ 14:39

    Awesome! Saved 4 GB worth of FUP data this way. :)

  • Suresh Sep 23, 2015 @ 16:05

    Very useful on unreliable BSNL connection.

  • carlo Sep 29, 2015 @ 11:43

    The perfect man

  • fred Feb 4, 2017 @ 21:29

    This is great. I was printing a dollar bill from the internet before the feds stopped me. Will this help resuming that?

  • John W. Feb 15, 2017 @ 16:57

    Great tip. Was downloading itunes and both on my mac and a windows 10 laptop kept getting a network timeout — for 3 days. Just copied the url from my web browser and pasted into my mac terminal along with the switches, noted above. Thanks!
