curl Command Resume Broken Download

I know wget can resume a failed download. I am on macOS and do not want to install the wget command. How can I resume a failed download using the curl command on Linux or Unix-like systems?

You can continue getting a partially downloaded file using the curl command. You need to pass the -C or --continue-at <offset> option to resume a previous file transfer at the given offset.
Tutorial details
Difficulty: Easy
Root privileges: No
Requirements: curl
Time: 1m

curl resume broken download

The syntax is as follows to automatically find out where/how to resume the transfer using the curl command:

curl -C - url


curl -L -O -C - url


curl -L -o 'filename-here' -C - url

In this example, finish a download started by a previous instance of curl command:

curl -L -O -C - url

Animated gif 01: Resume a broken download

If there is a file named CentOS-6.5-x86_64-bin-DVD1.iso in the current directory, curl will assume that it is the first portion of the remote file, and will ask the server to continue the retrieval from an offset equal to the length of the local file. Thus, it will result in saving both time and bandwidth.
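The offset lookup that curl performs here can be sketched in plain shell (the filename below is assumed for illustration): curl measures the size of the local partial file and asks the server to send everything from that byte onward.

```shell
# Sketch (assumed filename): how "curl -C -" determines its resume offset.
# Create a 1000-byte partial file standing in for an interrupted download.
dd if=/dev/zero of=partial.iso bs=1 count=1000 2>/dev/null
# curl reads the local file's size and uses it as the resume offset...
offset=$(wc -c < partial.iso | tr -d ' ')
echo "resume offset: $offset"
# ...then asks the server for "Range: bytes=${offset}-", roughly as if you ran:
# curl -L -O -C - https://example.com/path/partial.iso
rm -f partial.iso
```

This is only a sketch of the logic; curl does the size check internally, so in practice you just run the single `curl -C -` command.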

Another example

You can continue a previous file transfer at the given offset. The given offset is the exact number of bytes that will be skipped, counting from the beginning of the source file before it is transferred to the destination. The syntax is:

curl -C <offset> url

In this example, retrieve the ifdata-welcome-0.png file using the following commands (replace url with the file's actual URL):

## Get the first 20001 bytes (the range 0-20000 is inclusive) ##
curl -o file.png --header "Range: bytes=0-20000" url
## Resume at offset 20001 and download the rest of the file ##
curl -o file.png -C 20001 url
## View an image file using a local image viewer ##
ls -lh file.png
open file.png

Sample outputs from ls command:

-rw-r--r--@ 1 vivek  wheel    30K Feb 28 23:24 file.png

Understanding options

  • -L – Follow the location if an HTTP 3xx status code is found (i.e. follow redirects).
  • -O – Write output to a local file named like the remote file. Only the file part of the remote URL is used; the path is cut off.
  • -C - – Continue/resume a previous file transfer, automatically finding the correct offset from the local file.
  • -C <offset> – Continue/resume a previous file transfer at the given offset.
  • -o 'filename' – Write/save output to 'filename' instead of stdout/screen.

5 comments

  • Rahul Jawale Apr 24, 2015 @ 14:39

    Awesome! Saved 4 GB worth of FUP data this way. :)

  • Suresh Sep 23, 2015 @ 16:05

    Very useful on unreliable BSNL connection.

  • carlo Sep 29, 2015 @ 11:43

    The perfect man

  • fred Feb 4, 2017 @ 21:29

    This is great. I was printing a dollar bill from the internet before the feds stopped me. Will this help resuming that?

  • John W. Feb 15, 2017 @ 16:57

    Great tip. Was downloading itunes and both on my mac and a windows 10 laptop kept getting a network timeout — for 3 days. Just copied the url from my web browser and pasted into my mac terminal along with the switches, noted above. Thanks!
