Download a file with curl on Linux / Unix command line

I am a new macOS Unix user. I am writing a small bash shell script. How do I download files straight from the command-line interface using curl? How can I download files with cURL on Linux or Unix-like systems?

Introduction: cURL is both a command-line utility and a library. One can use it to download or transfer data/files using many different protocols such as HTTP, HTTPS, FTP, SFTP and more. The curl command-line utility lets you fetch a given URL or file from the bash shell. This page explains how to download files with the curl command on Linux, macOS, *BSD and other Unix-like operating systems.


How to download a file with curl command

The basic syntax:

  1. Grab a file with curl, run: curl https://your-domain/file.pdf
  2. Get files using the ftp or sftp protocol: curl ftp://ftp-your-domain-name/file.tar.gz
  3. You can set the output file name while downloading a file with curl, execute: curl -o file.pdf https://your-domain-name/long-file-name.pdf
  4. Follow a 301-redirected file while downloading files with curl, run: curl -L -o file.tgz http://www.cyberciti.biz/long.file.name.tgz

Let us see some examples and usage of curl to download and upload files on Linux or Unix-like systems. The basic command syntax is:
curl url --output filename
curl https://url -o output.file.name

Let us try to download a file from https://www.cyberciti.biz/files/sticker/sticker_book.pdf and save it as output.pdf
curl https://www.cyberciti.biz/files/sticker/sticker_book.pdf -o output.pdf
OR
curl https://www.cyberciti.biz/files/sticker/sticker_book.pdf --output output.pdf
Download a file with curl command
The -o or --output option allows you to give the downloaded file a different name. If you do not provide an output file name, curl will write the file's contents to the screen (standard output).
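
Because curl writes to standard output when no output name is given, you could also capture the download with the shell's own redirection. A minimal sketch, equivalent to the -o example above:
## redirect curl's standard output to a local file ##
curl https://www.cyberciti.biz/files/sticker/sticker_book.pdf > output.pdf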

How to download a single file without giving an output name

You can save the output file as is, i.e. write the output to a local file named like the remote file we get. For example, sticker_book.pdf is the file name for the remote URL https://www.cyberciti.biz/files/sticker/sticker_book.pdf. One can save it as sticker_book.pdf directly, without specifying the -o or --output option, by passing the -O (capital letter O) option:
curl -O https://www.cyberciti.biz/files/sticker/sticker_book.pdf

Downloading files with curl in a single shot

How to deal with HTTP 301 redirected file

The remote HTTP server might send a different location status code when downloading files. For example, HTTP URLs are often redirected to HTTPS URLs with an HTTP/301 status code. Just pass the -L option to follow the 301 (3xx) redirects and get the final file on your system:
curl -L -O http://www.cyberciti.biz/files/sticker/sticker_book.pdf
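
If you only want to confirm that a redirect is happening before you download anything, one quick check (assuming the server actually returns a Location header) is to fetch the headers and look for it:
## a small sketch: print the redirect target announced by the server, if any ##
curl -sI http://www.cyberciti.biz/files/sticker/sticker_book.pdf | grep -i '^location'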

How to download multiple files using curl

Try:
curl -O url1 -O url2
curl -O https://www.cyberciti.biz/files/adduser.txt \
     -O https://www.cyberciti.biz/files/test-lwp.pl.txt

One can use the bash for loop too:

## define a bash shell variable ##
urls="https://www.cyberciti.biz/files/adduser.txt https://www.cyberciti.biz/files/test-lwp.pl.txt"
 
## let us grab it ##
for u in $urls
do
   curl -O "$u"
done
How to download a file using curl and bash for loop
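
If you keep your URLs in a text file rather than a shell variable, a while loop works just as well. A sketch, assuming a file named urls.txt (the name is only an example) with one URL per line:
## read URLs from urls.txt, one per line, and grab each file ##
while read -r u
do
   curl -O "$u"
done < urls.txt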

How to download a password protected file with curl

Try any one of the following syntaxes:
curl ftp://username:passwd@ftp1.cyberciti.biz:21/path/to/backup.tar.gz
curl --ftp-ssl -u UserName:PassWord ftp://ftp1.cyberciti.biz:21/backups/07/07/2012/mysql.blog.sql.tar.gz
curl https://username:passwd@server1.cyberciti.biz/file/path/data.tar.gz
curl -u Username:Password https://server1.cyberciti.biz/file/path/data.tar.gz
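
To avoid typing the password on the command line, where it may show up in your shell history or the process list, curl can also read credentials from a ~/.netrc file when you pass the -n (--netrc) option. A sketch, reusing the example hostname and credentials from above:
## ~/.netrc entry (keep the file private, e.g. chmod 600 ~/.netrc) ##
machine ftp1.cyberciti.biz
login username
password passwd

## then download without exposing the password on the command line ##
curl -n ftp://ftp1.cyberciti.biz/path/to/backup.tar.gz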

How to download a file using a proxy server

Again, the syntax is as follows:
curl -x proxy-server-ip:PORT -O url
curl -x 'http://vivek:YourPasswordHere@10.12.249.194:3128' -v -O https://dl.cyberciti.biz/pdfdownloads/b8bf71be9da19d3feeee27a0a6960cb3/569b7f08/cms/631.pdf

How to use curl command with proxy username/password
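
curl also honors the standard proxy environment variables, so instead of repeating the -x option on every command you can export the proxy once for the current shell session. A sketch using the same example proxy as above:
## export the proxy for the current shell session ##
export https_proxy='http://vivek:YourPasswordHere@10.12.249.194:3128'
curl -v -O https://dl.cyberciti.biz/pdfdownloads/b8bf71be9da19d3feeee27a0a6960cb3/569b7f08/cms/631.pdf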

Examples – Downloading files with curl

The curl command can provide useful information, especially HTTP headers, which one can use for debugging server issues. Let us see some examples of curl commands. Pass the -v option to view the complete request sent and the response received from the web server.
curl -v url
curl -o output.pdf -v https://www.cyberciti.biz/files/sticker/sticker_book.pdf

Getting HTTP header information without downloading files

Another useful option is fetching HTTP headers only. All HTTP servers support the HEAD method, which curl uses to retrieve nothing but the headers of a document. This is handy when you want to view the HTTP response headers without downloading the data or the actual files:
curl -I url
curl -I https://www.cyberciti.biz/files/sticker/sticker_book.pdf -o output.pdf

Getting header information for a given URL
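
When all you need is the HTTP status code, for example in a shell script health check, you can combine -o /dev/null with the -w (write-out) option. A small sketch:
## print only the HTTP status code and discard the body ##
curl -s -o /dev/null -w '%{http_code}\n' https://www.cyberciti.biz/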

How do I skip SSL verification when using curl?

If the remote server has a self-signed certificate, you may want to skip the SSL checks. Therefore, pass the -k option as follows:
curl -k url
curl -k https://www.cyberciti.biz/
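
Keep in mind that -k disables certificate verification entirely. A safer alternative, if you have a copy of the self-signed certificate, is to tell curl to trust that particular certificate with the --cacert option. A sketch, where my-server.crt stands in for your certificate file:
## verify against a specific certificate instead of skipping the checks ##
curl --cacert my-server.crt https://your-server-name/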

Rate limiting download/upload speed

You can specify the maximum transfer rate you want curl to use for both downloading and uploading files. This feature is handy if you have limited Internet bandwidth and you would like your transfer not to use your entire bandwidth. The given speed is measured in bytes/second, unless a suffix is appended. Appending ‘k’ or ‘K’ will count the number as kilobytes, ‘m’ or ‘M’ makes it megabytes, while ‘g’ or ‘G’ makes it gigabytes. For example: 200K, 3m and 1G:
curl --limit-rate {speed} url
curl --limit-rate 200K https://www.cyberciti.biz/
curl --limit-rate 3m https://www.cyberciti.biz/
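
A rate-limited download can take a while, so it pairs well with curl's -C - option, which resumes a partially downloaded file instead of starting over. A sketch:
## resume an interrupted download and keep the transfer under 200 kilobytes/second ##
curl -C - --limit-rate 200K -O https://www.cyberciti.biz/files/sticker/sticker_book.pdf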

Setting up user agent

Some web application firewalls will block the default curl user agent while downloading files. To avoid such problems, pass the -A option, which allows you to set the user agent.
curl -A 'user agent name' url
curl -A 'Mozilla/5.0 (X11; Fedora; Linux x86_64; rv:66.0) Gecko/20100101 Firefox/66.0' https://google.com/

Upload files with CURL

The syntax is as follows to upload files:
curl -F "var=@path/to/local/file.pdf" https://url/upload.php
For example, to upload the file ~/Pictures/test.png to the server https://127.0.0.1/app/upload.php, which processes the file input with a form parameter named img_file, run:
curl -F "img_file=@~/Pictures/test.png" https://127.0.0.1/app/upload.php
One can upload multiple files as follows:
curl -F "img_file1=@~/Pictures/test-1.png" \
-F "img_file2=@~/Pictures/test-2.png" \
https://127.0.0.1/app/upload-multi.php
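
Besides HTML form style uploads with the -F option, curl can also upload a file as-is with the -T (--upload-file) option, which performs an HTTP PUT or a plain FTP upload depending on the URL. A sketch, reusing the example FTP server from above (the target directory is only an illustration):
## upload backup.tar.gz to the remote FTP server ##
curl -T backup.tar.gz ftp://username:passwd@ftp1.cyberciti.biz/backups/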

Conclusion

Like most Linux or Unix CLI utilities, you can learn much more about the curl command by reading its man page (type man curl at the shell prompt).


3 comments
  • Stick Dec 23, 2012 @ 13:41

    I always set output file with
    curl -o linux.tar.gz https://cdn.kernel.org/pub/linux/kernel/v3.x/linux-3.0.13.tar.gz

  • PJMon Feb 3, 2013 @ 12:13

    This is nice. Here is one more hint: if you use a self-signed SSL cert, I can get it verified with the following:
    curl --cacert my-ssl.crt -O https://my-ip/my-file.tgz

    Cheers mate

  • Rajesh May 1, 2017 @ 3:12

    So here is the thing. I am downloading certain mp4 files and the remote http server is limiting the connection. So after a bit of reading, and thanks to your for loop example, I can get around it by setting a connection timeout value. I found this in the man page:

    Maximum time in seconds that you allow curl’s connection to take. This only limits the connection phase, so if curl connects within the given period it will continue – if not it will exit

    So I did something:

    urls="url1 url2 ... url100"
    for u in $urls
    do
      curl --connect-timeout 7 -O "$u"
    done
    

    I hope it might help someone.
