Linux ultra fast command line download accelerator

Usually I use wget, which is the ultimate command line downloader. However, wget could not accelerate my download speed. I was downloading a 1.4GB file at around 800KB/s (this box is hooked to a 10 Mbps uplink port), but the remote server was throttling me.

Therefore, I decided to set wget aside and downloaded axel – a light download accelerator for the Linux command line.

How does Axel work?

Axel does the same thing any other accelerator does: it opens more than one HTTP/FTP connection per download and each connection transfers its own, separate, part of the file. It may sound weird, but it works very well in practice. For example, some FTP sites limit the speed of each connection, therefore opening more than one connection at a time multiplies the allowable bandwidth. Be forewarned that some FTP operators don’t like it when you do this. It is better to open the additional connections on several servers and download from all of them simultaneously. Axel supports this feature too. You may either specify a list of mirrors to use or tell the program to search for mirrors.
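The splitting described above can be sketched with a little shell arithmetic. This is an illustration only: the even-split chunking here is my assumption for clarity, not axel's exact internals. The idea is to divide the file size by the number of connections and request each slice as a separate byte range.

```shell
# Illustration: compute byte ranges for a 4-connection download
# of a 41288067-byte file (the size used in the test below).
# NOTE: the even-split scheme is an assumption; axel's own
# internals may divide the file differently.
size=41288067
n=4
chunk=$(( (size + n - 1) / n ))   # ceiling division
for i in 0 1 2 3; do
  start=$(( i * chunk ))
  end=$(( start + chunk - 1 ))
  [ "$end" -ge "$size" ] && end=$(( size - 1 ))
  echo "connection $i: bytes=$start-$end"
done
```

Each range would then be fetched in parallel (for HTTP, via a `Range: bytes=start-end` request header) and written at the right offset of the output file.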

This is a perfect tool for downloading large files over a remote ssh session.

Install axel

If you are using Debian, type the following command to install axel:
# apt-get install axel

Or you can download axel from the official website:
$ wget

Untar axel:
$ tar -zxvf axel-1.0b.tar.gz

Configure and compile axel:
$ ./configure
$ make

Install axel:
# make install

Alternatively, just upload a newly built axel binary to the remote Linux server using scp. Usually I do not install the gcc C/C++ compiler collection on any of my production web/ftp/MySQL servers for security reasons.

How do I use axel?

Just type the command as follows:
$ axel

Limit speed
You can also specify a speed (in bytes per second) for axel so that it will not eat up all your bandwidth. For example, the following will try to keep the average speed around 5242880 bytes per second (5120 KB/s):
$ axel -s 5242880
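Since the -s value is plain bytes per second, converting from a KB/s target is a single multiplication. A quick sanity check in the shell (using the 5120 KB/s figure from the example above):

```shell
# Convert a 5120 KB/s target into the bytes-per-second
# value that axel's -s option expects:
kbps=5120
bps=$(( kbps * 1024 ))
echo "$bps"   # 5242880
```

You would then pass that number to -s, followed by the URL to download.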

Limit the number of connections
You can also specify the number of connections you want to open. For example, open 3 connections for downloading:
$ axel -n 3 -s 5242880

But how fast is axel?

Here is a sample test that demonstrates how fast axel is:

$ wget


           => `linux-'
Connecting to||:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 41,288,067 (39M) [application/x-bzip2]

100%[================================================================================>] 41,288,067     2.33M/s    ETA 00:00

12:10:48 (2.31 MB/s) - `linux-' saved [41288067/41288067]

$ axel


Initializing download:
File size: 41288067 bytes
Opening output file linux-
Starting download

[  0%]  .......... .......... .......... .......... ..........  [ 247.1KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 408.3KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 566.3KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 707.2KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 836.5KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 975.9KB/s]
[  0%]  .......... .......... .......... .......... ..........  [1079.9KB/s]
[  0%]  .......... .......... .......... .......... ..........  [1210.0KB/s]
[  0%]  .......... .......... .......... .......... ..........  [1303.1KB/s]
[  1%]  .......... .......... .......... .......... ..........  [1422.1KB/s]
[  1%]  .......... .......... .......... .......... ..........  [1508.0KB/s]
[  1%]  .......... .......... .......... .......... ..........  [1629.2KB/s]
[ 99%]  .......... .......... .......... .......... ..........  [8710.2KB/s]
[ 99%]  .......... .......... .......... .......... ..........  [8680.7KB/s]
[100%]  .......... ..........

Downloaded 39.4 megabytes in 4 seconds. (8681.65 KB/s)

As you can see, axel downloaded the same file in 4 seconds. Another great thing is its small binary size: I can put axel on a boot disk and use it in place of wget.

prozilla – another good program with GUI frontend

One drawback of axel is that you cannot specify an ftp username and password. You can use the prozilla program instead, which also makes multiple connections and downloads a file in multiple parts simultaneously, thus downloading the file faster than a single connection would.

FTP passwords can be specified with the URL, or can be obtained automatically from ~/.netrc if it exists.
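A minimal ~/.netrc entry looks like the following. The hostname and credentials here are placeholders, not values from this article; also note that the file should be readable only by you (chmod 600 ~/.netrc), as tools such as ftp may refuse to use it otherwise.

```
machine ftp.example.com
login myuser
password s3cret
```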

Install prozilla

# yum install prozilla

Or download prozilla from the official website.

To use prozilla just type the following command (command line version):
$ proz


62 comments… add one
  • santi.kq Sep 2, 2006 @ 22:20

    another good download accelerator for linux is not a standalone program, but a firefox extension: downthemall. it works great, and the acceleration rate is impressive.

    • kimmieshallie Dec 21, 2010 @ 12:42

      Download downthemall is a very good download extention, I have a slow internet connection but it’s a real helper when downloading things. Also flashgot helps grabbing video files while browsing firefox

  • Michael Biddulph Sep 3, 2006 @ 0:51

    Bytes per sec….ffs…who thought that one up??

  • ajs Sep 3, 2006 @ 12:08

    can you repeat the test with a different download file, and
    do the axel download first and the wget second? it seems likely
    that the file was cached in a transparent web proxy which could
    account for the speed difference.

  • 🐧 nixCraft Sep 3, 2006 @ 12:27


    Nope, I am directly connected and not using a transparent proxy appliance or server. If I run wget axel wget axel I get the same result.

    Appreciate your post.

  • Mike Schroll Sep 3, 2006 @ 13:45

    I’ve always used lftp for this purpose — and in my two informal tests on two different boxes — it seems to be superior to axel:

    Downloaded 39.4 megabytes in 15 seconds. (2673.72 KB/s)

    lftp :~> pget
    41292428 bytes transferred in 13 seconds (3.00M/s)

    Downloaded 39.4 megabytes in 16 seconds. (2376.57 KB/s)

    lftp :~> pget
    41295932 bytes transferred in 13 seconds (3.06M/s)
    lftp :~>

  • rembrandt Sep 3, 2006 @ 14:22

    you're missing the fact that your ISP probably has a proxy server which then has a copy of the file in its cache, so you don't prove that anything gets sped up.

  • nalley Sep 3, 2006 @ 14:41

    A transparent proxy is just that – transparent. You wouldn’t know if you were using one or not. That is why ajs recommendation of repeating the test with a different file and using axel first makes sense. Your ISP (or your ISP’s ISP) might be using a transparent proxy. Cisco routers use WCCP (Web Cache Control Protocol) to implement transparent caching (there are lots of other ways to do it as well).

    If transparent caching was in use, “wget” would have populated the cache and “axel” would have retrieved from the cache.

    Also, I agree that connecting to multiple mirrors and pulling a “piece” of the file from each could speed things up – but connecting multiple times to the same server to avoid “bandwidth limits per connection” is silly. A smart admin would limit bandwidth per IP, not per connection.

  • 🐧 nixCraft Sep 3, 2006 @ 15:11

    Nalley, I am aware of both transparent proxies and WCCP. This entire article is not about your home dsl or adsl connection. My box is hooked to a 10Mbps uplink, and a few other boxes I manage are hooked to 1000Mbps.

    This is a colocated box and my dedicated hosting service provider does not use caching. In fact, they sell content caching as a separate product (which is quite expensive). And yes, I did the test as I said earlier, in the following order:

    And the results are almost the same, only 1 second different.

  • JoJo Sep 3, 2006 @ 16:38

    You could at least put a caveat about the increased resource usage on servers that using an accelerator causes. If someone is providing something to you for free, don’t be an asshat and just use wget/curl.

  • Wilmer Sep 3, 2006 @ 16:52

    Hmm, are you sure Axel can’t download from FTP sites that require a password? I wrote the program and I’d be very embarrassed if that functionality were really missing…

    It should be possible to just give it a URL like ftp://username:password@hostname/path/etc.

    It’s interesting to see an article about a program I haven’t maintained for four years already. I don’t use it myself anymore, but it’s nice that other people are still happy with it. 🙂

  • 🐧 nixCraft Sep 3, 2006 @ 17:12


    Thanks for comment.

    Sorry to say, but whenever I use the url, axel bumps me with a Segmentation fault error.

    That is why I am using proz. If you get time, try to fix…

    Axel is extremely tiny and part of my rescue disk as well.

  • dan Sep 3, 2006 @ 17:57


  • Son Nguyen Sep 3, 2006 @ 18:27

    Nice! FlashGet is a similar tool for Windows, but this is definitely a useful tool under Linux. Thanks for sharing.

  • jojomonk Sep 3, 2006 @ 19:23

    i got same results w/ both wget and axel – sticking to wget. Not a fan of the crazy status printouts done by axel.

  • Juan García Sep 3, 2006 @ 20:37

    In my opinion lftp is much better because it supports both http and ftp and has tons of options and commands. lftp just works!

    • Kenneth Endfinger Dec 5, 2011 @ 20:07

      lftp is a lot faster for me. I love it!

  • Keith Sep 6, 2006 @ 21:26

    From all those posts above, I think I will still keep to using wget.

  • Pascal Bleser Sep 7, 2006 @ 10:27

    Another fast command-line download accelerator worth looking into is aria2.
    It also supports bittorrent and has a very, very low memory footprint (even less than rtorrent).

    For people using SUSE Linux, I maintain aria2 RPMs here.

  • 🐧 nixCraft Sep 7, 2006 @ 12:14

    Pascal Bleser

    Thanks for pointing out aria2!

  • bugeyedmonster Sep 8, 2006 @ 1:51

    Also works great on the Mac using Fink! A simple /sw/bin/apt-get axel had me up and running in no time. Thanks for the tip 😉

  • Rob Swift Oct 14, 2006 @ 0:05

    I noticed you used url when in fact it would be correct to use the @ symbol between the password and server like this. url

  • daniel Jan 17, 2007 @ 13:43

    well, something about download accelerators.
    1st, they DON'T always really accelerate, BUT they ALWAYS try to use your MAX bandwidth.
    2nd, they can accelerate your download because they make more than one single connection. When you connect just once, your connection may slow down, and it's slower for one connection to speed up when the traffic is up again; so if you have 4 connections it's faster to reuse the bandwidth.
    3rd, their resume support works better… 🙂

    I love download accelerators because I'm always doing some kind of download… and moving to windows just to do that is boring. So I developed my own download accelerator based on axel's code; the new download accelerator for linux, called doKa, has been released. It is made for KDE (I love it) and is working pretty well, with some problems that you can find on the project's page…

    If you have interest check out…

  • 🐧 nixCraft Jan 17, 2007 @ 22:23


    Thanks for pointing out your port. I will check it out later on. 🙂

  • Dhruva K Jan 27, 2007 @ 19:36

    do any of the above programs tune the TCP stack to use maybe a greater window size? And there is that scale option too, if I recall right, to increase the granularity of the specified window….could be in gigs now i think… Wonder if doing that might help…?I do think there are some ‘knobs’ given by the stack to adjust these parameters….

  • Dhruva K Jan 27, 2007 @ 19:42

    And just to add to that ‘smart admin will limit BW per IP not connection’, smarter download accelerators could use multiple IP’s assigned to the same NIC and vary the connection at L3 instead of L4.

  • Matt Sep 14, 2007 @ 13:54

    Can you not specify a username pass via the URL?

  • henry Oct 10, 2007 @ 11:06

    how can I adjust or increase the size to be downloaded in my smoothwall linux ?

  • Rabiul Hassan Khan Dec 10, 2007 @ 2:26

    All Praises to Allah.

    I have found lftp is the only right tool. I tried prozilla (proz and prozgui), axel, aria2c and these are good but don’t have resume support. Prozilla has resume support but you have to quit the program mentioning your intention to resume later (for proz press Ctrl + R, and for prozgui click on Abort, resume later). If you press the computer’s reset button in the middle of a download and try to resume the broken download, it can’t be done with prozilla. Prozgui will go on downloading the rest but at the end it completes the download with wrong size.

    But with lftp you can download, accelerate the download with multiple connections, and resume a broken download later. I have tested with version 3.5.2; earlier versions may not work to resume downloads with pget (pget is needed for acceleration/opening more connections). So, get version 3.5.2 or later. Some lftp commands are as follows:

    Get a file:
    lftp -e 'pget http://ftp.file.tgz'

    Continue broken download:
    lftp -e 'pget -c http://ftp.file.tgz'

    Get file with 7 connection:
    lftp -e 'pget -n 7 -c http://ftp.file.tgz'

    View setting:
    lftp -c set -a

    * lftp shell:
    Enter to lftp shell by entering command lftp and get a file by:
    pget http://ftp.file.tgz

    view setting:
    set -a

    change setting for saving download status temporarily (only for the session; reverts to the default after exit):
    set pget:save-status 5s

    change setting for the number of download connections temporarily (only for the session; reverts to the default after exit):
    set pget:default-n 7

    * To change the setting permanently edit /etc/lftp.conf
    add line
    set pget:save-status 5s
    set pget:default-n 7

    The default pget save-status interval is 10s, and the default connection number is 5.

  • Boovarahan S Apr 5, 2008 @ 10:41

    What about d4x ? I have used aria / aria2c , axel , d4x , downthemall in firefox and I find aria2c highly fast and helpful.

  • bubo Aug 14, 2008 @ 4:37

    i use axel since half a year or so and i’m really quite happy with it. it does not spawn too many connections (you know i don’t wanna fall on sysadmin’s nerves) and never made a mistake until now. very reliable. i use it mostly to download iso images. the md5’s are always alright. i might give aria2c a try…

  • vinu Aug 15, 2008 @ 13:46

    i'm new to using prozilla. i followed the two above-mentioned steps to install it,

    # yum install prozilla
    $ proz

    but i thought it was a GUI. and even then, to download any file, i need to know how to go about it.

    will the following work

    if it does where are the files being saved???
    please reply.

    • weman Mar 25, 2011 @ 16:32

      this doesnt make sense

    • Randy Oct 17, 2013 @ 9:05

      it is on home folder

  • azhen Nov 8, 2008 @ 8:46

    nice work

  • Markidi Nov 23, 2008 @ 17:04

    how can we use axel to download file from Rapidshare (how to put rapidshare username and password in axel?), any idea?

  • emiraga Apr 7, 2009 @ 12:47

    Here is my axel.2-3 patch for cookie loading from Firefox 3

    It is extremely buggy. 🙂 I hope you enjoy it.

  • AR May 4, 2009 @ 23:24

    I made scripts to download from ( and rapidshare (
    What I want to mention is that prozilla (2.0.4) really does not work in my case! It can download neither from youtube nor from rapidshare! With rapidshare it fails to keep the connection alive. I think prozilla is not well implemented and has many bugs.
    Anyway, I’ll look at the code of prozilla when I have time.

  • Soliman Alqubati May 18, 2009 @ 19:37

    I want the source code for some Linux commands.

  • Olo Jun 12, 2009 @ 13:21

    Axel was working well but it has a 2GB file limit. I can see that aria2 can resume downloads, so that is my choice for now. The speeds are comparable.

  • Aug 24, 2009 @ 20:05

    No matter what tools we use, we can’t beat the ISP load balance and increase our download speeds.

  • Jimmy Dean Sep 7, 2009 @ 3:34

    I was just downloading a file over FTP through wget… their server slowed me down to 300KB/s, I used this program and set the connections to 5 and I am getting 1670.6KB/s works for me…

  • Rupesh Mishra Oct 21, 2009 @ 14:50

    Try the latest axel-2.4; the 2GB file limit is no more.

  • gaous afrizal Dec 6, 2009 @ 5:25

    hahaha,,, i think it's just a matter of taste 😀
    some people think aria2 is better, or lftp, or axel. It's your choice. remember, networks have bursty traffic 😀

  • Xster Dec 7, 2009 @ 1:31

    Do agree that Axel’s printouts are a bit wild. But its in a visor on mac so doesn’t bother me

  • ayip.eiger Mar 23, 2010 @ 9:36

    Where do i find axel downloaded files?

    • Randy Oct 17, 2013 @ 9:05

      home folder

  • ind daz Mar 30, 2010 @ 3:58

    I'm in this type of situation, I don't know how to get it working on Fedora 9. looking for ur help,, thnxzz!!!

    The strip option is enabled. This should not be a problem usually, but on some
    systems it breaks stuff.

    Configuration done:
    Internationalization disabled.
    Debugging disabled.
    Binary stripping enabled.
    [root@localhost axel-1.0b]# make install
    mkdir -p /usr/local/bin/
    cp axel /usr/local/bin/axel
    cp: cannot stat `axel’: No such file or directory
    make: *** [install-bin] Error 1

    • adibaskom Jul 2, 2011 @ 9:47

      do this
      #make install

  • Roman Apr 30, 2010 @ 12:36

    Hi ind daz.

    i had the same problem. try to type “apt-get -f install”. after that you can install it.

  • Long Jul 13, 2010 @ 12:10

    A nice one to try for linux would be SKDownloader. It is a download accelerator with an excellent gui with themes support (not sure how many would be using it though 🙂 ). It is fully free and, unlike other download accelerators, it allows you to choose the number of simultaneous connections you can make; it is not limited to 3 or 4, which is the case with most others. Their link is

  • Paul Ward Jul 25, 2010 @ 22:54

    Just tried wget on a file from a friend's box in the UK to my server in NZ and was getting a total download time of 8 hours +
    Switched to my windows server and used firefox with downthemall and was getting 4 streams down and a total download time estimated around 5 hours.
    Installed axel and tried it, however I was getting proxy issues and being asked for my domain and user + password; this is a pain as I had my http_proxy already exported but it did not use it.
    Then tried aria2 and, amazingly, at this time it is saying 2 hours 30 mins. That blows all the above away, and my windows firefox to boot.
    Yet to see if the file md5sums match and if the download time is real and not an estimate, but it's looking good for now, especially as from the remote box I am lucky to get anywhere near 50k, usually around 30k 🙂
    [ SIZE:19.7MiB/538.8MiB(3%) CN:5 SPD:58.2KiBs ETA:2h31m59s]

  • Raam Jan 21, 2011 @ 23:07

    Thanks a lot. I also thought wget was the ultimate downloader, but axel is so much faster out of the box. This really improved my life 😀

  • Neigyl Noval May 25, 2011 @ 16:13

    Hi. I'm using axel to download a 4 GB piece of software. When it had downloaded 98%, it suddenly gave a “write error”. I tried again, but it still gives the write error. It says:

    File size: 4314152960 bytes
    Opening output file Xilinx_ISE_DS_Lin_13.1_O.40d.1.1.tar
    State file found: 4251837514 bytes downloaded, 62315446 to go.
    Starting download

    ,,,,,,,,,, ,,,,,,,,,, ,,,,,,,,,, ,,,,,….. …..
    Write error!

    Downloaded 10.7 kilobytes in 0 seconds. (24.90 KB/s)

    I still have more than 30 GB space and the partition is ext3.

    How to fix this? Thanks.

  • Arvind Oct 13, 2011 @ 2:33

    I tried axel — for me it works thrice as fast as wget. (I tried wget and axel on different huge files and measured the speed difference so the transparent proxy issue is not there.)

    This is ideal for someone who wants to download a huge file onto some remote Unix computer in the cloud. (1) Cannot fire up mozilla on the remote computer even using ssh -X (painfully slow). (2) Cannot download the huge file onto a local laptop and then re-upload it to the remote computer in the cloud (idiotic).

    Therefore — go go go axel ! Love it!

    I haven’t tried aria and the other softwares mentioned here but they may well be just as good.

  • Dinesh Oct 30, 2011 @ 15:14

    That's pretty amazing. It can download files at a speed more than my max download limit.

  • rafi Nov 27, 2011 @ 16:19

    Sorry i am unable to install axel.Plz can any one help me?

  • abdelouahab Dec 8, 2011 @ 8:22

    @rafi : use apt-get install axel
    don't try to use the GUI interface, it doesn't work here (ubuntu 11.10); use the command line:
    (look here)
    it really impressed me how fast it is. i uninstalled it the first time i used it because it didn't show me anything! the console opened in black! but i reinstalled it and used it directly from bash using the command “axel” (without quotes) and it worked 😀

  • Prescilla Jun 5, 2012 @ 1:20

    Does axel support resuming partial downloads, like wget -c???
    If so, how do I resume a partial download with axel???

    • nick Sep 23, 2012 @ 20:24

      for instance, if the connection was lost, cancel the download with CTRL+C.
      after that, enter the same command as your previous axel download. it will resume automatically.

  • AndresVia Jul 30, 2013 @ 12:29

    For the people complaining about the verbosity of axel, they should try these options:
    --alternate, -a
    This will show an alternate progress indicator. A bar displays
    the progress and status of the different threads, along with
    the current speed and an estimate of the remaining download time.
    --quiet, -q
    No output to stdout.

  • Saeed Apr 8, 2015 @ 18:09

    If you are using firefox, you can use the “axel-downloader” firefox plugin.
