
Linux ultra fast command line download accelerator

Usually I use wget, which is the ultimate command line downloader. However, wget failed to accelerate my download speed. I was downloading a 1.4GB file at around 800KB/s (this box is hooked to a 10 Mbps uplink port); the remote server was throttling me.

Therefore, I decided to set wget aside and downloaded axel, a light download accelerator for the Linux command line.

How does Axel work?

Axel does the same thing any other accelerator does: it opens more than one HTTP/FTP connection per download and each connection transfers its own, separate, part of the file. It may sound weird, but it works very well in practice. For example, some FTP sites limit the speed of each connection, therefore opening more than one connection at a time multiplies the allowable bandwidth. Be forewarned that some FTP operators don't like it when you do this. It is better to open the additional connections on several servers and download from all of them simultaneously. Axel supports this feature too. You may either specify a list of mirrors to use or tell the program to search for mirrors.
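The multi-connection and mirror behavior described above can be sketched as follows. The hosts below are placeholders, and the exact flags may vary between axel versions, so check `axel --help` or the man page:

```shell
# Pull one file from several mirrors at once by listing the URLs
# (placeholder hosts, not real mirrors):
axel http://mirror1.example.com/big.iso http://mirror2.example.com/big.iso

# Or ask axel to search for mirrors itself; -S takes the number of
# mirrors to try, per the axel documentation:
axel -S 4 http://download.example.com/big.iso
```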

It is a perfect tool for downloading large files over a remote ssh session.

Install axel

If you are using Debian, type the following command to install axel:
# apt-get install axel

Or you can download axel from the official website:
$ wget http://wilmer.gaast.net/downloads/axel-1.0b.tar.gz

Untar it:
$ tar -zxvf axel-1.0b.tar.gz

Configure and compile axel:
$ ./configure
$ make

Install axel:
# make install

Alternatively, build axel elsewhere and upload the newly built binary to the remote Linux server using scp. I usually do not install the gcc C/C++ compiler collection on any of my production web/FTP/MySQL servers, for security reasons.
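A minimal sketch of that workflow, with a placeholder user, host, and destination path:

```shell
# Build axel on a trusted workstation that has the compiler toolchain...
./configure && make

# ...then copy only the resulting binary to the production box, so no
# compiler is ever needed there (placeholder user/host; adjust paths):
scp axel user@server.example.com:/usr/local/bin/axel
```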

How do I use axel?

Just type the command as follows:
$ axel http://download.com/file.tar.gz

Limit speed
You can also specify a maximum speed (in bytes per second) for axel so that it will not eat up all your bandwidth. For example, the following will try to keep the average speed around 5242880 bytes/s (5120 KB/s):
$ axel -s 5242880 http://download.com/my.iso
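The `-s` value is plain bytes per second, so 5120 KB/s is simply 5120 * 1024. A quick shell check of the arithmetic:

```shell
# 5120 KB/s expressed in bytes per second, the unit that axel -s expects
echo $((5120 * 1024))
# prints 5242880
```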

Limit the number of connections
You can also specify the number of connections you want to open. For example, open 3 connections for the download:
$ axel -n 3 -s 5242880 http://download.com/my.iso

But how fast is axel?

Here is a sample test that demonstrates how fast axel is.

$ wget http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2

Output:

--12:10:31--  http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
           => `linux-2.6.17.11.tar.bz2'
Resolving kernel.org... 204.152.191.5, 204.152.191.37
Connecting to kernel.org|204.152.191.5|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 41,288,067 (39M) [application/x-bzip2]
100%[================================================================================>] 41,288,067     2.33M/s    ETA 00:00
12:10:48 (2.31 MB/s) - `linux-2.6.17.11.tar.bz2' saved [41288067/41288067]

$ axel http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2

Output:

Initializing download: http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
File size: 41288067 bytes
Opening output file linux-2.6.17.11.tar.bz2.1
Starting download
[  0%]  .......... .......... .......... .......... ..........  [ 247.1KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 408.3KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 566.3KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 707.2KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 836.5KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 975.9KB/s]
[  0%]  .......... .......... .......... .......... ..........  [1079.9KB/s]
[  0%]  .......... .......... .......... .......... ..........  [1210.0KB/s]
[  0%]  .......... .......... .......... .......... ..........  [1303.1KB/s]
[  1%]  .......... .......... .......... .......... ..........  [1422.1KB/s]
[  1%]  .......... .......... .......... .......... ..........  [1508.0KB/s]
[  1%]  .......... .......... .......... .......... ..........  [1629.2KB/s]
..........
...
....
[ 99%]  .......... .......... .......... .......... ..........  [8710.2KB/s]
[ 99%]  .......... .......... .......... .......... ..........  [8680.7KB/s]
[100%]  .......... ..........
Downloaded 39.4 megabytes in 4 seconds. (8681.65 KB/s)

As you can see, axel downloaded the same file in just 4 seconds. Another great thing is its tiny binary size: I can put axel on a boot disk in place of wget.

prozilla - another good program with GUI frontend

One of the drawbacks of axel is that you cannot specify an FTP username and password. You can use the prozilla program instead, which also makes multiple connections and downloads a file in multiple parts simultaneously, enhancing the download speed compared to a single-connection download.

FTP passwords can be specified with the URL, or can be obtained automatically from ~/.netrc if it exists.
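The ~/.netrc format is simple. A hypothetical entry (placeholder host and credentials) looks like the following; note the file should be `chmod 600`, or most clients will refuse to use it:

```
machine ftp.example.com
login myuser
password s3cr3t
```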

Install prozilla

# yum install prozilla

Or download prozilla from official web site.

To use prozilla just type the following command (command line version):
$ proz http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2




Comments on this entry are closed.

  • santi.kq September 2, 2006, 10:20 pm

    another good download accelerator for linux is not a standalone program, but a firefox extension: downthemall. it works great, and the acceleration rate is impressive.

    • kimmieshallie December 21, 2010, 12:42 pm

      Download downthemall is a very good download extention, I have a slow internet connection but it’s a real helper when downloading things. Also flashgot helps grabbing video files while browsing firefox

  • Michael Biddulph September 3, 2006, 12:51 am

    Bytes per sec….ffs…who thought that one up??

  • ajs September 3, 2006, 12:08 pm

    can you repeat the test with a different download file, and
    do the axel download first and the wget second? it seems likely
    that the file was cached in a transparent web proxy which could
    account for the speed difference.

  • nixCraft September 3, 2006, 12:27 pm

    Ajs,

    Noop, I am directly connected and not using a transparent proxy appliance or server. If I run wget axel wget axel I am getting same result.

    Appreciate your post.

  • Mike Schroll September 3, 2006, 1:45 pm

    I’ve always used lftp for this purpose — and in my two informal tests on two different boxes — it seems to be superior to axel:

    Downloaded 39.4 megabytes in 15 seconds. (2673.72 KB/s)

    lftp :~> pget http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
    41292428 bytes transferred in 13 seconds (3.00M/s)

    Downloaded 39.4 megabytes in 16 seconds. (2376.57 KB/s)

    lftp :~> pget http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
    41295932 bytes transferred in 13 seconds (3.06M/s)
    lftp :~>

  • rembrandt September 3, 2006, 2:22 pm

    you`re missing the fact that your ISP propably has a Proxy-Server wich has then a copy of the file in it`s cache so you don´t proof that anything gets speeded up.

  • nalley September 3, 2006, 2:41 pm

    A transparent proxy is just that – transparent. You wouldn’t know if you were using one or not. That is why ajs recommendation of repeating the test with a different file and using axel first makes sense. Your ISP (or your ISP’s ISP) might be using a transparent proxy. Cisco routers use WCCP (Web Cache Control Protocol) to implement transparent caching (there are lots of other ways to do it as well).

    If transparent caching was in use, “wget” would have populated the cache and “axel” would have retrieved from the cache.

    Also, I agree that connecting to multiple mirrors and pulling a “piece” of the file from each could speed things up – but connecting multiple times to the same server to avoid “bandwidth limits per connection” is silly. A smart admin would limit bandwith per IP not per connection.

  • nixCraft September 3, 2006, 3:11 pm

    Nalley I am aware of both transparent proxy and WCCP. This entire article is not about your home dsl or adsl connection. My box is hooked to 10Mbps uplink and other few boxes I managed they are hooked to 1000Mbps.

    This is a collocated box and my dedicated hosting service provider does not use caching. In fact, they sell content caching as a different product (which is quite expensive ) . And yes I did the test as I said earlier in following order:
    axel
    wget

    And result are almost same only 1 second different.

  • JoJo September 3, 2006, 4:38 pm

    You could at least put a caveat about the increased resource usage on servers that using an accelerator causes. If someone is providing something to you for free, don’t ve an asshat and just use wget/curl.

  • Wilmer September 3, 2006, 4:52 pm

    Hmm, are you sure Axel can’t download from FTP sites that require a password? I wrote the program and I’d be very embarassed if that functionality would really be missing…

    It should be possible to just give it a URL like ftp://username:password@hostname/path/etc.

    It’s interesting to see an article about a program I don’t maintain for four years already. I don’t use it myself anymore, but it’s nice that other people are still happy about it. :-)

  • nixCraft September 3, 2006, 5:12 pm

    Wilmer,

    Thanks for comment.

    Sorry to say but whenever I use ftp://username:password:ftp.myserver.com/file url, axel bumped my back with Segmentation fault error.

    That is why I am using proz. If you get time, try to fix…

    Axel is extremely tiny and part of my rescue disk as well.

  • dan September 3, 2006, 5:57 pm

    I LOVE IT,I LOVE IT, I LOVE IT

  • Son Nguyen September 3, 2006, 6:27 pm

    Nice! FlashGet is a similar tool for Windows but this is definitely useful tool to use under Linux. Thanks for sharing.

  • jojomonk September 3, 2006, 7:23 pm

    i got same results w/ both wget and axel – sticking to wget. Not a fan of the crazy status printouts done by axel.

  • Juan García September 3, 2006, 8:37 pm

    In my opinion lftp is much better because it supports both http and ftp and has tons of options and commands. lftp just works!

    • Kenneth Endfinger December 5, 2011, 8:07 pm

      lftp is alot faster for me. I love it!

  • Keith September 6, 2006, 9:26 pm

    From all those posts above, I think I will still keep to using wget.

  • Pascal Bleser September 7, 2006, 10:27 am

    Another fast command-line download accelerator worth looking into is aria2.
    It also supports bittorrent and has a very, very low memory footpring (even less than rtorrent.

    For people using SUSE Linux, I maintain aria2 RPMs here.

  • nixCraft September 7, 2006, 12:14 pm

    Pascal Bleser

    Thanks for pointing out aria2!

  • bugeyedmonster September 8, 2006, 1:51 am

    Also works great on mac using Fink! A simple /sw/bin/apt-get axel had me up and running in no time. Thanks for the tip ;)

  • Rob Swift October 14, 2006, 12:05 am

    I noticed you used ftp://username:password:ftp.myserver.com/file url when in fact it would be correct to use the @ symbol between the password and server like this. ftp://username:password@ftp.myserver.com/file url

  • daniel January 17, 2007, 1:43 pm

well, something about download accelerators.
    1st the DON’T really accelerates some times BUT they ALLways try to use your MAX bandwidth.
    2nd They can accelerate your download because the make more than one single connection. Its because when you connect just once you connection may slow down and it´s slower to one connection speed up when the traffic is up again, so if you have 4 connections is fastter to reuse the bandwidth.
    3rd they resume support works better… :-)

    I love download accelerator cause I’m allways doing some kind download… and be moving to windows just to do that is boring. So I developed my own download acelerator, based on axel’s code the new download accelerator for linux has released called doKa, it is made for KDE (I love it) and is working pretty well with some problens that you can faind on the projects page…

    If you have interest check out…

    http://sourceforge.net/projects/doka/

  • nixCraft January 17, 2007, 10:23 pm

    Daniel,

    Thanks for pointing out your porting. I will check it out later on. :)

  • Dhruva K January 27, 2007, 7:36 pm

    do any of the above programs tune the TCP stack to use maybe a greater window size? And there is that scale option too, if I recall right, to increase the granularity of the specified window….could be in gigs now i think… Wonder if doing that might help…?I do think there are some ‘knobs’ given by the stack to adjust these parameters….

  • Dhruva K January 27, 2007, 7:42 pm

    And just to add to that ‘smart admin will limit BW per IP not connection’, smarter download accelerators could use multiple IP’s assigned to the same NIC and vary the connection at L3 instead of L4.

  • Matt September 14, 2007, 1:54 pm

    Can you not specify a username pass via the URL?

  • henry October 10, 2007, 11:06 am

    how can I adjust or increase the size to be downloaded in my smoothwall linux ?

  • Rabiul Hassan Khan December 10, 2007, 2:26 am

    All Praises to Allah.

    I have found lftp is the only right tool. I tried prozilla (proz and prozgui), axel, aria2c and these are good but don’t have resume support. Prozilla has resume support but you have to quit the program mentioning your intention to resume later (for proz press Ctrl + R, and for prozgui click on Abort, resume later). If you press the computer’s reset button in the middle of a download and try to resume the broken download, it can’t be done with prozilla. Prozgui will go on downloading the rest but at the end it completes the download with wrong size.

But with lftp you can download with multiple connections and resume a broken download later. I have tested with version 3.5.2, and earlier versions may not work to resume a download with pget (pget is needed for acceleration/opening more connections). So, get version 3.5.2 or later. Some lftp commands are as follows:

Get a file:
lftp -e 'pget http://ftp.file.tgz'

Continue broken download:
lftp -e 'pget -c http://ftp.file.tgz'

Get file with 7 connections:
lftp -e 'pget -n 7 -c http://ftp.file.tgz'

View settings:
lftp -c 'set -a'

    * lftp shell:
    Enter to lftp shell by entering command lftp and get a file by:
    pget http://ftp.file.tgz

    view setting:
    set -a

change setting for saving download status temporarily (only for the session; it reverts to the default value after exit):
set pget:save-status 5s

change setting for the number of download connections temporarily (only for the session; it reverts to the default value after exit):
set pget:default-n 7

    * To change the setting permanently edit /etc/lftp.conf
    add line
    set pget:save-status 5s
    set pget:default-n 7

    Default time for pget save status is 10s, and connection number is 5

  • Boovarahan S April 5, 2008, 10:41 am

    What about d4x ? I have used aria / aria2c , axel , d4x , downthemall in firefox and I find aria2c highly fast and helpful.

  • bubo August 14, 2008, 4:37 am

    i use axel since half a year or so and i’m really quite happy with it. it does not spawn too many connections (you know i don’t wanna fall on sysadmin’s nerves) and never made a mistake until now. very reliable. i use it mostly to download iso images. the md5’s are always alright. i might give aria2c a try…

  • vinu August 15, 2008, 1:46 pm

    i’m new to using prozilla.i followed the two above mentioned steps to install it,

    # yum install prozilla
    $ proz http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2

    but i thought it was a GUI.and even then i need to know to download any file how do go about it.

    will the following work
    $proz

    if it does where are the files being saved???
    please reply.

    • weman March 25, 2011, 4:32 pm

      this doesnt make sense

    • Randy October 17, 2013, 9:05 am

      it is on home folder

  • azhen November 8, 2008, 8:46 am

    nice work

  • Markidi November 23, 2008, 5:04 pm

    how can we use axel to download file from Rapidshare (how to put rapidshare username and password in axel?), any idea?

  • emiraga April 7, 2009, 12:47 pm

    Here is my axel.2-3 patch for cookie loading from Firefox 3

    http://emiraga.wikispaces.com/Axel+Firefox+3+Cookies+Patch

    It is extremely buggy. :) I Hope you enjoy it.

  • AR May 4, 2009, 11:24 pm

    I made scripts to download from youtube.com (http://code.google.com/p/ytdownloader) and rapidshare (http://code.google.com/p/rdfu).
    What I want to mention is prozilla (2.0.4) really does not work in my case! Neither it can download from youtube nor from rapidshare! With rapidshare it fails to persist the connection. I think prozilla is not well implemented and it has many bugs around.
    Anyway, I’ll look at the code of prozilla when I have time.

  • Soliman Alqubati May 18, 2009, 7:37 pm

    I want the source code for some Linux comands

  • Olo June 12, 2009, 1:21 pm

    Axel was working good but it has an 2gb file limit. I can see that aria2 can resume downloads, so that is my choice for now. The speeds are comperable.

  • hemanth.hm August 24, 2009, 8:05 pm

    No matter what tools we use, we can’t beat the ISP load balance and increase our download speeds.

  • Jimmy Dean September 7, 2009, 3:34 am

    I was just downloading a file over FTP through wget… their server slowed me down to 300KB/s, I used this program and set the connections to 5 and I am getting 1670.6KB/s works for me…

  • Rupesh Mishra October 21, 2009, 2:50 pm

    Try the latest axel-2.4, the limit of 2Gb file is no more.

  • gaous afrizal December 6, 2009, 5:25 am

    hahaha,,, i think it’s just only taste :D
    some people think aria2 is better, lftp, axel .It ‘s your choice to use it. remember, network has bursty traffik :D

  • Xster December 7, 2009, 1:31 am

    Do agree that Axel’s printouts are a bit wild. But its in a visor on mac so doesn’t bother me

  • ayip.eiger March 23, 2010, 9:36 am

    Where i find axel downloaded files?

    • Randy October 17, 2013, 9:05 am

      home folder

  • ind daz March 30, 2010, 3:58 am

    hii..
    I’m in a this type of situation,I dont know how to go further.um working on Fedora 9.looking for ur help,, thnxzz!!!

    ./configure
    The strip option is enabled. This should not be a problem usually, but on some
    systems it breaks stuff.

    Configuration done:
    Internationalization disabled.
    Debugging disabled.
    Binary stripping enabled.
    [root@localhost axel-1.0b]# make install
    mkdir -p /usr/local/bin/
    cp axel /usr/local/bin/axel
    cp: cannot stat `axel': No such file or directory
    make: *** [install-bin] Error 1

    • adibaskom July 2, 2011, 9:47 am

      do this
      $./configure
      $make
      #make install
      enjoy

  • Roman April 30, 2010, 12:36 pm

    Hi ind daz.

    i had the same problem. try to type “apt-get -f install”. after that you can install it.

  • Long July 13, 2010, 12:10 pm

    A nice one to try for linux would be SKDownloader. It is a download accelerator having an excellent gui with themes support (not sure how many would be using it though :) ). It is fully free and unlike other download accelerators, it allows you to choose the number of simultaneous connections you can make and it is not limited to 3 or 4 which is the case with most others. Their link is
    Link

  • Paul Ward July 25, 2010, 10:54 pm

    Just tried wget on a file from a friends box in the UK to my server in NZ and was getting a total download time of 8 hours +
    Switched to my windows server and used firefox with downloadthemall and was getting 4 streams down and total download time est around 5 hours.
    Installed axel and tried however I was getting proxy issues and being requested for my domain and user + password, this is a pain as I had my http_proxy already exported but it did not use it.
    Then tried aria2 and amazing at this time it is sayinfg 2 hours 30 mins that blows all the above away and my windows firefox to boot.
    Yet to see if the file md5sums match and if the download time is real and not an extimate but it’s looking good for now especally as from the remote box I am lucky to get anywher near 50k usually around 30k :)
    [ SIZE:19.7MiB/538.8MiB(3%) CN:5 SPD:58.2KiBs ETA:2h31m59s]]

  • Raam January 21, 2011, 11:07 pm

    Thanks a lot . I also though wget is the ultimate downloader but axel is so much faster out the box. This really improved my life :D

  • Neigyl Noval May 25, 2011, 4:13 pm

    Hi. I’m using axel to download a 4 GB software. When it downloaded 98%, it suddenly gives “write error”. I tried it again, but it still gives write error. It says.

    File size: 4314152960 bytes
    Opening output file Xilinx_ISE_DS_Lin_13.1_O.40d.1.1.tar
    State file found: 4251837514 bytes downloaded, 62315446 to go.
    Starting download

    ,,,,,,,,,, ,,,,,,,,,, ,,,,,,,,,, ,,,,,….. …..
    Write error!

    Downloaded 10.7 kilobytes in 0 seconds. (24.90 KB/s)

    I still have more than 30 GB space and the partition is ext3.

    How to fix this? Thanks.

  • Arvind October 13, 2011, 2:33 am

    I tried axel — for me it works thrice as fast as wget. (I tried wget and axel on different huge files and measured the speed difference so the transparent proxy issue is not there.)

    This is ideal for someone who wants to download a huge file onto some remote Unix computer in the cloud. (1) Cannot fire-up mozilla on the remote computer even using ssh -X (painfully slow). (2) Cannot download huge file on to local lap-top and then re-up-load to remote compute in the cloud (idiotic).

    Therefore — go go go axel ! Love it!

    I haven’t tried aria and the other softwares mentioned here but they may well be just as good.

  • Dinesh October 30, 2011, 3:14 pm

    Thats pretty amazing. Can download files at a speed more than my max download limit.

  • rafi November 27, 2011, 4:19 pm

    Sorry i am unable to install axel.Plz can any one help me?

  • abdelouahab December 8, 2011, 8:22 am

    @rafi : use apt-get install axel
    dont try to use the GUI interface, it dont work here (ubuntu 11.10) use the command line:
    (look here) http://manpages.ubuntu.com/manpages/gutsy/man1/axel.1.html
    it really impressed me how it’s fast, i’ve unistalled it the first time i’ve used it because it dident show me anything! the console opened in black! but i’ve reinstalled it and used it directly from the bash using the command line “alex” (without quotes) and it worked :D

  • Prescilla June 5, 2012, 1:20 am

    Does axel support resuming partial downloads, like wget -c???
    If so, how do I resume a partial download with axel???

    • nick September 23, 2012, 8:24 pm

      yes!
      for instance the connection was lost, cancel the download by CTRL+C
      after that, enter same command in your previous download of axel. it will resume automatically

  • AndresVia July 30, 2013, 12:29 pm

    For the people complaining about the verbosity of axel, they should try the options
    –alternate, -a
    This will show an alternate progress indicator. A bar displays
    the progress and status of the different threads, along with
    current speed and an estimate for the remaining download time.
    –quiet, -q
    No output to stdout.

  • Saeed April 8, 2015, 6:09 pm

    Hi
    If you are using firefox , you can use “axel-downloader” firefox plugin.
    https://github.com/PHProir/axel-downloader-for-firefox