Linux ultra fast command line download accelerator

September 1, 2006 · 61 comments · Last updated August 10, 2007


Usually I use wget, which is the ultimate command line downloader. However, wget could not accelerate my downloads: I was pulling a 1.4GB file at around 800KB/s (this box is hooked to a 10 Mbps uplink port) because the remote server was throttling me.

Therefore, I decided to set wget aside. I downloaded axel, a light download accelerator for the Linux command line.

How does Axel work?

Axel does the same thing any other accelerator does: it opens more than one HTTP/FTP connection per download and each connection transfers its own, separate, part of the file. It may sound weird, but it works very well in practice. For example, some FTP sites limit the speed of each connection, therefore opening more than one connection at a time multiplies the allowable bandwidth. Be forewarned that some FTP operators don't like it when you do this. It is better to open the additional connections on several servers and download from all of them simultaneously. Axel supports this feature too. You may either specify a list of mirrors to use or tell the program to search for mirrors.
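The per-connection trick described above can be sketched in plain shell. This is an illustrative simulation only: a local file stands in for the remote URL, and dd byte ranges stand in for HTTP Range requests (all filenames here are made up):

```shell
# Simulated segmented download: each "connection" copies its own byte
# range of the source file, then the parts are joined and verified.
set -e
printf 'abcdefghijklmnopqrstuvwxyz0123456789' > axel_demo_src.bin  # 36-byte stand-in
size=$(wc -c < axel_demo_src.bin)
parts=3
chunk=$(( (size + parts - 1) / parts ))   # bytes per part, rounded up
for i in 0 1 2; do
  # With a real server, each worker would send "Range: bytes=start-end"
  dd if=axel_demo_src.bin of=axel_demo_part.$i bs=1 \
     skip=$((i * chunk)) count=$chunk 2>/dev/null &
done
wait                                      # all "connections" finished
cat axel_demo_part.0 axel_demo_part.1 axel_demo_part.2 > axel_demo_joined.bin
cmp -s axel_demo_src.bin axel_demo_joined.bin && echo "reassembled OK"
```

Against a server that limits bandwidth per connection, the three workers together can move roughly three times the data in the same wall-clock time, which is all an accelerator really does.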

This is a perfect tool for downloading large files over a remote SSH session.

Install axel

If you are using Debian, type the following command to install axel:
# apt-get install axel

Or you can download axel from the official website:
$ wget http://wilmer.gaast.net/downloads/axel-1.0b.tar.gz

Untar the archive:
$ tar -zxvf axel-1.0b.tar.gz

Configure and compile axel:
$ ./configure
$ make

Install axel:
# make install

Alternatively, build axel elsewhere and upload the newly built binary to the remote Linux server using scp. Usually I do not install the gcc C/C++ compiler collection on any of my production web/FTP/MySQL servers, for security reasons.

How do I use axel?

Just type the command as follows:
$ axel http://download.com/file.tar.gz

Limit speed
You can also specify a maximum speed (in bytes per second) for axel so that it will not eat up all your bandwidth. For example, the following will try to keep the average speed around 5242880 bytes/s (5120 KB/s):
$ axel -s 5242880 http://download.com/my.iso
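As a quick sanity check of that number (the -s value is plain bytes per second), shell arithmetic converts it back:

```shell
# 5242880 bytes/s expressed in KB/s and MB/s (1 KB = 1024 bytes)
echo "$(( 5242880 / 1024 )) KB/s"          # prints: 5120 KB/s
echo "$(( 5242880 / 1024 / 1024 )) MB/s"   # prints: 5 MB/s
```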

Limit the number of connections
You can also specify the number of connections to open. For example, open 3 connections for the download:
$ axel -n 3 -s 5242880 http://download.com/my.iso

But how fast is axel?

Here is a sample test that demonstrates how fast axel is:

$ wget http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2

Output:

--12:10:31--  http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
           => `linux-2.6.17.11.tar.bz2'
Resolving kernel.org... 204.152.191.5, 204.152.191.37
Connecting to kernel.org|204.152.191.5|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 41,288,067 (39M) [application/x-bzip2]
100%[================================================================================>] 41,288,067     2.33M/s    ETA 00:00
12:10:48 (2.31 MB/s) - `linux-2.6.17.11.tar.bz2' saved [41288067/41288067]

$ axel http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2

Output:

Initializing download: http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
File size: 41288067 bytes
Opening output file linux-2.6.17.11.tar.bz2.1
Starting download
[  0%]  .......... .......... .......... .......... ..........  [ 247.1KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 408.3KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 566.3KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 707.2KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 836.5KB/s]
[  0%]  .......... .......... .......... .......... ..........  [ 975.9KB/s]
[  0%]  .......... .......... .......... .......... ..........  [1079.9KB/s]
[  0%]  .......... .......... .......... .......... ..........  [1210.0KB/s]
[  0%]  .......... .......... .......... .......... ..........  [1303.1KB/s]
[  1%]  .......... .......... .......... .......... ..........  [1422.1KB/s]
[  1%]  .......... .......... .......... .......... ..........  [1508.0KB/s]
[  1%]  .......... .......... .......... .......... ..........  [1629.2KB/s]
..........
...
....
[ 99%]  .......... .......... .......... .......... ..........  [8710.2KB/s]
[ 99%]  .......... .......... .......... .......... ..........  [8680.7KB/s]
[100%]  .......... ..........
Downloaded 39.4 megabytes in 4 seconds. (8681.65 KB/s)

As you can see, axel downloaded the same file in about 4 seconds. Another great thing is its tiny binary size: I can put axel on a boot disk and use it in place of wget.

prozilla - another good program with GUI frontend

One of the drawbacks of axel is that you cannot specify an FTP username and password. You can use the prozilla program instead, which also makes multiple connections and downloads a file in multiple parts simultaneously, enhancing the download speed over a single-connection download.

FTP passwords can be specified with the URL, or can be obtained automatically from ~/.netrc if it exists.
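For reference, a .netrc entry looks like the sketch below. The hostname and credentials are placeholders, and the sketch writes to a local file rather than touching your real ~/.netrc:

```shell
# Sample .netrc entry; in real use, save this as ~/.netrc with mode 600,
# since many clients refuse to read a .netrc visible to other users.
cat > netrc.example <<'EOF'
machine ftp.example.com
login myuser
password mysecret
EOF
chmod 600 netrc.example
echo "entries: $(grep -c '^machine ' netrc.example)"
```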

Install prozilla

# yum install prozilla

Or download prozilla from official web site.

To use prozilla just type the following command (command line version):
$ proz http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2

prozilla GUI frontend (screenshot)



Comments (61)

1 santi.kq September 2, 2006 at 10:20 pm

another good download accelerator for linux is not a standalone program, but a firefox extension: downthemall. it works great, and the acceleration rate is impressive.


2 kimmieshallie December 21, 2010 at 12:42 pm

DownThemAll is a very good download extension. I have a slow internet connection, but it’s a real helper when downloading things. Also FlashGot helps grab video files while browsing in Firefox.


3 Michael Biddulph September 3, 2006 at 12:51 am

Bytes per sec….ffs…who thought that one up??


4 ajs September 3, 2006 at 12:08 pm

can you repeat the test with a different download file, and do the axel download first and the wget second? it seems likely that the file was cached in a transparent web proxy, which could account for the speed difference.


5 nixCraft September 3, 2006 at 12:27 pm

Ajs,

Nope, I am directly connected and not using a transparent proxy appliance or server. If I run wget, axel, wget, axel I get the same result.

Appreciate your post.


6 Mike Schroll September 3, 2006 at 1:45 pm

I’ve always used lftp for this purpose — and in my two informal tests on two different boxes — it seems to be superior to axel:

Downloaded 39.4 megabytes in 15 seconds. (2673.72 KB/s)

lftp :~> pget http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
41292428 bytes transferred in 13 seconds (3.00M/s)

Downloaded 39.4 megabytes in 16 seconds. (2376.57 KB/s)

lftp :~> pget http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
41295932 bytes transferred in 13 seconds (3.06M/s)
lftp :~>


7 rembrandt September 3, 2006 at 2:22 pm

you're missing the fact that your ISP probably has a proxy server, which then has a copy of the file in its cache, so you haven't proved that anything gets sped up.


8 nalley September 3, 2006 at 2:41 pm

A transparent proxy is just that: transparent. You wouldn’t know if you were using one or not. That is why ajs’s recommendation of repeating the test with a different file and using axel first makes sense. Your ISP (or your ISP’s ISP) might be using a transparent proxy. Cisco routers use WCCP (Web Cache Communication Protocol) to implement transparent caching (there are lots of other ways to do it as well).

If transparent caching was in use, “wget” would have populated the cache and “axel” would have retrieved from the cache.

Also, I agree that connecting to multiple mirrors and pulling a “piece” of the file from each could speed things up, but connecting multiple times to the same server to avoid “bandwidth limits per connection” is silly. A smart admin would limit bandwidth per IP, not per connection.


9 nixCraft September 3, 2006 at 3:11 pm

Nalley, I am aware of both transparent proxies and WCCP. This entire article is not about your home DSL or ADSL connection. My box is hooked to a 10Mbps uplink, and a few other boxes I manage are hooked to 1000Mbps.

This is a colocated box, and my dedicated hosting service provider does not use caching. In fact, they sell content caching as a separate product (which is quite expensive). And yes, I did the test as I said earlier, in the following order:
axel
wget

And the results are almost the same, only 1 second different.


10 JoJo September 3, 2006 at 4:38 pm

You could at least put a caveat about the increased server resource usage that using an accelerator causes. If someone is providing something to you for free, don’t be an asshat; just use wget/curl.


11 Wilmer September 3, 2006 at 4:52 pm

Hmm, are you sure Axel can’t download from FTP sites that require a password? I wrote the program and I’d be very embarrassed if that functionality were really missing…

It should be possible to just give it a URL like ftp://username:password@hostname/path/etc.

It’s interesting to see an article about a program I haven’t maintained for four years already. I don’t use it myself anymore, but it’s nice that other people are still happy with it. :-)


12 nixCraft September 3, 2006 at 5:12 pm

Wilmer,

Thanks for comment.

Sorry to say, but whenever I use a ftp://username:password:ftp.myserver.com/file URL, axel bumps me with a segmentation fault error.

That is why I am using proz. If you get time, try to fix…

Axel is extremely tiny and part of my rescue disk as well.


13 dan September 3, 2006 at 5:57 pm

I LOVE IT,I LOVE IT, I LOVE IT


14 Son Nguyen September 3, 2006 at 6:27 pm

Nice! FlashGet is a similar tool for Windows, but this is definitely a useful tool under Linux. Thanks for sharing.


15 jojomonk September 3, 2006 at 7:23 pm

i got same results w/ both wget and axel – sticking to wget. Not a fan of the crazy status printouts done by axel.


16 Juan García September 3, 2006 at 8:37 pm

In my opinion lftp is much better because it supports both http and ftp and has tons of options and commands. lftp just works!


17 Kenneth Endfinger December 5, 2011 at 8:07 pm

lftp is a lot faster for me. I love it!


18 Keith September 6, 2006 at 9:26 pm

From all those posts above, I think I will still keep to using wget.


19 Pascal Bleser September 7, 2006 at 10:27 am

Another fast command-line download accelerator worth looking into is aria2.
It also supports BitTorrent and has a very, very low memory footprint (even less than rtorrent).

For people using SUSE Linux, I maintain aria2 RPMs here.


20 nixCraft September 7, 2006 at 12:14 pm

Pascal Bleser

Thanks for pointing out aria2!


21 bugeyedmonster September 8, 2006 at 1:51 am

Also works great on mac using Fink! A simple /sw/bin/apt-get axel had me up and running in no time. Thanks for the tip ;)


22 Rob Swift October 14, 2006 at 12:05 am

I noticed you used a ftp://username:password:ftp.myserver.com/file URL, when in fact it would be correct to use the @ symbol between the password and the server, like this: ftp://username:password@ftp.myserver.com/file


23 daniel January 17, 2007 at 1:43 pm

well, something about download accelerators:
1st, they DON’T always really accelerate, BUT they ALWAYS try to use your MAX bandwidth.
2nd, they can accelerate your download because they make more than one connection. When you connect just once, your connection may slow down, and it’s slower for one connection to speed back up when the traffic picks up again; with 4 connections it’s faster to reuse the bandwidth.
3rd, their resume support works better… :-)

I love download accelerators because I’m always doing some kind of download… and moving to windows just to do that is boring. So I developed my own download accelerator based on axel’s code. The new download accelerator for linux is called doKa; it is made for KDE (I love it) and is working pretty well, with some problems that you can find on the project’s page…

If you have interest check out…

http://sourceforge.net/projects/doka/


24 nixCraft January 17, 2007 at 10:23 pm

Daniel,

Thanks for pointing out your port. I will check it out later on. :)


25 Dhruva K January 27, 2007 at 7:36 pm

do any of the above programs tune the TCP stack to use, maybe, a greater window size? And there is that window scale option too, if I recall right, to increase the range of the specified window… it could be in gigs now, I think. Wonder if doing that might help? I do think there are some ‘knobs’ given by the stack to adjust these parameters…


26 Dhruva K January 27, 2007 at 7:42 pm

And just to add to that ‘smart admin will limit BW per IP, not connection’: smarter download accelerators could use multiple IPs assigned to the same NIC and vary the connection at L3 instead of L4.


27 Matt September 14, 2007 at 1:54 pm

Can you not specify a username pass via the URL?


28 henry October 10, 2007 at 11:06 am

how can I adjust or increase the size to be downloaded in my smoothwall linux ?


29 Rabiul Hassan Khan December 10, 2007 at 2:26 am

All Praises to Allah.

I have found lftp is the only right tool. I tried prozilla (proz and prozgui), axel, and aria2c; these are good but don’t have reliable resume support. Prozilla has resume support, but you have to quit the program declaring your intention to resume later (for proz press Ctrl + R, and for prozgui click on Abort, resume later). If you press the computer’s reset button in the middle of a download and try to resume the broken download, it can’t be done with prozilla. Prozgui will go on downloading the rest, but at the end it completes the download with the wrong size.

But with lftp you can download, accelerate the download with multiple connections, and resume a broken download later. I have tested with version 3.5.2; earlier versions may not resume downloads with pget (pget is needed for acceleration/opening more connections). So get version 3.5.2 or later. Some lftp commands are as follows:

Get a file:
lftp -e 'pget http://ftp.file.tgz'

Continue a broken download:
lftp -e 'pget -c http://ftp.file.tgz'

Get a file with 7 connections:
lftp -e 'pget -n 7 -c http://ftp.file.tgz'

View settings:
lftp -c 'set -a'

* lftp shell:
Enter the lftp shell by entering the command lftp, and get a file with:
pget http://ftp.file.tgz

view setting:
set -a

change the setting for saving download status temporarily (only for the session; it returns to the default value after exit):
set pget:save-status 5s

change the setting for the number of download connections temporarily (only for the session; it returns to the default value after exit):
set pget:default-n 7

* To change the setting permanently edit /etc/lftp.conf
add line
set pget:save-status 5s
set pget:default-n 7

The default time for pget save-status is 10s, and the default connection number is 5.


30 Boovarahan S April 5, 2008 at 10:41 am

What about d4x ? I have used aria / aria2c , axel , d4x , downthemall in firefox and I find aria2c highly fast and helpful.


31 bubo August 14, 2008 at 4:37 am

i’ve used axel for half a year or so and i’m really quite happy with it. it does not spawn too many connections (you know, i don’t wanna get on the sysadmin’s nerves) and has never made a mistake until now. very reliable. i use it mostly to download iso images. the md5’s are always alright. i might give aria2c a try…


32 vinu August 15, 2008 at 1:46 pm

i’m new to using prozilla. i followed the two above-mentioned steps to install it,

# yum install prozilla
$ proz http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2

but i thought it was a GUI. and even then, i need to know how to go about downloading any file.

will the following work
$proz

if it does where are the files being saved???
please reply.


33 weman March 25, 2011 at 4:32 pm

this doesn’t make sense


34 Randy October 17, 2013 at 9:05 am

it is in the home folder


35 azhen November 8, 2008 at 8:46 am

nice work


36 Markidi November 23, 2008 at 5:04 pm

how can we use axel to download file from Rapidshare (how to put rapidshare username and password in axel?), any idea?


37 emiraga April 7, 2009 at 12:47 pm

Here is my axel.2-3 patch for cookie loading from Firefox 3

http://emiraga.wikispaces.com/Axel+Firefox+3+Cookies+Patch

It is extremely buggy. :) I hope you enjoy it.


38 AR May 4, 2009 at 11:24 pm

I made scripts to download from youtube.com (http://code.google.com/p/ytdownloader) and rapidshare (http://code.google.com/p/rdfu).
What I want to mention is that prozilla (2.0.4) really does not work in my case! It can download neither from youtube nor from rapidshare! With rapidshare it fails to keep the connection alive. I think prozilla is not well implemented and has many bugs.
Anyway, I’ll look at the code of prozilla when I have time.


39 Soliman Alqubati May 18, 2009 at 7:37 pm

I want the source code for some Linux commands


40 Olo June 12, 2009 at 1:21 pm

Axel was working well, but it has a 2GB file limit. I can see that aria2 can resume downloads, so that is my choice for now. The speeds are comparable.


41 hemanth.hm August 24, 2009 at 8:05 pm

No matter what tools we use, we can’t beat the ISP load balance and increase our download speeds.


42 Jimmy Dean September 7, 2009 at 3:34 am

I was just downloading a file over FTP with wget… their server slowed me down to 300KB/s. I used this program, set the connections to 5, and I am getting 1670.6KB/s. Works for me…


43 Rupesh Mishra October 21, 2009 at 2:50 pm

Try the latest axel 2.4; the 2GB file limit is no more.


44 gaous afrizal December 6, 2009 at 5:25 am

hahaha… i think it’s just a matter of taste :D
some people think aria2 is better, or lftp, or axel. it’s your choice which to use. remember, networks have bursty traffic :D


45 Xster December 7, 2009 at 1:31 am

I do agree that Axel’s printouts are a bit wild. But it’s in a visor on my mac, so it doesn’t bother me.


46 ayip.eiger March 23, 2010 at 9:36 am

Where do i find axel’s downloaded files?


47 Randy October 17, 2013 at 9:05 am

home folder


48 ind daz March 30, 2010 at 3:58 am

hii..
I’m in this type of situation and I don’t know how to go further. I’m working on Fedora 9. looking for ur help,, thnxzz!!!

./configure
The strip option is enabled. This should not be a problem usually, but on some
systems it breaks stuff.

Configuration done:
Internationalization disabled.
Debugging disabled.
Binary stripping enabled.
[root@localhost axel-1.0b]# make install
mkdir -p /usr/local/bin/
cp axel /usr/local/bin/axel
cp: cannot stat `axel’: No such file or directory
make: *** [install-bin] Error 1


49 adibaskom July 2, 2011 at 9:47 am

do this
$ ./configure
$ make
# make install
enjoy


50 Roman April 30, 2010 at 12:36 pm

Hi ind daz.

i had the same problem. try typing “apt-get -f install”. after that you can install it.


51 Long July 13, 2010 at 12:10 pm

A nice one to try for linux would be SKDownloader. It is a download accelerator having an excellent gui with themes support (not sure how many would be using it though :) ). It is fully free and unlike other download accelerators, it allows you to choose the number of simultaneous connections you can make and it is not limited to 3 or 4 which is the case with most others. Their link is
Link


52 Paul Ward July 25, 2010 at 10:54 pm

Just tried wget on a file from a friend’s box in the UK to my server in NZ and was getting a total download time of 8 hours +.
Switched to my windows server and used firefox with downthemall and was getting 4 streams down, total download time est. around 5 hours.
Installed axel and tried it, however I was getting proxy issues and being asked for my domain and user + password; this is a pain as I had http_proxy already exported but it did not use it.
Then tried aria2 and, amazingly, at this time it is saying 2 hours 30 mins; that blows all the above away, and my windows firefox to boot.
Yet to see if the file md5sums match and if the download time is real and not an estimate, but it’s looking good for now, especially as from the remote box I am lucky to get anywhere near 50k, usually around 30k :)
[ SIZE:19.7MiB/538.8MiB(3%) CN:5 SPD:58.2KiBs ETA:2h31m59s]]


53 Raam January 21, 2011 at 11:07 pm

Thanks a lot. I also thought wget was the ultimate downloader, but axel is so much faster out of the box. This really improved my life :D


54 Neigyl Noval May 25, 2011 at 4:13 pm

Hi. I’m using axel to download a 4 GB software package. When it had downloaded 98%, it suddenly gave a “write error”. I tried again, but it still gives the write error. It says:

File size: 4314152960 bytes
Opening output file Xilinx_ISE_DS_Lin_13.1_O.40d.1.1.tar
State file found: 4251837514 bytes downloaded, 62315446 to go.
Starting download

,,,,,,,,,, ,,,,,,,,,, ,,,,,,,,,, ,,,,,….. …..
Write error!

Downloaded 10.7 kilobytes in 0 seconds. (24.90 KB/s)

I still have more than 30 GB space and the partition is ext3.

How to fix this? Thanks.


55 Arvind October 13, 2011 at 2:33 am

I tried axel — for me it works thrice as fast as wget. (I tried wget and axel on different huge files and measured the speed difference so the transparent proxy issue is not there.)

This is ideal for someone who wants to download a huge file onto some remote Unix computer in the cloud. (1) Cannot fire up mozilla on the remote computer even using ssh -X (painfully slow). (2) Cannot download the huge file onto a local laptop and then re-upload it to the remote computer in the cloud (idiotic).

Therefore — go go go axel ! Love it!

I haven’t tried aria and the other software mentioned here, but they may well be just as good.


56 Dinesh October 30, 2011 at 3:14 pm

That’s pretty amazing. It can download files at a speed higher than my max download limit.


57 rafi November 27, 2011 at 4:19 pm

Sorry, i am unable to install axel. Plz can anyone help me?


58 abdelouahab December 8, 2011 at 8:22 am

@rafi: use apt-get install axel
don’t try to use the GUI interface, it doesn’t work here (ubuntu 11.10); use the command line:
(look here) http://manpages.ubuntu.com/manpages/gutsy/man1/axel.1.html
it really impressed me how fast it is. i uninstalled it the first time i used it because it didn’t show me anything! the console opened in black! but i reinstalled it and used it directly from bash using the command “axel” (without quotes) and it worked :D


59 Prescilla June 5, 2012 at 1:20 am

Does axel support resuming partial downloads, like wget -c?
If so, how do I resume a partial download with axel?


60 nick September 23, 2012 at 8:24 pm

yes!
if, for instance, the connection was lost, cancel the download with CTRL+C.
after that, enter the same axel command as for your previous download. it will resume automatically.


61 AndresVia July 30, 2013 at 12:29 pm

For the people complaining about the verbosity of axel, they should try these options:
--alternate, -a
This will show an alternate progress indicator. A bar displays the progress and status of the different threads, along with the current speed and an estimate of the remaining download time.
--quiet, -q
No output to stdout.



