Usually I use wget, which is the ultimate command line downloader. However, wget failed to accelerate my download speed: I was downloading a 1.4GB file at around 800KB/s (this box is hooked to a 10 Mbps uplink port) because the remote server was throttling me.
Therefore, I decided to set wget aside. I downloaded axel, a light download accelerator for the Linux command line.
How does Axel work?
Axel does the same thing any other accelerator does: it opens more than one HTTP/FTP connection per download and each connection transfers its own, separate, part of the file. It may sound weird, but it works very well in practice. For example, some FTP sites limit the speed of each connection, therefore opening more than one connection at a time multiplies the allowable bandwidth. Be forewarned that some FTP operators don’t like it when you do this. It is better to open the additional connections on several servers and download from all of them simultaneously. Axel supports this feature too. You may either specify a list of mirrors to use or tell the program to search for mirrors.
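For example, you can pass several mirror URLs for the same file on one command line, or ask axel to look for mirrors itself with the -S option (some builds take a mirror count appended directly, e.g. -S4). The hosts below are placeholders:
$ axel http://mirror1.example.com/file.iso http://mirror2.example.com/file.iso
$ axel -S http://download.example.com/file.iso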
This is a perfect tool to use over a remote ssh session for downloading large files.
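For instance (hypothetical hostnames), log in to the remote box and start the download there, so the file never has to pass through your local connection:
$ ssh vivek@server1.example.com
$ axel http://download.example.com/file.iso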
Install axel
If you are using Debian, type the following command to install axel:
# apt-get install axel
Or you can download axel from the official website:
$ wget http://wilmer.gaast.net/downloads/axel-1.0b.tar.gz
Untar axel:
$ tar -zxvf axel-1.0b.tar.gz
Configure and compile axel:
$ ./configure
$ make
Install axel:
# make install
Alternatively, just upload a newly built axel binary to the remote Linux server using scp. Usually I do not install the gcc C/C++ compiler collection on any of my production web/FTP/MySQL servers, for security reasons.
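For instance, you might build axel on a trusted workstation and copy only the binary to the production box (the hostname and path below are illustrative):
$ scp ./axel vivek@www1.example.com:/usr/local/bin/axel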
How do I use axel?
Just type the command as follows:
$ axel http://download.com/file.tar.gz
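If you want to save the download under a different name or path, the -o option should do it (the URL and filename here are just examples):
$ axel -o /tmp/file.tar.gz http://download.com/file.tar.gz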
Limit speed
You can also specify a maximum speed (in bytes per second) so that axel does not eat up all your bandwidth. For example, the following will try to keep the average speed around 5242880 bytes per second (5120 KB/s):
$ axel -s 5242880 http://download.com/my.iso
Limit the number of connections
You can also specify the number of connections you want to open. For example, open 3 connections for downloading:
$ axel -n 3 -s 5242880 http://download.com/my.iso
But how fast is axel?
Here is a sample test that demonstrates how fast axel is:
$ wget http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
Output:
--12:10:31-- http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
=> `linux-2.6.17.11.tar.bz2'
Resolving kernel.org... 204.152.191.5, 204.152.191.37
Connecting to kernel.org|204.152.191.5|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 41,288,067 (39M) [application/x-bzip2]
100%[================================================================================>] 41,288,067 2.33M/s ETA 00:00
12:10:48 (2.31 MB/s) - `linux-2.6.17.11.tar.bz2' saved [41288067/41288067]
$ axel http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
Output:
Initializing download: http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
File size: 41288067 bytes
Opening output file linux-2.6.17.11.tar.bz2.1
Starting download
[ 0%] .......... .......... .......... .......... .......... [ 247.1KB/s]
[ 0%] .......... .......... .......... .......... .......... [ 408.3KB/s]
[ 0%] .......... .......... .......... .......... .......... [ 566.3KB/s]
[ 0%] .......... .......... .......... .......... .......... [ 707.2KB/s]
[ 0%] .......... .......... .......... .......... .......... [ 836.5KB/s]
[ 0%] .......... .......... .......... .......... .......... [ 975.9KB/s]
[ 0%] .......... .......... .......... .......... .......... [1079.9KB/s]
[ 0%] .......... .......... .......... .......... .......... [1210.0KB/s]
[ 0%] .......... .......... .......... .......... .......... [1303.1KB/s]
[ 1%] .......... .......... .......... .......... .......... [1422.1KB/s]
[ 1%] .......... .......... .......... .......... .......... [1508.0KB/s]
[ 1%] .......... .......... .......... .......... .......... [1629.2KB/s]
..........
...
....
[ 99%] .......... .......... .......... .......... .......... [8710.2KB/s]
[ 99%] .......... .......... .......... .......... .......... [8680.7KB/s]
[100%] .......... ..........
Downloaded 39.4 megabytes in 4 seconds. (8681.65 KB/s)
As you can see, axel downloaded the same file in 4 seconds. Another great thing is its binary size; I can put axel on a boot disk and use it in place of wget.
prozilla – another good program with a GUI frontend
One drawback of axel is that you cannot specify an FTP username and password. You can use the prozilla program instead, which also makes multiple connections and downloads a file in multiple parts simultaneously, enhancing the download speed and fetching the file faster than a single-connection download.
FTP passwords can be specified with the URL, or can be obtained automatically from ~/.netrc if it exists.
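For example, with a hypothetical host and account, the credentials can go in the URL itself or in a ~/.netrc entry (keep that file private):
$ proz ftp://vivek:secret@ftp.example.com/pub/file.iso
A matching ~/.netrc entry would look like this:
machine ftp.example.com login vivek password secret
$ chmod 600 ~/.netrc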
Install prozilla
# yum install prozilla
Or download prozilla from the official web site.
To use prozilla just type the following command (command line version):
$ proz http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
Further reading
- Read axel and prozilla man pages
- Axel web site
- Prozilla web site
Hi,
If you are using Firefox, you can use the “axel-downloader” Firefox plugin:
https://github.com/PHProir/axel-downloader-for-firefox
For the people complaining about the verbosity of axel, they should try these options:
--alternate, -a
This will show an alternate progress indicator. A bar displays the progress and status of the different threads, along with the current speed and an estimate of the remaining download time.
--quiet, -q
No output to stdout.
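For example, to get the compact progress bar while using four connections (the URL is a placeholder):
$ axel -a -n 4 http://download.example.com/file.iso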
Does axel support resuming partial downloads, like wget -c?
If so, how do I resume a partial download with axel?
Yes!
If, for instance, the connection was lost, cancel the download with CTRL+C.
After that, run the same axel command again; it will resume automatically.
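In other words, a session might look like the following (URL illustrative); axel keeps a state file (e.g. file.iso.st) next to the partial download and picks up from it:
$ axel -n 4 http://download.example.com/file.iso
^C
$ axel -n 4 http://download.example.com/file.iso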
@rafi: use apt-get install axel
Don’t try to use the GUI interface; it doesn’t work here (Ubuntu 11.10). Use the command line:
(look here) http://manpages.ubuntu.com/manpages/gutsy/man1/axel.1.html
It really impressed me how fast it is. I uninstalled it the first time I used it because it didn’t show me anything (the console opened in black!), but I reinstalled it, ran it directly from bash with the command “axel” (without quotes), and it worked 😀
Sorry, I am unable to install axel. Please, can anyone help me?
That’s pretty amazing. It can download files at a speed higher than my usual maximum download limit.
I tried axel; for me it works three times as fast as wget. (I tried wget and axel on different huge files and measured the speed difference, so the transparent proxy issue does not apply.)
This is ideal for someone who wants to download a huge file onto some remote Unix computer in the cloud. (1) You cannot fire up Mozilla on the remote computer, even using ssh -X (painfully slow). (2) You cannot download the huge file to a local laptop and then re-upload it to the remote computer in the cloud (idiotic).
Therefore: go, go, go axel! Love it!
I haven’t tried aria and the other software mentioned here, but they may well be just as good.
Hi. I’m using axel to download a 4 GB piece of software. When it got to 98% downloaded, it suddenly gave a “write error”. I tried again, but it still gives the write error. It says:
File size: 4314152960 bytes
Opening output file Xilinx_ISE_DS_Lin_13.1_O.40d.1.1.tar
State file found: 4251837514 bytes downloaded, 62315446 to go.
Starting download
,,,,,,,,,, ,,,,,,,,,, ,,,,,,,,,, ,,,,,….. …..
Write error!
Downloaded 10.7 kilobytes in 0 seconds. (24.90 KB/s)
I still have more than 30 GB of space and the partition is ext3.
How do I fix this? Thanks.
Thanks a lot. I also thought wget was the ultimate downloader, but axel is so much faster out of the box. This really improved my life 😀
Just tried wget on a file from a friend’s box in the UK to my server in NZ and was getting a total download time of 8+ hours.
Switched to my Windows server, used Firefox with DownThemAll, got 4 streams down, and the total download time was estimated at around 5 hours.
Installed axel and tried it, but I was getting proxy issues and being asked for my domain, user and password. This is a pain, as I had my http_proxy already exported but it did not use it.
Then tried aria2, and amazingly it is now saying 2 hours 30 mins, which blows all the above away, and my Windows Firefox to boot.
Yet to see if the file md5sums match and if the download time is real and not an estimate, but it’s looking good for now, especially as from the remote box I am lucky to get anywhere near 50k, usually around 30k 🙂
[ SIZE:19.7MiB/538.8MiB(3%) CN:5 SPD:58.2KiBs ETA:2h31m59s]]
A nice one to try for Linux would be SKDownloader. It is a download accelerator with an excellent GUI and theme support (not sure how many would be using that, though 🙂). It is fully free and, unlike most other download accelerators, it lets you choose the number of simultaneous connections; it is not limited to 3 or 4, as most others are. Their link is:
Link
Hi ind daz,
I had the same problem. Try typing “apt-get -f install”; after that you can install it.
Hi,
I’m stuck in this type of situation and don’t know how to go further. I’m working on Fedora 9. Looking for your help, thanks!
./configure
The strip option is enabled. This should not be a problem usually, but on some
systems it breaks stuff.
Configuration done:
Internationalization disabled.
Debugging disabled.
Binary stripping enabled.
[root@localhost axel-1.0b]# make install
mkdir -p /usr/local/bin/
cp axel /usr/local/bin/axel
cp: cannot stat `axel’: No such file or directory
make: *** [install-bin] Error 1
Do this:
$ ./configure
$ make
# make install
Enjoy.
Where do I find the files axel downloaded?
In your home folder.
I do agree that axel’s printouts are a bit wild, but it runs in a Visor terminal on my Mac, so it doesn’t bother me.
Hahaha, I think it’s just a matter of taste 😀
Some people think aria2 is better, or lftp, or axel. It’s your choice which to use. Remember, network traffic is bursty 😀
Try the latest axel 2.4; the 2GB file limit is no more.
I was downloading a file over FTP with wget and their server slowed me down to 300KB/s. I used this program, set the connections to 5, and I am getting 1670.6KB/s. Works for me…
No matter what tools we use, we can’t beat the ISP’s load balancing and increase our download speeds.
Axel was working well, but it has a 2GB file limit. I can see that aria2 can resume downloads, so that is my choice for now. The speeds are comparable.
I want the source code for some Linux commands.
I made scripts to download from youtube.com (http://code.google.com/p/ytdownloader) and rapidshare (http://code.google.com/p/rdfu).
What I want to mention is that prozilla (2.0.4) really does not work in my case! It can download neither from YouTube nor from Rapidshare! With Rapidshare it fails to keep the connection alive. I think prozilla is not well implemented and has many bugs.
Anyway, I’ll look at the prozilla code when I have time.
Here is my patch for axel 2.3 to load cookies from Firefox 3:
http://emiraga.wikispaces.com/Axel+Firefox+3+Cookies+Patch
It is extremely buggy. 🙂 I hope you enjoy it.
How can we use axel to download a file from Rapidshare (how do we give axel a Rapidshare username and password)? Any ideas?
nice work
I’m new to using prozilla. I followed the two steps mentioned above to install it:
# yum install prozilla
$ proz http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
But I thought it was a GUI, and even then I need to know how to go about downloading any file. Will the following work?
$ proz
If it does, where are the files being saved?
Please reply.
This doesn’t make sense.
They are saved in your home folder.
I have been using axel for half a year or so and I’m really quite happy with it. It does not spawn too many connections (you know, I don’t want to get on the sysadmin’s nerves) and it has never made a mistake so far. Very reliable. I use it mostly to download ISO images, and the md5s are always alright. I might give aria2c a try…
What about d4x? I have used aria/aria2c, axel, d4x, and DownThemAll in Firefox, and I find aria2c very fast and helpful.
All Praises to Allah.
I have found that lftp is the only right tool. I tried prozilla (proz and prozgui), axel, and aria2c; they are good but don’t have real resume support. Prozilla has resume support, but you have to quit the program stating your intention to resume later (for proz press Ctrl+R, for prozgui click Abort, resume later). If you press the computer’s reset button in the middle of a download and then try to resume the broken download, it can’t be done with prozilla. Prozgui will go on downloading the rest, but in the end it completes the download with the wrong size.
But with lftp you can download, accelerate the download with multiple connections, and resume a broken download later. I have tested version 3.5.2; earlier versions may not resume downloads made with pget (pget is needed for acceleration, i.e. opening more connections), so get 3.5.2 or later. Some lftp commands are as follows:
Get a file:
lftp -e 'pget http://ftp.file.tgz'
Continue broken download:
lftp -e 'pget -c http://ftp.file.tgz'
Get file with 7 connection:
lftp -e 'pget -n 7 -c http://ftp.file.tgz'
View setting:
lftp -c 'set -a'
* lftp shell:
Enter the lftp shell by running the lftp command, then get a file with:
pget http://ftp.file.tgz
view setting:
set -a
change the setting for how often the download status is saved, temporarily (only for the current session; it returns to the default value after exit):
set pget:save-status 5s
change the setting for the number of download connections, temporarily (only for the current session; it returns to the default value after exit):
set pget:default-n 7
* To change the settings permanently, edit /etc/lftp.conf and add these lines:
set pget:save-status 5s
set pget:default-n 7
The default interval for pget save-status is 10s, and the default connection number is 5.
How can I adjust or increase the size to be downloaded on my SmoothWall Linux box?
Can you not specify a username and password via the URL?
And just to add to the point that a “smart admin will limit BW per IP, not per connection”: smarter download accelerators could use multiple IPs assigned to the same NIC and vary the connection at L3 instead of L4.
Do any of the above programs tune the TCP stack, say to use a greater window size? And there is the window scale option too, if I recall right, to increase the range of the specified window; it could go into gigabytes now, I think. Wonder if doing that might help? I do think the stack exposes some “knobs” to adjust these parameters.
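For what it’s worth, here is a minimal sketch of the standard Linux knobs being alluded to; the sysctl names are real, but the values are illustrative, not recommendations:
$ sysctl net.ipv4.tcp_window_scaling
# sysctl -w net.core.rmem_max=16777216
# sysctl -w net.ipv4.tcp_rmem="4096 87380 16777216"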
Daniel,
Thanks for pointing out your porting. I will check it out later on. 🙂
Well, a few things about download accelerators:
1st, they DON’T always really accelerate anything, BUT they ALWAYS try to use your MAX bandwidth.
2nd, they can accelerate your download because they make more than a single connection. When you connect just once, your connection may slow down, and a single connection is slower to speed back up when the traffic clears; with 4 connections it is faster to reclaim the bandwidth.
3rd, their resume support works better… 🙂
I love download accelerators because I’m always doing some kind of download, and moving to Windows just to do that is boring. So I developed my own download accelerator based on axel’s code. This new download accelerator for Linux is called doKa; it is made for KDE (I love it) and is working pretty well, with some problems that you can find on the project’s page.
If you are interested, check it out:
http://sourceforge.net/projects/doka/
I noticed you used an ftp://username:password:ftp.myserver.com/file URL, when in fact it would be correct to use the @ symbol between the password and the server, like this: ftp://username:password@ftp.myserver.com/file
Also works great on a Mac using Fink! A simple /sw/bin/apt-get install axel had me up and running in no time. Thanks for the tip 😉
Pascal Bleser
Thanks for pointing out aria2!
Another fast command-line download accelerator worth looking into is aria2.
It also supports BitTorrent and has a very, very low memory footprint (even less than rtorrent).
For people using SUSE Linux, I maintain aria2 RPMs here.
From all those posts above, I think I will still stick with wget.
In my opinion lftp is much better because it supports both HTTP and FTP and has tons of options and commands. lftp just works!
lftp is a lot faster for me. I love it!
I got the same results with both wget and axel, so I’m sticking with wget. Not a fan of the crazy status printouts done by axel.
Nice! FlashGet is a similar tool for Windows, but this is definitely a useful tool for Linux. Thanks for sharing.
I LOVE IT,I LOVE IT, I LOVE IT
Wilmer,
Thanks for the comment.
Sorry to say, but whenever I use an ftp://username:password:ftp.myserver.com/file URL, axel bumps me back with a Segmentation fault error.
That is why I am using proz. If you get time, please try to fix it…
Axel is extremely tiny and part of my rescue disk as well.
Hmm, are you sure axel can’t download from FTP sites that require a password? I wrote the program and I’d be very embarrassed if that functionality were really missing…
It should be possible to just give it a URL like ftp://username:password@hostname/path/etc.
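For example, a minimal invocation with URL-embedded credentials might look like this (host, user, and password are placeholders):
$ axel ftp://vivek:secret@ftp.example.com/pub/file.iso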
It’s interesting to see an article about a program I haven’t maintained for four years already. I don’t use it myself anymore, but it’s nice that other people are still happy with it. 🙂
You could at least put in a caveat about the increased resource usage on servers that using an accelerator causes. If someone is providing something to you for free, don’t be an asshat; just use wget/curl.
Nalley, I am aware of both transparent proxies and WCCP. This entire article is not about your home DSL or ADSL connection. My box is hooked to a 10Mbps uplink, and the other few boxes I manage are hooked to 1000Mbps.
This is a colocated box and my dedicated hosting service provider does not use caching. In fact, they sell content caching as a separate product (which is quite expensive). And yes, I did the test, as I said earlier, in the following order:
axel
wget
And the results are almost the same, only 1 second apart.
A transparent proxy is just that: transparent. You wouldn’t know whether you were using one or not. That is why ajs’s recommendation of repeating the test with a different file, using axel first, makes sense. Your ISP (or your ISP’s ISP) might be using a transparent proxy. Cisco routers use WCCP (Web Cache Communication Protocol) to implement transparent caching (there are lots of other ways to do it as well).
If transparent caching were in use, wget would have populated the cache and axel would have retrieved the file from the cache.
Also, I agree that connecting to multiple mirrors and pulling a “piece” of the file from each could speed things up, but connecting multiple times to the same server to avoid “bandwidth limits per connection” is silly. A smart admin would limit bandwidth per IP, not per connection.
You’re missing the fact that your ISP probably has a proxy server which then has a copy of the file in its cache, so you haven’t proven that anything gets sped up.
I’ve always used lftp for this purpose, and in my two informal tests on two different boxes it seems to be superior to axel:
Downloaded 39.4 megabytes in 15 seconds. (2673.72 KB/s)
lftp :~> pget http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
41292428 bytes transferred in 13 seconds (3.00M/s)
Downloaded 39.4 megabytes in 16 seconds. (2376.57 KB/s)
lftp :~> pget http://kernel.org/pub/linux/kernel/v2.6/linux-2.6.17.11.tar.bz2
41295932 bytes transferred in 13 seconds (3.06M/s)
lftp :~>
Ajs,
Nope, I am directly connected and not using a transparent proxy appliance or server. If I run wget, axel, wget, axel, I get the same results.
I appreciate your post.
Can you repeat the test with a different download file, and do the axel download first and the wget second? It seems likely that the file was cached in a transparent web proxy, which could account for the speed difference.
Bytes per sec….ffs…who thought that one up??
Another good download accelerator for Linux is not a standalone program but a Firefox extension: DownThemAll. It works great, and the acceleration rate is impressive.
DownThemAll is a very good download extension; I have a slow internet connection and it’s a real helper when downloading things. Also, FlashGot helps grab video files while browsing with Firefox.