A good backup plan is essential so that you can recover from:

  • Human errors
  • RAID or disk failure
  • File system corruption
  • Data center destruction, and more

In this post, I’m going to list amazingly awesome open source backup software for you.

What to look for when choosing backup software for an enterprise?

Make sure the following features are supported by the backup software you deploy:

  1. Open source software – You must use software for which the original source code is made freely available and may be redistributed and modified. This ensures that you can recover your data even if the vendor or project stops maintaining the software or refuses to provide patches.
  2. Cross-platform support – Make sure the backup software works well on all the desktop and server operating systems you have deployed.
  3. Data format – An open data format ensures that you can recover your data even if the vendor or project stops maintaining the software.
  4. Autochangers – Autochangers cover a variety of backup devices, including tape libraries, near-line storage, and autoloaders. Autochanger support lets you automate the task of loading, mounting, and labeling backup media such as tapes.
  5. Backup media – Make sure you can back up data to tape, disk, DVD, and cloud storage such as AWS.
  6. Encrypted datastream – Make sure all client-to-server traffic is encrypted to protect backups in transit over the LAN/WAN/Internet.
  7. Database support – Make sure the backup software can back up database servers such as MySQL or Oracle.
  8. Backups spanning multiple volumes – The backup software should be able to split each backup (dump file) into a series of parts, allowing different parts to exist on different volumes. This ensures that large backups (such as a 100TB file) can span multiple disk or tape volumes instead of having to fit on a single backup device.
  9. VSS (Volume Shadow Copy) – Microsoft’s Volume Shadow Copy Service (VSS) is used to create consistent snapshots of the data that is to be backed up. Make sure the backup software supports VSS for MS-Windows clients and servers.
  10. Deduplication – A data-reduction technique that eliminates duplicate copies of repeating data (for example, identical images).
  11. License and cost – Make sure you understand the open source license under which the backup software is made available to you.
  12. Commercial support – Open source software can come with community-based support (such as a mailing list or forum) or professional support (such as subscriptions provided at additional cost). Paid professional support is useful for training and consulting purposes.
  13. Reports and alerts – Finally, you must be able to see backup reports and current job status, and get alerts when something goes wrong while making backups.
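Two of these features, the encrypted datastream (#6) and backups spanning multiple volumes (#8), can be illustrated with nothing but standard Unix tools. This is only a minimal sketch, not a replacement for a real backup tool: the paths and passphrase are placeholders, and it assumes GNU tar, OpenSSL 1.1.1+ (for -pbkdf2), and coreutils are installed.

```shell
#!/bin/sh
# Sketch: stream a directory through encryption, then split the result
# into fixed-size parts so it can span multiple volumes.
set -e

SRC=/tmp/demo-src          # placeholder source directory
DEST=/tmp/demo-backup      # placeholder "volume" directory
rm -rf "$SRC" "$DEST" /tmp/demo-restore
mkdir -p "$SRC" "$DEST"
echo "important data" > "$SRC/file.txt"

# Back up: tar -> encrypt -> split into 1 MiB parts
# (use something like 100G per part for real tape/disk volumes).
tar -C "$SRC" -cz . \
  | openssl enc -aes-256-cbc -pbkdf2 -pass pass:changeme \
  | split -b 1M - "$DEST/backup.part."

# Restore: concatenate the parts, decrypt, and unpack.
mkdir -p /tmp/demo-restore
cat "$DEST"/backup.part.* \
  | openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:changeme \
  | tar -C /tmp/demo-restore -xz
```

A real backup tool adds catalogs, verification, scheduling, and media management on top of this basic pipeline, which is exactly why the checklist above matters.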

Bacula – Client/server backup tool for heterogeneous networks

I personally use this software to manage backup and recovery across a network of computers running Linux, OS X, and Windows. You can configure it via a CLI, GUI, or web interface.

Operating system : Cross-platform
Backup Levels : Full, differential, incremental, and consolidation.
Data format: Custom but fully open.
Autochangers: Yes
Backup media: Tape/Disk/DVD
Encryption datastream: Yes
Database support: MSSQL/PostgreSQL/Oracle
Backup span multiple volumes: Yes
VSS: Yes
License : Affero General Public License v3.0
Download url : bacula.org

Amanda – Another good client/server backup tool

AMANDA is an acronym for Advanced Maryland Automatic Network Disk Archiver. It allows the sysadmin to set up a single backup server that backs up other hosts over the network to tape drives, disks, or autochangers.

Operating system : Cross-platform
Backup Levels : Full, differential, incremental, and consolidation.
Data format: Open (can be recovered using tool such as tar).
Autochangers: Yes
Backup media: Tape/Disk/DVD
Encryption datastream: Yes
Database support: MSSQL/Oracle
Backup span multiple volumes: Yes
VSS: Yes
License : GPL, LGPL, Apache, Amanda License
Download url : amanda.org

Backupninja – Lightweight backup system

Backupninja is a simple and easy-to-use backup system. You simply drop config files into /etc/backup.d/ to back up multiple hosts.
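For example, a MySQL handler dropped into /etc/backup.d/ might look roughly like this. This is a hypothetical sketch: the file name, schedule, and paths are made up for illustration, and option names can differ between backupninja versions, so check your version's documentation.

```ini
## /etc/backup.d/20.mysql -- hypothetical backupninja handler file.
## Files in /etc/backup.d/ are processed in lexical order, so the
## numeric prefix controls when this handler runs relative to others.
when = everyday at 01:00

# Dump all databases as SQL and compress the dumps.
databases = all
backupdir = /var/backups/mysql
sqldump   = yes
compress  = yes
```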

Operating system : Linux/Unix
Backup Levels : Full and incremental (rsync+hard links)
Data format: Open
Autochangers: N/A
Backup media: Disk/DVD/CD/ISO images
Encryption datastream: Yes (ssh) and encrypted remote backups via duplicity
Database support: MySQL/PostgreSQL/OpenLDAP and subversion or trac repositories.
Backup span multiple volumes: ??
VSS: ??
License : GPL
Download url : riseup.net

BackupPC – High-performance client/server tool

BackupPC can be used to back up Linux and Windows systems to a master server’s disk. It comes with a clever pooling scheme that minimizes disk storage, disk I/O, and network I/O.

Operating system : Linux/Unix and Windows
Backup Levels : Full and incremental (rsync+hard links and pooling scheme)
Data format: Open
Autochangers: N/A
Backup media: Disk/RAID storage
Encryption datastream: Yes
Database support: Yes (via custom shell scripts)
Backup span multiple volumes: ??
VSS: ??
License : GPL
Download url : backuppc.sourceforge.net

UrBackup – Easy-to-set-up client/server system

UrBackup is an easy-to-set-up open source client/server backup system that achieves both data safety and fast restore times through a combination of image and file backups. Individual files can be restored through the web interface or Windows Explorer, while backups of entire drive volumes can be restored from a bootable CD or USB stick (bare-metal restore). The web interface makes setting up your own backup server really easy.

Operating system : Cross-platform. The server runs on Linux/FreeBSD/Unix/Windows and several Linux-based NAS operating systems; the client runs only on Linux and Windows.
Backup Levels : Full and incremental
Data format: Open
Autochangers: N/A
Backup media: Disk/Raid storage/DVD
Encryption datastream: Yes
Database support: ??
Backup span multiple volumes: ??
VSS: ??
License : GPL v3+
Download url : urbackup.org

Other awesome open source backup software for your consideration

The above-mentioned software, such as Amanda and Bacula, is feature rich but can be complicated to set up for a small network or a single server. For those cases, I recommend that you study and use the following backup software:

  1. Rsnapshot – I recommend this local and remote filesystem snapshot utility. See how to set it up and use it on Debian/Ubuntu Linux and CentOS/RHEL based systems.
  2. rdiff-backup – Another great remote incremental backup tool for Unix-like systems.
  3. Burp – Burp is a network backup and restore program. It uses librsync to save network traffic and to reduce the amount of space used by each backup. It also uses VSS (Volume Shadow Copy Service) to make snapshots when backing up Windows computers.
  4. Duplicity – Great encrypted, bandwidth-efficient backup for Unix-like systems. See how to install Duplicity for encrypted backup in the cloud for more information.
  5. SafeKeep – SafeKeep is a centralized and easy-to-use backup application that combines the best features of a mirror and an incremental backup.
  6. DREBS – DREBS is a tool for taking periodic snapshots of EBS volumes. It is designed to run on the EC2 host to which the EBS volumes to be snapshotted are attached.
  7. Good old Unix programs like rsync, tar, cpio, mt, and dump.

I hope you find this post useful for backing up your important data. Do not forget to verify your backups and to keep multiple backup copies of your data. Also remember that RAID is not a backup solution. Use any of the above-mentioned programs to back up your servers, desktops/laptops, and personal mobile devices. If you know of any other open source backup software I didn’t mention, share it in the comments below.

🐧 44 comments so far... add one

  • Nilesh Nov 6, 2014 @ 10:10

    rsnapshot is the best, very easy to configure.
    Just deploy rsyncd (with restricted firewall access, of course) on each source you want to backup (in case of network) and let it rsync to another machine.

    From that machine you can run tar and upload elsewhere.

  • Rajgopal Nov 6, 2014 @ 11:05

    I like rsnapshot for Linux to Linux backup. I have automated backup process in some of the critical servers. Beauty is, it is fast and easy to restore.

  • Johan Bjäreholt Nov 6, 2014 @ 11:37

    Don’t forget back in time!

  • Andrea Nov 6, 2014 @ 12:20

    What about bareos?

    • Erathiel Dec 1, 2014 @ 8:40

      Bareos, as it stands, is effectively a Bacula clone right now, with next to no new features included. Plus, given the controversy on the (alleged) copyright infringement, the project’s future is not as certain as one might wish. I was investigating the case some 6 months back, searching for a backup solution and Bacula seemed to be a more viable choice than Bareos, at least to me. Using version 5.2.6 right now (being the latest available in Debian repositories) and it works like charm (though was a pain in the butt to set up, mind you) :) If you’re considering those two, I would personally say: stick with Bacula.

      • Erathiel Dec 1, 2014 @ 14:21

        Anyway, I’m pretty sure this article was on nixCraft at the time of my research and was just re-posted recently. Most probably Bareos was not yet around at all at the time the original article was written.

  • Andrea Nov 6, 2014 @ 14:48

    did you take into consideration bareos?

  • Wd40 Nov 6, 2014 @ 18:02

    There is one very cool solution but unfortunetly not open source. It’s a RAID that has evolved. I mean logical volume mirroring on AIX. Data can be mirrored across datacenters far away from each other (up to ~250KM). This can be easily administered. I’d really love to see such thing in Linux. There is something alike already but i didn’t test it yet. This is not a substitute of backups but really helps.
    Another thing in AIX is mksysb which is a backup solution that makes a bootable image of the operating system ready for a full or partial restore. There are similar solutions on Linux like REAR and Mondo Backup. I guess their reporting features are not too rich but it can save a ton of time if used.

    • Adam Goryachev Nov 10, 2014 @ 23:58

      Try DRBD and/or DRBD proxy. I use DRBD successfully between 2 servers in the same data centre, haven’t tried with a remote one yet. DRBD proxy is commercial, but it may/may not be required depending on the latency and bandwidth of your remote data centre.
      I also use BackupPC and it works really well, strongly recommend it for disk based backup solutions, though it also support archive to tape/etc

  • Mike Hanby Nov 7, 2014 @ 2:09

    Don’t leave out Ghetto-Timemachine :-) https://github.com/flakrat/ghetto-timemachine

  • Patrik Uytterhoeven Nov 7, 2014 @ 17:15

    Another one that is missing in this list is SEP http://www.sep.de

  • cybernard Nov 10, 2014 @ 15:04

    A good backup system should have some parity data to protect against bad sectors on the backup media. Especially for optical media.

  • Dan Stromberg Nov 10, 2014 @ 17:12

    I did a rather similar comparison a while back. It’s still available at http://stromberg.dnsalias.org/~strombrg/backshift/documentation/comparison/index.html

  • bunkobugsy Nov 10, 2014 @ 20:45

    How about “rotating mirror” RAID1, would you call that a backup solution?

    • Erathiel Dec 1, 2014 @ 8:47

      What you mean is creating a RAID1 array with 2+ disks and then pulling one of them out every now and then and replacing it with a fresh drive to rebuild the array, but keep the old (ripped-out) drive as a frozen copy? Might work, but is possibly tricky. You’d have to test how your RAID controller behaves in the process. If it’s an always-on type of system, that would require a hot-swap type of controller, otherwise you would need to turn the server off at every disk change. Not to mention the need to have physical access to the server, which limits the method to on-premises machines. Plus, it doesn’t scale well. Actually, it doesn’t scale at all ;)

      • colin mcdermott Dec 23, 2014 @ 19:37

        IMHO, this is probably one of the worst kinds of backups. You will have to check your RAID consistency before HDD removal, if consistent remove the drive, insert new drive and thrash your controller while it gets written (who knows what production impact). Don’t give yourself the heartache!

        Remember HDD fail, you don’t want to be caught with 1 working disk at the wrong moment.

  • MisterT Nov 10, 2014 @ 22:22

    Great analysis. But what about storing backup files on Clouds ? Did anybody tried to use the impressive volume of disk space available for free to store safely (encrypted) the backup shrunks ?
    Is there any such feature in those tools ?

  • Idntfllow Nov 11, 2014 @ 0:40

    IMHO Attic beats them all (https://attic-backup.org/)
    and anyway a modern backup program must have deduplication, without that there is an immense waste of computation, throughput and storage

    • Le Balladeer Jul 18, 2015 @ 6:06

      What kind of de-duplication Attic has?

      • Idntfllow Jul 18, 2015 @ 11:29

        It has the best that I know, variable block level dedup, but what it counts is that on the road it truly shows to be very fast and efficient, plus in the end it also compresses the deduped blocks

        • Le Balladeer Jul 19, 2015 @ 5:04

          Wow. That’s awesome! I thought it only had file level de-dup. I have some more questions I’ll throw your way :-)

          Also is there any good resource for not so enlightened with CLI folks? I mean let’s say I am on OSX and I would like to back some folders from my /home/, not all and store the backup on a remote VPS where I have remote access where should I begin – i.e. from installing attic on my home machine to setting up the remote VPS server.

          Also can I have my own chosen encryption keys instead of system generated – even if I have to select long ones I am fine?

          De-dup happens before sending the incremental compressed changes to destination/server or after that?

          Can I (or does it automatically) periodically check whether my backups are in order – anything corrupted?

          Can I do like – “Okay, keep everything backed up – all the versions”, then, “Now, delete everything up till time T”?

          Are there backup sets? I.e. “Set 1: Very personal files” — backup every day at 6pm; “Set 2: Very critical files” — backup every hour; “Set 3: code” — backup every 15 minutes. I assume that’s was repositories are for, right?

  • Maren Nov 11, 2014 @ 11:12

    I work for Univention and I like to say that I totally agree with the recommendation of Bacula and also Bareos. We’ve got many satisfied customers working with one of these solutions. They also integrate seamlessly in our product, the Linux distro Univention Corporate Server (UCS) via the our app center. Both are very reliable, performing and professional open source backup solutions. For anyone who is interested in these and a wide variety of other great open source solutions, can visit our app center and test them at:

  • Sumit Singh Nov 26, 2014 @ 18:06

    Very Impressive Dude, Using Linux From a Long Decade but not sure about my Backup Files, But after reading your post i get the Solution. Thanks for the information. Love this Website.

  • csirac2 Dec 24, 2014 @ 1:14

    What bothers me with most of these solutions is that they can’t leverage native ZFS or BTRFS features to do their work. ZFS and BTRFS both have fantastically efficient copy-on-write snapshotting and filesystem send/receive functionality which makes syncing snapshots over a network or raw IO comparatively instantaneous.

    On the other hand there certainly is a use-case for dedicated backup servers: for one, a DIY cron job using btrfs/zfs features is going to be awkward to secure against a rogue machine stomping all over historical snapshots or simply filling up a backup server.

    • Tim Small May 18, 2015 @ 12:50

      snapper.io will manage native snapshots (or lvm snapshots) – then just use rsync (etc.) on top of it. ZFS support may be along at some time in the future.

      • csirac2 May 19, 2015 @ 23:39

        Appreciate your comment! Since writing that, I’ve pushed my “snazzer” (inspired by snapper) project to github: https://github.com/csirac2/snazzer – it has a few less dependencies, uses isodate snapshot naming, “nicer” pruning (according to me at least), does snapshot measurements (sha512sums of entire directories so I can detect any corruption shenanigans whether on btrfs, ext4 or whatever) and finally, it makes use of btrfs send/receive to efficiently shunt snapshots of all my hosts around the place.

  • Le Balladeer Dec 26, 2014 @ 7:02

    Anything that works great with Amzoon Glacier with features like

    1. Versioning
    2. De-duplication

    while sitting in the background and monitoring changes at the time interval specified?

  • Kayvlim Jan 21, 2015 @ 3:35

    There’s also storeBackup, it’s pretty good for what it does.

  • Mauricio Feb 6, 2015 @ 15:19

    Much info, bery bakup, so thanks. :)

  • Hunkah Jun 29, 2015 @ 20:05

    Why isn’t fwbackups given more street cred? I just tested it out and, holy cow, it’s awesome!

  • saranya Aug 22, 2016 @ 10:18

    what type of chunking is used in urbackup?and what methodology of urbackup?

  • Rampriya Aug 24, 2016 @ 9:39

    what type of compression algorithm used in AceBackup tool?

  • Dan Alec Nov 18, 2016 @ 13:57

    in addition, there is [duplicati](https://github.com/duplicati/duplicati) which is aimed for encrypted backup on cloud storage services

  • Richard Reijmers Nov 19, 2016 @ 8:02

    I would like to suggest Relax and Recover as an excellent Disaster Recovery tool, but also for granular backup and recovery.
