Linux: Local / Remote Backup For Large Files


You can easily back up large files with a combination of standard Linux tools: tar, split, and md5sum. From the article:


I have a directory containing some files for a virtual computer. The files that hold the data for the virtual computer’s hard discs are not only big, they are also “sparse” files, which means that they use only enough disc space for the data that was actually written to the file. For example, the virtual computer may have a 30GB drive of which 2GB has been used. Even though it uses only 2GB on my hard disc, a program that reads the file may see it as a 30GB file. This type of file can be tricky to back up because, when you copy it, you can end up with a 30GB file, or it might simply fail to copy, depending on the type of file system used on the backup storage.

=> Backing up Large Files
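The tar + split + md5sum workflow the linked article describes can be sketched as follows. The directory layout, piece size, and file names here are illustrative assumptions, not the article's exact commands; GNU tar's --sparse option is what keeps a sparse disc image from ballooning in the archive:

```shell
#!/bin/sh
set -e
cd "$(mktemp -d)"

# Stand-in for the VM directory: a sparse file that claims 30 MB
# but occupies almost no disc space until data is written to it.
mkdir vm
truncate -s 30M vm/disk.img

# 1. Archive the directory; GNU tar's --sparse option stores only
#    the blocks that actually contain data.
tar --sparse -czf vm-backup.tar.gz vm

# 2. Split the archive into fixed-size pieces (tiny here for the
#    demo; a real VM backup might use something like -b 1G).
split -b 64K vm-backup.tar.gz vm-backup.tar.gz.part-

# 3. Checksum every piece so corruption is detectable before a restore.
md5sum vm-backup.tar.gz.part-* > vm-backup.md5

# To restore: verify the pieces, reassemble, and extract.
md5sum -c vm-backup.md5
cat vm-backup.tar.gz.part-* | tar --sparse -xzf -
```

Because the pieces are ordinary files of a fixed size, they also sidestep the copy problems mentioned above: any backup file system can store them, and md5sum -c catches a damaged piece before you attempt the restore.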


Posted by: Vivek Gite

The author is the creator of nixCraft and a seasoned sysadmin, DevOps engineer, and trainer in the Linux operating system and Unix shell scripting. Get the latest tutorials on SysAdmin, Linux/Unix and open source topics via RSS/XML feed or weekly email newsletter.

2 comments

  1. Vivek,

    Thanks for the link to the very nice article.

    I strongly believe in taking an MD5 checksum of all my backups (database, configuration, etc.), and I always verify the checksum before a restore.

    In one database restore I saw, the restore itself completed successfully, yet some blocks were corrupted, which broke part of the database's functionality.

    Taking an MD5 checksum after the backup and verifying it before the restore would have exposed the issue immediately.

    The Geek Stuff

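The checksum-after-backup, verify-before-restore practice the comment describes can be sketched like this; the dump file name is an illustrative assumption:

```shell
#!/bin/sh
set -e
cd "$(mktemp -d)"

# Stand-in for a real database dump (illustrative file name).
echo "dummy dump data" > db-backup.dump

# Record a checksum immediately after the backup completes...
md5sum db-backup.dump > db-backup.dump.md5

# ...and verify it just before restoring: md5sum -c exits non-zero
# if the file no longer matches the recorded checksum, so a restore
# script guarded by "set -e" stops before touching the database.
md5sum -c db-backup.dump.md5
```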