Linux: Local / Remote Backup For Large Files

June 30, 2008 · 2 comments · LAST UPDATED June 30, 2008

You can easily back up large files using a combination of standard Linux tools - tar, split, and md5sum. From the article:

I have a directory containing some files for a virtual computer. The files that hold the data for the virtual computer's hard discs are not only big, they are also "sparse" files, which means that they use only enough disc space for the data that was actually written to the file. For example, the virtual computer may have a 30GB drive of which 2GB has been used. Even though it uses only 2GB on my hard disc, a program that reads the file may see it as a 30GB file. This type of file can be tricky to back up because, when you copy it, you can end up with a 30GB file, or it might simply fail to copy, depending on the type of file system used on the backup storage.
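
The linked article below covers the details; as a minimal sketch, the three tools mentioned above might be combined like this (GNU tar and coreutils are assumed, and the names ~/vm, /backup, and the 1GB piece size are only examples):

# Archive the VM directory, preserving sparse files, and compress it
tar --sparse -czf /backup/vm.tar.gz -C "$HOME" vm

# Split the archive into 1GB pieces so it fits on FAT32 or optical media
split -b 1G /backup/vm.tar.gz /backup/vm.tar.gz.part.

# Record checksums of the pieces for later verification
cd /backup && md5sum vm.tar.gz.part.* > vm.md5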

=> Backing up Large Files
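
Before a restore, the recorded checksums can be verified and the pieces reassembled with a simple pipe; again only a rough sketch using the hypothetical names from above:

# Verify each piece against the recorded checksums
cd /backup && md5sum -c vm.md5

# Reassemble the pieces and extract the archive into an existing target directory
# (GNU tar restores the holes in files it archived with --sparse)
cat /backup/vm.tar.gz.part.* | tar -xzf - -C /path/to/restore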

2 comments

1 shakaran June 30, 2008 at 12:08 pm

The entire post cannot be seen in the RSS feed. It would be useful if it could be.

Greetings

2 Ramesh | The Geek Stuff July 1, 2008 at 7:09 am

Vivek,

Thanks for the link to the very nice article.

I strongly believe in taking an md5 checksum of all my backups (database, configuration, etc.), and I definitely do a checksum before the restore.

I have seen one situation where a DB restore itself went through successfully, but there was corruption in some of the blocks that affected the functionality of the DB.

Doing an md5 checksum after the backup and before the restore would have pointed out the issue immediately.

Ramesh
The Geek Stuff
