

Debian 4.0 (Etch) has been released, and it is recommended that you upgrade your systems to the latest version. Upgrading a remote Debian server is a piece of cake :D

Currently, many of our boxes are powered by Debian 3.1 (Sarge). For example, a typical web server may have only the following packages:
=> Apache
=> PHP
=> Postfix and other mail server software
=> Iptables and backup scripts
=> MySQL 5.x etc

Procedure

The following are the essential steps to upgrade your system:
1. Verify current system
2. Update package list
3. Update distribution
4. Update /etc/apt/sources.list file
5. Reboot system
6. Test everything is working

Backup your system

Before upgrading your Debian systems, make sure you have backups (I'm assuming that you make backup copies of all important data/files every day):

  1. User data / files / emails (/home, /var/www, etc.)
  2. Important system files and configuration files stored in /etc
  3. MySQL and other database backups
  4. Installed package list [get a list of installed software so it can be reinstalled/restored later]
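The last item on that list can be captured with dpkg (a quick sketch; the output file name is just an example, keep it with the rest of your backups):

```shell
# Save the list of installed packages so software can be reinstalled
# after a restore. The file name is an example.
dpkg --get-selections > package-selections.txt
# Later, on a freshly installed box, feed the list back with:
#   dpkg --set-selections < package-selections.txt
#   apt-get dselect-upgrade
```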

Step # 1: Verify current system

The file /etc/debian_version stores the current Debian version number:
$ cat /etc/debian_version
Output:

3.1

Find out the kernel version:
$ uname -mrs
Output:

Linux 2.6.8-3-386 i686

Step #2: Update package list

Use the apt-get command:
# apt-get update

Step #3 : Update distribution

Pass the dist-upgrade option to the apt-get command. This will upgrade Sarge to Etch. In addition to performing the function of upgrade, dist-upgrade intelligently handles changing dependencies with new versions of packages; apt-get has a "smart" conflict resolution system, and it will attempt to upgrade the most important packages at the expense of less important ones if necessary.
# apt-get dist-upgrade

This upgrade procedure takes time. Depending upon the installed software and other factors such as network speed, you may need to wait anywhere from 10 minutes to an hour or more.

Step #4 : Update /etc/apt/sources.list file

There seems to be a small bug in the upgrade procedure: you need to manually update the Debian security source line. You will see an error as follows:
W: Conflicting distribution: http://security.debian.org stable/updates Release (expected stable but got sarge)
W: You may want to run apt-get update to correct these problems

Just open the /etc/apt/sources.list file:
# vi /etc/apt/sources.list
Find the line that reads as follows:
deb http://security.debian.org/ stable/updates main contrib
Replace it with:
deb http://security.debian.org/ etch/updates main contrib non-free

Save and close the file. Now type the following command:
# apt-get update
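The same edit can also be scripted with sed. The sketch below works on a throwaway file to show the transformation; on a real box you would run the sed line against /etc/apt/sources.list (after backing it up):

```shell
# A sample security line as it appears after a Sarge install:
echo 'deb http://security.debian.org/ stable/updates main contrib' > sources.list.work
# Switch the line from "stable" to "etch" and add the non-free component:
sed -i 's|stable/updates main contrib|etch/updates main contrib non-free|' sources.list.work
cat sources.list.work
```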

Step #5: Reboot system

You are done. Just reboot the system:
# reboot

Step #6: Make sure everything is working...

Check the Debian distro version:
$ cat /etc/debian_version
Output:

4.0

Make sure all services are running and go through all the log files once:
# netstat -tulpn
# tail -f /var/log/log-file-name
# less /var/log/dmesg
# top

Use the apt-key command to manage the list of keys used by apt to authenticate packages. Packages which have been authenticated using these keys will be considered trusted. Make sure you see the Etch-related keys:
# apt-key list

/etc/apt/trusted.gpg
--------------------
pub   1024D/2D230C5F 2006-01-03 [expired: 2007-02-07]
uid                  Debian Archive Automatic Signing Key (2006) <ftpmaster@debian.org>
pub   1024D/6070D3A1 2006-11-20 [expires: 2009-07-01]
uid                  Debian Archive Automatic Signing Key (4.0/etch) <ftpmaster@debian.org>
pub   1024D/ADB11277 2006-09-17
uid                  Etch Stable Release Key <debian-release@lists.debian.org>

If there is a problem, use the following commands to update the local keyring with the Debian archive keys and remove the archive keys which are no longer valid:
# apt-key update
# apt-key list

Finally, check whether any new updates/security updates are available:
# apt-get update
# apt-get upgrade

Further readings

How do you install and use rsync to synchronize files and directories from one location (or one server) to another? A common question asked by new sys admins.

Almost two years back I wrote about recovering a deleted text file with the grep command under UNIX or Linux.

Michael Stutz shows us how to recover deleted files using the lsof command.

From the article:
There you are, happily playing around with an audio file you've spent all afternoon tweaking, and you're thinking, "Wow, doesn't it sound great? Lemme just move it over here." At that point your subconscious chimes in, "Um, you meant mv, not rm, right?" Oops. I feel your pain -- this happens to everyone. But there's a straightforward method to recover your lost file, and since it works on every standard Linux system, everyone ought to know how to do it.

Briefly, a file as it appears somewhere on a Linux filesystem is actually just a link to an inode, which contains all of the file's properties, such as permissions and ownership, as well as the addresses of the data blocks where the file's content is stored on disk. When you rm a file, you're removing the link that points to its inode, but not the inode itself; other processes (such as your audio player) might still have it open. It's only after they're through and all links are removed that an inode and the data blocks it pointed to are made available for writing.

This delay is your key to a quick and happy recovery: if a process still has the file open, the data's there somewhere, even though according to the directory listing the file already appears to be gone.

Read more at Linux.com
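The idea can be reproduced in a few lines of shell (a sketch; on a real system you would first run lsof to find which process and file descriptor still hold the deleted file open):

```shell
# Create a file, hold it open, delete it, then pull the data back
# out of /proc/<pid>/fd while the inode is still alive.
echo "important data" > demo.txt
exec 3< demo.txt        # keep the file open on descriptor 3
rm demo.txt             # the directory entry is gone, the inode is not
# lsof would now list demo.txt as "(deleted)"
cp /proc/$$/fd/3 recovered.txt
exec 3<&-               # release the descriptor
cat recovered.txt       # prints: important data
```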

However, recovering files under Linux is still hard work for new admins. I highly recommend backing up files regularly and storing backups offsite.

BBC News has published its top 10 data disasters. Hard drives kept in dirty socks and the dangers of oiling your PC feature in the top 10 list of data disasters.

Ontrack Data Recovery has unveiled its annual Top Ten list of remarkable data loss disasters in 2006. Taken from a global poll of Ontrack data recovery experts, this year's list of data disasters is even more incredible when you consider that in every case cited, Ontrack successfully recovered the data.

It is recommended that you always follow these backup tips.

Use amanda to backup your Linux server – Howto

Amanda is one of the best open source backup programs for Linux.

The Advanced Maryland Automatic Network Disk Archiver (AMANDA), is a backup system that allows the administrator to set up a single master backup server to back up multiple hosts over network to tape drives/changers or disks or optical media.

Novell Cool Solutions has published a small how to about Amanda. From the article:
Amanda is simple to use but tricky to configure. It is worth sticking with Amanda until you get a fully working system. Don't be put off by the lack of a slick GUI interface or the need to configure the software via a console shell, once it is configured Amanda does its job very efficiently.

The configuration files for Amanda are stored in the /etc/amanda directory. Each configuration that you create goes into a separate directory within /etc/amanda. There is a sample configuration in the "example" directory that you can use as a jumping-off point for configuring Amanda.

Contents

* Background
* Installing Amanda
* Configuring Amanda
* Creating file and directories
* Preparing tapes for use with Amanda
* Checking the configuration
* Performing the backup
* Checking the backup
* Automating the backup
* Conclusions

Read more, Using Amanda to Backup Your Linux Server at Novell.com...

How Linux Live CD could save your life

Yet another reason to carry a Linux Live CD :)

From the article:
Everyone's worst nightmare; the normal comforting hum of your computer is disturbed by clicking, pranging, banging...

It happens to everyone because it's inevitable (hard drives are mechanical, as sure as a car will break down your hard drive will fail eventually). However, no matter how often you see it you never quite get used to it happening, the heartache of all the files you lose forever because you were "just about to back it up, honestly". This is not a matter of explaining to you how you can best avoid data loss or how to protect against your hard drive dying, this is an article outlining how Ubuntu (or any other LiveCD available distro) could save your life...

Perform backups for the Linux operating system

This question is asked again and again by new Linux sys admins:

How do I perform backups for my Linux operating system?

So I am putting up all the information you ever need to know about backups. The main aim is to provide you with the necessary software, links and commands to get started as soon as possible.

Backup is essential

First, a backup is essential. You need a good backup strategy to:

  • Minimize recovery time after a disaster such as server failure, human error (a deleted file) or an act of God
  • Avoid downtime
  • Save money and time
  • And ultimately save your job ;)

A backup must provide

  • Restoration of single/individual files
  • Restoration of file systems

What to backup?

  • User files and dynamic data [databases] (stored in /home or specially configured partitions or /var etc).
  • Application software (stored in /usr)
  • OS files
  • Application configuration files (stored in /etc, /usr/local/etc or /home/user/.dotfiles)

Different types of backups

  • Full backups: Each file and directory is written to backup media
  • Incremental backups (full + incremental): These backups are used in conjunction with full backups. Each original piece of backed-up information is stored only once, and successive backups contain only the information that changed since the previous one. The file's modification time is used to determine which files need to be backed up.

So when you restore from incremental backups:

  1. First restore the last full backup
  2. Then restore every subsequent incremental backup in order
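With GNU tar, the full + incremental cycle can be sketched like this (throwaway directories; the .snar snapshot file is how tar remembers what has already been backed up):

```shell
# Full backup: the snapshot file starts empty, so everything is included.
mkdir -p data && echo "v1" > data/a.txt
tar czf full.tar.gz --listed-incremental=snapshot.snar data
# Incremental backup: only files changed since the snapshot are included.
echo "v2" > data/b.txt
tar czf incr1.tar.gz --listed-incremental=snapshot.snar data
# Restore: extract the full archive first, then each incremental in order.
mkdir -p restore
tar xzf full.tar.gz  -C restore --listed-incremental=/dev/null
tar xzf incr1.tar.gz -C restore --listed-incremental=/dev/null
ls restore/data        # a.txt  b.txt
```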

Preferred Backup Media

  1. Tape (old and trusted method)
  2. Network (ftp, nas, rsync etc)
  3. Disk (hard disk, optical disk etc)

Test backups

Please note that whichever backup media you choose, you need to test your backup. Perform tests to make sure that the data can be read back from the media.

Backup Recommendation

My years of experience show that if you follow the following formula, you are most likely to get your data back in the worst-case scenario:
(a) Rotate backup media
(b) Use multiple backup media for the same data, such as ftp and tape
(c) Keep old copies of backups offsite

In short, create a good disaster recovery plan.
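Formula (a) can be as simple as keeping a fixed number of dated archives; here is a rough sketch (the paths and the seven-copy limit are illustrative):

```shell
# Archive a directory with a timestamped name, then keep only the
# seven newest archives.
mkdir -p data backups && echo "payload" > data/file.txt
tar czf "backups/data-$(date +%Y%m%d%H%M%S).tar.gz" data
# Newest first; delete everything from the 8th entry onward.
ls -1t backups/data-*.tar.gz | tail -n +8 | xargs -r rm --
```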

General procedure to restore a Linux/UNIX box

There is no golden rule or procedure, but I follow these two methods:

Method # 1: Reinstall everything, restore everything, and secure everything

Use this method (bare metal recovery) if your server is cracked or hacked or the hard drive is totally out of order:

  1. Format everything
  2. Reinstall the OS
  3. Configure data partitions (if any)
  4. Install drivers
  5. Restore data from backup media
  6. Configure security

Method # 2: Use a recovery CD/DVD-ROM

Use this method if your box is not hacked but the system cannot boot, the MBR is damaged, a file was accidentally deleted, etc.:

  1. Boot into rescue mode.
  2. Debug (or troubleshoot) the problem
  3. Verify that the disk partitions are stable enough (use fsck) to receive the backup data
  4. Install drivers
  5. Restore data from backup media
  6. Configure security

Linux (and other UNIX OSes) backup tools

Luckily, Linux/UNIX provides a good set of tools for backup. We have covered almost each and every tool mentioned below; just follow the links to get more information about each command and its usage:

It is also recommended that you use RAID or LVM (see consistent backup with LVM) or a combination of both to increase the reliability of your data.

A note about MySQL or Oracle database backup

Backing up a database server such as MySQL or Oracle needs more planning. Generally, you can apply a table write lock and use the database dump utility (mysqldump for MySQL) to back up the database. You can also use an LVM volume snapshot to save the database data.

A note about large scale backup

As I said earlier, tar is good if you need to back up a small amount of data that does not demand high CPU or I/O. The following are recommended tools for backups that demand a high CPU or I/O rate:

(a) amanda - AMANDA, the Advanced Maryland Automatic Network Disk Archiver, is a backup system (open source software) that allows the administrator to set up a single master backup server to back up multiple hosts over network to tape drives/changers or disks or optical media.

(b) Third-party commercial proprietary solutions:
Top three excellent commercial solutions:

If you are looking to protect large-scale computer systems, use the above solutions; the following two books will give you a good idea:

Recommended further readings

I hope this small howto provides enough information for anyone to kick-start their backup operations. If I am missing something, or if you have a better backup solution or strategy, please comment back.