How to back up remote files in Linux / UNIX

Last updated August 24, 2007

Q. How do I make remote backups under Linux? I have a CentOS 5 Linux server located in a remote data center, and I'd like to back it up to my local system or another server.

A. Both Linux and UNIX come with handy tools to make secure remote backups. You can use a tool called rsync to automate remote backups of your Linux, UNIX, Windows, Mac OS X, and BSD systems. rsync offers many options and uses the rsync remote-update protocol to greatly speed up file transfers when the destination file already exists and only needs to be updated.
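
If you have never used rsync, a quick local test gives a feel for the options used below. The directory names here are only placeholders:
$ rsync -avz /path/to/source/ /path/to/destination

Where,

  • -a : archive mode (recursive; preserves permissions, timestamps and symbolic links)
  • -v : verbose output
  • -z : compress file data during transfer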

Task: Copy files / backup files from remote Linux server

Let us say you would like to back up files from a remote server called server.nixcraft.in, from the directory /home/vivek, to a local directory called /backup. Type the following command on the local system:
$ rsync -avz -e ssh vivek@server.nixcraft.in:/home/vivek/ /backup
You will need to supply the password for the vivek user.
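
To preview what would be transferred without actually copying anything, you can add rsync's --dry-run option to the same command:
$ rsync --dry-run -avz -e ssh vivek@server.nixcraft.in:/home/vivek/ /backup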

Task: Exclude files from backup

You can also skip a few files during backup. Let us say you don't want to back up C++ source code files (*.cpp); enter:
$ rsync --exclude '*.cpp' -avz -e ssh vivek@server.nixcraft.in:/home/vivek/ /backup
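
If you need to skip many patterns, rsync can also read them from a file via the --exclude-from option. For example, using a hypothetical exclude.txt with one pattern per line:
$ cat exclude.txt
*.cpp
*.o
*.log
$ rsync --exclude-from 'exclude.txt' -avz -e ssh vivek@server.nixcraft.in:/home/vivek/ /backup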

Task: Automatic backup using a shell script

SSH always prompts for a password. To automate the process via a shell script, you need to remove the password prompt by using SSH keys, i.e. generate passphrase-less keys. Enter the following (type at the local system):
$ ssh-keygen -t dsa
When asked to enter a passphrase, just press the [ENTER] key twice. Now copy the public key to the remote server:
$ scp ~/.ssh/id_dsa.pub vivek@server.nixcraft.in:.ssh/authorized_keys
Now you can log in without a password. For more information, see how to set up SSH with DSA public key authentication and RSA key authentication for password-less login.
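
If the ssh-copy-id utility is installed on your local system, you can use it instead of scp; it appends the key to the remote ~/.ssh/authorized_keys file rather than overwriting it:
$ ssh-copy-id -i ~/.ssh/id_dsa.pub vivek@server.nixcraft.in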

Now create a simple shell script as follows:
$ vi backup.sh
Append code:
#!/bin/bash
# Back up /home/vivek from the remote server to local /backup, skipping C++ source and log files
rsync --exclude '*.cpp' --exclude '*.log' -avz -e ssh vivek@server.nixcraft.in:/home/vivek/ /backup

Set executable permission using the chmod command:
$ chmod +x backup.sh
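You may want to run the script once by hand to confirm that the key-based login and rsync options work as expected:
$ ./backup.sh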
Use cron to schedule the backup of the remote server:
$ crontab -e
Make a backup every day:
@daily /path/to/backup.sh
Save and close the file.
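
If you prefer an explicit schedule instead of the @daily shortcut, a standard crontab entry works too. For example, the following runs the script at 2:30 AM every day (adjust the time and path to suit):
30 2 * * * /path/to/backup.sh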

Linux Changing DNS Search Order

Last updated October 11, 2007

Q. How do I change the DNS search order in Linux? I need to make changes in order to improve performance.

A. Under Linux you need to use the /etc/nsswitch.conf file, which is the Name Service Switch configuration file used by the system databases.

Various functions in the C library need to be configured to work correctly in the local environment. Traditionally, this was done by using files (e.g., /etc/passwd), but other name services (like the Network Information Service (NIS) and the Domain Name Service (DNS)) became popular and were hacked into the C library, usually with a fixed search order.

Step # 1: /etc/nsswitch.conf

Open /etc/nsswitch.conf file using text editor:
# vi /etc/nsswitch.conf

Look for hosts:
hosts: files dns mdns4

Set the above order as per your requirements. Save and close the file.
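
For example, if you want the resolver to query DNS before falling back to the local files database, you could change the line as follows (adjust to your own environment):
hosts: dns files mdns4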

host.conf – resolver configuration file

The file /etc/host.conf contains configuration information specific to the resolver library. It should contain one configuration keyword per line, followed by appropriate configuration information.

Open the /etc/host.conf file:
# vi /etc/host.conf
Find the order line, which specifies how host lookups are to be performed. It should be followed by one or more lookup methods, separated by commas. Valid methods are bind (DNS server), hosts (the /etc/hosts file), and nis (the old NIS method).
order hosts,bind

Save and close the file.
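
For example, to have the resolver try the DNS server before the /etc/hosts file, reverse the order:
order bind,hosts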

See the nsswitch.conf and host.conf man pages for details.

Samba mount and access large 2GB+ files from share or NAS device

Last updated March 6, 2007

Q. I'm using a NAS server share to make a backup of my web server. However, Samba is not allowing me to back up large files (2GB+).

My Samba share mount command is as follows:
smbmount //nas1.domain.com/sharename /datanas -o username=LOGINNAME,password=LOGINPASSWORD

Now if I copy a file called /var/log/httpd/access.log (which is a 3.5 GB file):
cp /var/log/httpd/access.log /datanas

I get an error – file size limit exceeded

How do I solve this problem and copy large files to the NAS Samba share?

A. The Linux kernel uses the smbfs.ko/smbfs.o module, which does not support file sizes greater than 2 GB. This is a well-known problem.

Mount your NAS Samba share with the lfs (large file system) option. The general syntax is as follows:
smbmount //Hostname/Sharename /local/mountpoint -o username=username,password=password,lfs

For example:
# smbmount //nas1.domain.com/sharename /datanas -o username=LOGINNAME,password=LOGINPASSWORD,lfs

Where,

  • //nas1.domain.com/sharename : Server and Share name
  • /datanas : Local mount point
  • -o username=LOGINNAME,password=LOGINPASSWORD,lfs : Specify login, password and lfs options

Now you should be able to copy large files without a problem.
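
If you want the share mounted with the lfs option automatically at boot time, one option is a matching /etc/fstab entry. This is only a sketch that assumes the same share, mount point and credentials as above:
//nas1.domain.com/sharename /datanas smbfs username=LOGINNAME,password=LOGINPASSWORD,lfs 0 0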