HowTo: Backup MySQL Databases and Web Server Files to an FTP Server Automatically

August 10, 2006 · Last updated January 22, 2010


This is a simple backup solution for people who run their own web server and MySQL database server on a dedicated or VPS server. Most dedicated hosting providers offer a backup service using NAS or FTP servers; they hook you up to their redundant, centralized storage array over a private VLAN. Since I manage a couple of boxes, here is my own automated solution. If you just want a shell script, go here (you only need to provide the appropriate input and it will generate an FTP backup script for you on the fly; you can also grab my PHP script generator code).

Making Incremental Backups With tar

You can make tape backups, but sometimes tape is not an option. GNU tar allows you to make incremental backups with the -g option. For example, to make an incremental backup of the /var/www/html, /home, and /etc directories, run:
# tar -g /var/log/tar-incremental.log -zcvf /backup/today.tar.gz /var/www/html /home /etc

Where,

  • -g: Create/list/extract a new GNU-format incremental backup and store the snapshot information in the /var/log/tar-incremental.log file.
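
Here is a minimal sketch of how a weekly cycle and a later restore could look (the archive names are just examples; GNU tar accepts /dev/null as the snapshot file during extraction so the real one is not touched):

# Sunday: start a fresh cycle - remove the old snapshot file and take a full backup
rm -f /var/log/tar-incremental.log
tar -g /var/log/tar-incremental.log -zcvf /backup/full-sun.tar.gz /var/www/html /home /etc
# Monday through Saturday: only files changed since the last run are archived
tar -g /var/log/tar-incremental.log -zcvf /backup/inc-mon.tar.gz /var/www/html /home /etc
# Restore: extract the full archive first, then each incremental archive in order
tar -g /dev/null -zxvf /backup/full-sun.tar.gz -C /
tar -g /dev/null -zxvf /backup/inc-mon.tar.gz -C /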

Making MySQL Databases Backup

mysqldump is a client program for dumping or backing up MySQL databases, tables, and data. For example, the following command displays the list of databases:
$ mysql -u root -h localhost -p -Bse 'show databases'

Output:

Enter password:
brutelog
cake
faqs
mysql
phpads
snews
test
tmp
van
wp

Next, you can back up each database with the mysqldump command:
$ mysqldump -u root -h localhost -pmypassword faqs | gzip -9 > faqs-db.sql.gz
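
To restore such a compressed dump later, you can simply feed it back to the mysql client. A minimal sketch, reusing the faqs database from the example above:

$ mysql -u root -h localhost -p -e 'CREATE DATABASE IF NOT EXISTS faqs'
$ gunzip < faqs-db.sql.gz | mysql -u root -h localhost -p faqs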

Creating A Simple Backup System For Your Installation

The main advantage of using FTP or NAS backups is protection against data loss. You can use various protocols to back up your data:

  1. FTP
  2. SSH
  3. RSYNC
  4. Other Commercial solutions

However, I am going to describe the FTP backup solution here. The idea is as follows:

  • Make a full backup every Sunday night, i.e. back up everything every Sunday.
  • On the other days, back up only those files that have been modified since the full backup (incremental backup).
  • This gives a seven-day backup cycle.

Our Sample Setup

   Your-server     ===>       ftp/nas server
IP:202.54.1.10   ===>       208.111.2.5

Let us assume that your ftp login details are as follows:

  • FTP server IP: 208.111.2.5
  • FTP Username: nixcraft
  • FTP Password: somepassword
  • FTP Directory: /home/nixcraft (or /)

You store all data as follows:
=> /home/nixcraft/full/mm-dd-yy/files - Full backup
=> /home/nixcraft/incremental/mm-dd-yy/files - Incremental backup
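
For example, the whole /backup staging directory could be pushed into the dated full-backup directory with a single ncftpput call. This is only a sketch using the sample credentials above (the -m option tells ncftpput to create the remote directory if it is missing):

ncftpput -m -u nixcraft -p somepassword 208.111.2.5 /home/nixcraft/full/$(date +"%m-%d-%y") /backup/*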

Automating Backup With tar

Now you know how to back up files and MySQL databases using the tar and mysqldump commands. It is time to write a shell script that will automate the entire procedure:

  1. First, our script will collect all data from both the MySQL database server and the file system into a temporary directory called /backup, using the tar command.
  2. Next, the script will log in to your FTP server and create the directory structure discussed above.
  3. The script will upload all files from /backup to the FTP server.
  4. The script will remove the temporary backup from the /backup directory.
  5. The script will send you an email notification if the FTP backup fails for any reason.

You must have the following commands installed (use the yum or apt-get package manager to install the FTP client called ncftp):

  • ncftp ftp client
  • mysqldump command
  • GNU tar command
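
If the ncftp client is missing, it can usually be installed from your distribution's repositories; a sketch (the package is assumed to be named ncftp, and on RHEL/CentOS it may live in the EPEL repository):

# yum install ncftp
# apt-get install ncftp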

Here is the sample script:

#!/bin/sh
# System + MySQL backup script
# Full backup day - Sun (rest of the day do incremental backup)
# Copyright (c) 2005-2006 nixCraft <http://www.cyberciti.biz/fb/>
# This script is licensed under GNU GPL version 2.0 or above
# Automatically generated by http://bash.cyberciti.biz/backup/wizard-ftp-script.php
# ---------------------------------------------------------------------
### System Setup ###
DIRS="/home /etc /var/www"
BACKUP=/tmp/backup.$$
NOW=$(date +"%d-%m-%Y")
INCFILE="/root/tar-inc-backup.dat"
DAY=$(date +"%a")
FULLBACKUP="Sun"
### MySQL Setup ###
MUSER="admin"
MPASS="mysqladminpassword"
MHOST="localhost"
MYSQL="$(which mysql)"
MYSQLDUMP="$(which mysqldump)"
GZIP="$(which gzip)"
### FTP server Setup ###
FTPD="/home/vivek/incremental"
FTPU="vivek"
FTPP="ftppassword"
FTPS="208.111.11.2"
NCFTP="$(which ncftpput)"
### Other stuff ###
EMAILID="admin@theos.in"
### Start Backup for file system ###
[ ! -d $BACKUP ] && mkdir -p $BACKUP || :
### See if we want to make a full backup ###
if [ "$DAY" == "$FULLBACKUP" ]; then
  FTPD="/home/vivek/full"
  FILE="fs-full-$NOW.tar.gz"
  tar -zcvf $BACKUP/$FILE $DIRS
else
  i=$(date +"%Hh%Mm%Ss")
  FILE="fs-i-$NOW-$i.tar.gz"
  tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
fi
### Start MySQL Backup ###
# Get all databases name
DBS="$($MYSQL -u $MUSER -h $MHOST -p$MPASS -Bse 'show databases')"
for db in $DBS
do
 FILE=$BACKUP/mysql-$db.$NOW-$(date +"%T").gz
 $MYSQLDUMP -u $MUSER -h $MHOST -p$MPASS $db | $GZIP -9 > $FILE
done
### Dump backup using FTP ###
#Start FTP backup using ncftp
ncftp -u"$FTPU" -p"$FTPP" $FTPS<<EOF
mkdir $FTPD
mkdir $FTPD/$NOW
cd $FTPD/$NOW
lcd $BACKUP
mput *
quit
EOF
### Find out if ftp backup failed or not ###
if [ "$?" == "0" ]; then
 rm -f $BACKUP/*
else
 T=/tmp/backup.fail
 echo "Date: $(date)">$T
 echo "Hostname: $(hostname)" >>$T
 echo "Backup failed" >>$T
 mail  -s "BACKUP FAILED" "$EMAILID" <$T
 rm -f $T
fi
 

How Do I Setup a Cron Job To Backup Data Automatically?

Just add a cron job as per your requirements. For example, the following entry runs the backup script daily at 00:13:
13 0 * * * /home/admin/bin/ftpbackup.sh >/dev/null 2>&1
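
If you have not added a cron job before, a typical sequence looks like this (the script path is only an example; use the path where you actually saved the script):

# make the backup script executable
chmod +x /home/admin/bin/ftpbackup.sh
# edit the root crontab and add the line shown above
crontab -e
# confirm the new entry
crontab -l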

Generate FTP backup script

Since I set up many Linux boxes, here is my own FTP backup script generator. You just need to provide the appropriate input and it will generate an FTP backup script for you on the fly.


Comments

1 Sean August 11, 2006 at 2:00 am

On the mysqldump side, there's a -A option to do all databases at once (and --opt to make things more efficient). As your databases get bigger, look at mysqlhotcopy; it is only good for MyISAM tables but it's a lot faster than mysqldumping.

Sean

Reply

2 nixCraft August 11, 2006 at 7:16 am

Sean,

I agree with you, --opt is a nice option that adds locking, extended inserts and other stuff. I will update the script with the --opt option.

-A is a good option, but I prefer to back up each database individually, as that offers the option of restoring individual databases.

mysqldump --help says the --opt option is enabled by default :) so there is no need to change the script.

Appreciate your post.

Reply

3 SaM August 11, 2006 at 9:47 am

Why is the size of the files created using mysqldump far larger than the database files themselves?

Instead, if someone simply copies all the database files to the backup directory, is there any harm? Just a question…

Reply

4 nixCraft August 11, 2006 at 11:38 am

SaM,

You need to compress the SQL files using gzip or another available utility.

mysqldump makes it easy to move data from one server to another.

Hope this helps

Reply

5 SaM August 14, 2006 at 10:52 am

Nix,

You may be right… but I am using a bit of an unusual way of backing up the data. What I have done is create a SQL file of the database structure for all the databases we are using, and I copy the data files on a regular basis.

When it comes to retrieving the data, it's a simple copy operation… of course you need to change the ownership back to mysql:mysql.

In case I need to trash a database and recreate it, I have the SQL file of the structures.

Reply

6 nixCraft August 14, 2006 at 12:04 pm

mysqldump also generates the database and table structure along with the data. Try the mysqldump command on any one of your databases and you will see the difference.

Reply

7 Ugo Bellavance August 15, 2006 at 1:29 am

This script does ftp and e-mail backups.

Reply

8 Ugo Bellavance August 15, 2006 at 1:30 am
9 jishin January 26, 2007 at 12:02 pm

Another good solution for MySQL database backup is “AutoMySQL Backup” . It does all the things you need for daily backup.
http://members.lycos.co.uk/wipe_out/automysqlbackup/

Reply

10 nixCraft January 27, 2007 at 7:38 pm

jishin,

Thanks for the automysqlbackup script link :)

Reply

11 Shawn January 29, 2007 at 4:28 am

Thanks for the script. I’ll give it a whirl.

Reply

12 dicky February 2, 2007 at 7:53 am

if i also want to enter into database in this command “$ mysql -u root -h localhost -p -Bse ‘show databases'” what can i do??

Reply

13 Charlie February 4, 2007 at 3:49 pm

What about MySQLDumper? (backup, restore, FTP, mail, multipart, …)

Well the bad thing is the vulnerabilities, like the common XSS thing:
http://secunia.com/product/12282/?task=advisories

Reply

14 shaun April 19, 2007 at 3:58 am

nixcraft, your script backs-up everything needed for my setup. It was instant success using the generator.

Thank-you

Shaun Prince

Reply

15 Chris May 21, 2007 at 11:33 am

Thanks for the script, it is brilliant and does exactly what I need it to do.

Just one slight, really easy thing I'm sure, but how can I set the FTP port rather than using the standard port 21? If I run ncftp manually, I can specify the -P option. Is it easy to put in the script?

Thank you for the script.

Reply

16 jim May 22, 2007 at 2:09 pm

So I backed up my MySQL database. How do I restore it? I have a mybackup.sql.gzip in my home. What do I do to restore it?

Can I restore it to another machine with the same MySQL Version?

Reply

17 nixCraft May 22, 2007 at 3:00 pm

Type the following commands to restore it:
gunzip mybackup.sql.gzip
mysql -u USER -p dbname < mybackup.sql

You can copy file mybackup.sql.gzip using scp to another machine:
scp mybackup.sql.gzip user@machinetwo:/tmp
Login to machinetwo:
cd /tmp
gunzip mybackup.sql.gzip
mysql -u USER -p dbname < mybackup.sql
rm mybackup.sql

Further readings
(a) How can I restore a backup of a MySQL database?
(b) Copy MySQL database from one server to another remote server

HTH

Reply

18 Atasa Rossios June 11, 2007 at 11:32 am

Hi,
this is very nice but i get the following errors:
[: 47: ==: unexpected operator
at the start of the script
and
[: 79: ==: unexpected operator
at the end

therefore I always get an email with BACKUP FAILED.

Line 47 is “fi”
and Line 79 is the last “fi” at the very end.

Thanx

Reply

19 nixCraft June 11, 2007 at 3:09 pm

Copy and paste script again from wizard

HTH

Reply

20 Atasa Rossios June 12, 2007 at 8:39 pm

Thanks for the reply, but I had more trouble with the long names from mysqldump.
Filezilla Server did not like (mysql-LongDBname.$NOW-$(date +"%T").gz) but was very happy when I took out the $(date +"%T") part, which is fine with me; I don't care that much about the exact time.

For the unexpected operator error:
I have the feeling that the == sign is the problem in "$var1" == "$var2".
That is why my system was complaining about a syntax error (Ubuntu Server 7.04).
From what I have seen on the net:
you can say "$var1" = "$var2" OR $var1 == $var2.
I haven't tested the second option myself.
But the first behaves very nicely.
Finally, I just took out the -g option from tar because I could not restore from a Windows machine when I tried it.
7zip could not understand the archive and could not extract the contents.

Lastly, I added an OpenLDAP database backup too, and I also send a mail on success. I simply want to know what is happening with the backup process.

Here it is now:

#!/bin/sh
### System Setup ###
DIRS="/etc /srv/www /var/www /home"
BACKUP="/tmp/backup.${$}"
NOW=$(date +%d-%m-%Y)
INCFILE="/root/tar-inc-backup.dat"
DAY=$(date +%a)
FULLBACKUP="Fri"

### MySQL Setup ###
MUSER="user"
MPASS="passwd"
MHOST="localhost"

### FTP server Setup ###
FTPD="/backup/Ubuntu-Server/incremental"
FTPU="user"
FTPP="passwd"
FTPS="192.168.2.50"
NCFTP=$(which ncftpput)

### Other stuff ###
EMAILID="user@domain.com"

### Start Backup for file system ###
[ ! -d $BACKUP ] && mkdir -p $BACKUP || :

### See if we want to make a full backup ###
if [ "$DAY" = "$FULLBACKUP" ];.
then
FTPD="/backup/Ubuntu-Server/full"
FILE="fs-full-${NOW}.tar.gz"
tar -zcvf $BACKUP/$FILE $DIRS
else
i=$(date +%Hh%Mm%Ss)
FILE="fs-i-${NOW}-${i}.tar.gz"
tar zcvf $BACKUP/$FILE $DIRS
fi
### Start MySQL Backup ###
# Get all databases name
DBS="$(mysql -u ${MUSER} -h ${MHOST} -p${MPASS} -Bse 'show databases')"
for db in $DBS
do
FILE=$BACKUP/mysql-$db.$NOW.gz
mysqldump -u $MUSER -h $MHOST -p$MPASS $db | gzip -9 > $FILE
done

##Backup the Ldap Directory Database
slapcat -v -n 1 -l $BACKUP/LdapDirectory.ldif

## Dump backup using FTP ###
#Start FTP backup using ncftp
ncftp -u$FTPU -p$FTPP $FTPS$T
echo "Hostname: $(hostname)" >>$T
echo "Backup succesfully Completed!" >>$T
mail -s "BACKUP COMPLETED" "$EMAILID" $T
echo "Hostname: $(hostname)" >>$T
echo "Backup failed" >>$T
mail -s "BACKUP FAILED" "$EMAILID"

Cheers A.

Reply

21 Atasa Rossios June 12, 2007 at 8:44 pm

Sorry Here is the complete script hope this time is posted clear.
#!/bin/sh
### System Setup ###
DIRS="/etc /srv/www /var/www /home"
BACKUP="/tmp/backup.${$}"
NOW=$(date +%d-%m-%Y)
INCFILE="/root/tar-inc-backup.dat"
DAY=$(date +%a)
FULLBACKUP="Fri"

### MySQL Setup ###
MUSER="user"
MPASS="passwd"
MHOST="localhost"

### FTP server Setup ###
FTPD="/backup/Ubuntu-Server/incremental"
FTPU="user"
FTPP="passwd"
FTPS="192.168.2.50"
NCFTP=$(which ncftpput)

### Other stuff ###
EMAILID="user@domain.com"

### Start Backup for file system ###
[ ! -d $BACKUP ] && mkdir -p $BACKUP || :

### See if we want to make a full backup ###
if [ "$DAY" = "$FULLBACKUP" ];.
then
FTPD="/backup/Ubuntu-Server/full"
FILE="fs-full-${NOW}.tar.gz"
tar -zcvf $BACKUP/$FILE $DIRS
else
i=$(date +%Hh%Mm%Ss)
FILE="fs-i-${NOW}-${i}.tar.gz"
tar zcvf $BACKUP/$FILE $DIRS
fi
### Start MySQL Backup ###
# Get all databases name
DBS="$(mysql -u ${MUSER} -h ${MHOST} -p${MPASS} -Bse 'show databases')"
for db in $DBS
do
FILE=$BACKUP/mysql-$db.$NOW.gz
mysqldump -u $MUSER -h $MHOST -p$MPASS $db | gzip -9 > $FILE
done

##Backup the Ldpa Directory Database
slapcat -v -n 1 -l $BACKUP/LdapDirectory.ldif

## Dump backup using FTP ###
#Start FTP backup using ncftp
ncftp -u$FTPU -p$FTPP $FTPS$T
echo "Hostname: $(hostname)" >>$T
echo "Backup succesfully Completed!" >>$T
mail -s "BACKUP COMPLETED" "$EMAILID" $T
echo "Hostname: $(hostname)" >>$T
echo "Backup failed" >>$T
mail -s "BACKUP FAILED" "$EMAILID"

Reply

22 Atasa Rossios June 12, 2007 at 8:45 pm

However you got the my point!

Reply

23 Atasa Rossios June 12, 2007 at 8:48 pm

maybe cut it in pieces
#!/bin/sh
### System Setup ###
DIRS="/etc /srv/www /var/www /home"
BACKUP="/tmp/backup.${$}"
NOW=$(date +%d-%m-%Y)
INCFILE="/root/tar-inc-backup.dat"
DAY=$(date +%a)
FULLBACKUP="Fri"

### MySQL Setup ###
MUSER="user"
MPASS="passwd"
MHOST="localhost"

### FTP server Setup ###
FTPD="/backup/Ubuntu-Server/incremental"
FTPU="user"
FTPP="passwd"
FTPS="192.168.2.50"
NCFTP=$(which ncftpput)

### Other stuff ###
EMAILID="user@domain.com"

### Start Backup for file system ###
[ ! -d $BACKUP ] && mkdir -p $BACKUP || :

### See if we want to make a full backup ###
if [ "$DAY" = "$FULLBACKUP" ];.
then
FTPD="/backup/Ubuntu-Server/full"
FILE="fs-full-${NOW}.tar.gz"
tar -zcvf $BACKUP/$FILE $DIRS
else
i=$(date +%Hh%Mm%Ss)
FILE="fs-i-${NOW}-${i}.tar.gz"
tar zcvf $BACKUP/$FILE $DIRS
fi

And the Second part:
### Start MySQL Backup ###
# Get all databases name
DBS="$(mysql -u ${MUSER} -h ${MHOST} -p${MPASS} -Bse 'show databases')"
for db in $DBS
do
FILE=$BACKUP/mysql-$db.$NOW.gz
mysqldump -u $MUSER -h $MHOST -p$MPASS $db | gzip -9 > $FILE
done

##Backup the Ldap Directory Database
slapcat -v -n 1 -l $BACKUP/LdapDirectory.ldif

## Dump backup using FTP ###
#Start FTP backup using ncftp
ncftp -u$FTPU -p$FTPP $FTPS

And the Third Part:
### Find out if ftp backup failed or not ###
if [ "$?"="0" ];.
then
rm -f $BACKUP/*
echo "Date: $(date)">$T
echo "Hostname: $(hostname)" >>$T
echo "Backup succesfully Completed!" >>$T
mail -s "BACKUP COMPLETED" "$EMAILID" $T
echo "Hostname: $(hostname)" >>$T
echo "Backup failed" >>$T
mail -s "BACKUP FAILED" "$EMAILID"
Hope now it works to see the hole script.

Reply

24 IAn November 2, 2007 at 11:17 am

About date localization

The command DAY=$(date +%a) is sensitive to the system language;
I prefer to use DAY=$(date +%u) to get the number of the day of the week instead.
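
A minimal sketch of the locale-independent test (with GNU date, %u prints 1 through 7, Monday being 1, so Sunday is 7):

DAY=$(date +"%u")     # 1 = Monday ... 7 = Sunday, regardless of locale
FULLBACKUP="7"        # run the full backup on Sunday
if [ "$DAY" = "$FULLBACKUP" ]; then
  echo "full backup day"
fi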

Reply

25 dk November 7, 2007 at 8:33 pm

I was looking for something like this…

just one question: to send the email of a failed backup what program do I have installed in my pc?

Thankyou

Reply

26 nixCraft November 7, 2007 at 9:37 pm

dk,

use normal unix mail command

Reply

27 WR November 9, 2007 at 5:16 pm

Thanks for the info.. just wondering, does it work with a regular ftp client? I’m on a shared hosting shell account and it doesn’t look like ncftp is installed.

Reply

28 Kjell November 15, 2007 at 8:43 am

Hi!

Thank you for this nice tutorial!!

I have a setup with 2 HDDs, having the second as active mysql database storage and another partition for this backup script.

I have modified it to just move the files to the second HDD instead of transfering via FTP and made it so that folders with $NOW are created before moving.

I am wondering how the $INCFILE works? If i want to restore the “fs” from a week before, do i have to use the initial (bigger) TAR which is in the first folder, or does it work with the smaller ones that are i.e. in the folder of today?

Reply

29 nixCraft November 15, 2007 at 12:40 pm

You need to restore the last full backup first (the $FULLBACKUP day), followed by each of the subsequent incremental backups up to the present day, in the correct order.
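
A minimal restore sketch in that order (the file names are only examples following the script's naming scheme; /dev/null is used as the snapshot file so extraction does not touch the real one):

# 1) restore the last full backup
tar -g /dev/null -zxvf fs-full-17-01-2010.tar.gz -C /
# 2) then apply every incremental archive since then, oldest first
tar -g /dev/null -zxvf fs-i-18-01-2010-01h00m01s.tar.gz -C /
tar -g /dev/null -zxvf fs-i-19-01-2010-01h00m01s.tar.gz -C /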

Reply

30 kvz November 30, 2007 at 9:44 am

Here’s a script that can transfer all mysql databases to another server.

Reply

31 Gustavo Hartmann December 4, 2007 at 4:24 pm

Hello,

When I run this script from the command line it works fine, but when I run it as a cron job the FTP connection seems to drop at some point and not all files are copied over. The FTP server says "disconnected", and so do the ncftp client logs.

It happens all the time and it is a bit weird. Any clues?

Thanks,G

Reply

32 Chip December 9, 2007 at 2:22 am

Hello,

thanks for the script.

I have a problem when running the script as a cron job: it will do the backup but it will not upload via FTP. Any idea why?

Best regards,
Nuno

Reply

33 robertvvv January 9, 2008 at 6:02 am

Why not try to use mysqlhotcopy to make a backup

Reply

34 Berto January 9, 2008 at 6:02 pm

This looks great, but in Ubuntu 7.10, ncftp isn’t working.

the command
ncftp -u"$FTPU" -p"$FTPP" $FTPS<

gives an error:
wwwbackup.sh: 53: Syntax error: newline unexpected

If I remove the < at the end, it will log in, but then it won't send the commands to ncftp (after ncftp quits, it will try running those commands in my own shell!)

Is there any other way to get ncftp to take commands from the shell script?

Reply

35 Berto January 9, 2008 at 6:14 pm

OK I think I solved my problem by using ncftpput*

I used this line instead of the ncftp lines you have:
ncftpput -u "$FTPU" -p "$FTPP" -m "$FTPS" $FTPD/$NOW $BACKUP/*

and commented out the ncftp command through EOF command.

* by the way, why do you get the path of ncftpput and never use it?

Reply

36 nixCraft January 9, 2008 at 6:19 pm

Berto,

It is a bug, I will try to fix it. Thanks for the heads up.

Reply

37 Berto January 9, 2008 at 6:23 pm

Even more information on my humiliations in Ubuntu:

They made dash the default shell, not bash!* So you need to specify #!/bin/bash (not #!/bin/sh) for this to work properly. dash also had problems with the equality (==) operator at the end.

This will probably fix the ncftp problems, but since I already have ncftpput working, I’ll keep it.

*From http://ubuntuforums.org/showthread.php?t=265391

Reply

38 dave January 12, 2008 at 11:45 pm

Firstly, Thank you so much for this script. I’ve learnt so much just from following it through, and it works perfectly on CentOS 4 after installing ncftp.

Is there a way you can tell it to exclude files? For example, I’ve set it to back up the whole of a certain folder, but there is one sub folder within that I do not want backing up… is this possible?

Thanks!

Reply

39 Ken Edwards February 24, 2008 at 7:38 am

Hi,

I have two questions. I would have multiple databases to back up in most cases, and multiple users assigned to multiple databases. How would that work?

Many thanks!

Ken

Reply

40 Randy Keyes March 4, 2008 at 7:38 pm

I am getting an error

line53: syntax error near unexpected token ‘do
line53: ‘do

Reply

41 Internet Marketing Legacy March 10, 2008 at 5:32 pm

Very helpful. I’m experimenting with setting up a couple servers of my own, and this script is exactly what I was looking for. Thanks for sharing.

Reply

42 Tim1981 April 2, 2008 at 8:33 am

You can also use:

/usr/bin/md5sum -b $BACKUP/* >$BACKUP/backup$NOW-$i.md5

Between the backup and the transfer, so you can verify that the files were not modified during the transfer.
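
One way this check could be done on the receiving end is to record the checksums with relative file names; a small sketch using the script's variables:

# generate checksums with relative names so they can be verified anywhere
( cd $BACKUP && md5sum -b *.gz > backup-$NOW.md5 )
# later, on the FTP/NAS server, inside the directory holding the files:
md5sum -c backup-$NOW.md5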

Reply

43 Adrian April 14, 2008 at 6:56 pm

This seems to work well, except when I run it it just does incremental backups. Of what, I’m not sure, as there has not been a full backup yet. Invariably it generates just a 48kb file. The database alone should be at least 800kb… Is there any way to force a full backup? How can I check that this script is working properly?

Thanks,
Adrian

Reply

44 nixCraft April 14, 2008 at 7:48 pm

The backup is compressed using gzip; just uncompress it and verify the data. By default the full backup is made on Sunday; change the FULLBACKUP="Tue" variable to use another day.

Reply

45 wulfman April 24, 2008 at 4:07 am

thank you for a wonderful script. works perfectly

Reply

46 Chris May 6, 2008 at 11:07 pm

In this:

if [ “$?” == “0” ]; then

you are checking the return of ncftp, but it always returns “0”

I even altered the password to force it to error and still it returned “0”.

Has anyone confirmed that this will show anything other than “0”?

I did this to check before the if statement

echo “Return Code = $?”

Reply

47 Sean May 8, 2008 at 6:12 pm

for some unknown reason if i run the script manually it works as it should. everything is zipped up and sent to the ftp server.

However if i set up a cron job all the files are zipped up and placed in the /tmp dir (so i know the script is running) but the files are never sent to the ftp server.

i’ve tried this with several ftp servers etc. etc. with no luck.

Reply

48 oh4real June 14, 2008 at 6:40 pm

Thanks heaps. Clever script. Worked fully once I gave mysql user LOCK privileges. Very helpful.

Might suggest putting php.txt link up at top of wizard page. I entered dummy info, not wanting to send IPs and logins to php script. I now have the script itself and could generate again.

Nice work and thanks for sharing.

Reply

49 Ali September 28, 2008 at 7:55 am

Hi
Well, ncftp is not available on my hosting; I am using this on shared hosting.

I changed the bash script to:

### Dump backup using FTP ###
#Start FTP backup using ncftp
#ncftp -u”$FTPU” -p”$FTPP” $FTPS<<EOF
# Login, run get files
ftp -inv $FTPS <<EOF
quote USER $FTPU
quote PASS $FTPP
mkdir $FTPD
mkdir $FTPD/$NOW
cd $FTPD/$NOW
lcd $BACKUP
mput *
quit
EOF

Reply

50 Noel October 3, 2008 at 4:44 pm

Great script, but am having some challenges getting it to work. First I edited the file on a windows machine but that produced a /bin/sh^M error, so I had to change the line endings to just \n instead of \r\n

Then I had to install ncftp

But now I am getting a “username and/or password was not accepted for login” error on the FTP. I know I am inputting the correct details, and have tried several.

When I used Ali’s version, using FTP instead of ncftp I noticed that it seemed to be sending the username as ‘username_’ instead of just ‘username’. Does the script change the details in some way? Any other ideas why I cannot log in with the script?

I would love to get this going!

Thanks

Reply

51 Noel October 3, 2008 at 4:46 pm

Hmm, don’t know what it did there, putting in that link, but I think you still get the gist of the problem

Reply

52 Teck October 13, 2008 at 10:40 am

I have been using it for a long time and it is still working perfectly!

Now I need to exclude some dirs; is there a way to exclude directories?

Tnx ;)

Reply

53 Asolar June 13, 2009 at 8:01 am

I get following error:
line 60: ncftp: command not found

how shall I fix it?

Reply

54 nixCraft June 13, 2009 at 8:26 am

Install ncftp package.

Reply

55 Energy Drink July 3, 2009 at 7:57 pm

I just use MySQL Administrator to run my daily backups. I can't think of an easier solution. I have over 30 clients on a variety of shared MySQL databases and VPS databases, and the task is scheduled to run at midnight every night. However, I am searching for an automated web server backup solution; thus far I haven't found anything. It would be great if I could find something like Norton Ghost that works via FTP, making incremental backups every night. My hosting provider charges $25 / month for FTP backup. It's cheaper to go out and buy a 2 TB hard drive and run your own backups. If anyone has a more user-friendly suggestion, let me know.

Reply

56 Mike August 7, 2009 at 3:16 am

Hi,

I’m having some problem with the script. I seem to be having problem with the ncftp usage.

The backup files are being created on the temp folder however it will not upload it on my FTP and I am receiving the “Failed backup” email.

I run the script and here’s what I’m getting:


/var/lib/mysql/eximstats/sends.MYI
/var/lib/mysql/eximstats/smtp.MYD
/var/lib/mysql/eximstats/smtp.MYI
mysqldump: Got error: 1033: Incorrect information in file: ‘./horde/horde_sessionhandler.frm’ when using LOCK TABLES
mysqldump: Got error: 1033: Incorrect information in file: ‘./roundcube/cache.frm’ when using LOCK TABLES
/root/ncftpd-2.8.6/glibc2.5/ncftpd: illegal option — i
Usage: ncftpd [flags]

Optional Flags (usually set in general config file):
-p XX : Use port XX for control connection (and XX – 1 for data).
-n XX[,YY] : Allow maximum of XX concurrent server users (max-users); keep
at least YY processes running to serve users (min-users).
-v : Increase logging verbosity.
-q : Do not echo log messages to the screen.
-Q : Force echo of log messages to the screen, even if not a tty
(Default is to echo automatically if it is a terminal).
-e : Print the detected hostname and exit.
-b : Print the version information and exit.
-d : Run as background daemon.
Exiting.

Any help will be greatly appreciated.

Thanks

Reply

57 myousuf August 13, 2009 at 8:13 am

i using server os my computer particion c & d & e my c driver only 20gb . d& E 120 gb my computer c drive full i compair e drive and c drive how to compair

Reply

58 myousuf August 13, 2009 at 8:14 am

how to setup the ie7 auto refreshing
friends share with me

Reply

59 myousuf August 13, 2009 at 8:17 am

how to mysql auto backup setting
i daily setting

Reply

60 Mike September 8, 2009 at 3:28 pm

FYI, since my remote FTP server is a Windows box, the file name format for the SQL backups had to be modified to exclude the colons in the time format. I changed FILE=$BACKUP/mysql-$db.$NOW-$(date +"%T").gz to FILE=$BACKUP/mysql-$db.$NOW-$(date +"%Hh%Mm%Ss").gz and now it works great.

Perhaps this will help someone out that has the same issue.

Thanks!

Reply

61 Jaw September 8, 2009 at 6:25 pm

Thanks to Ali for modifying this to work with plain ftp (Ali's ftp script above).
But I found that my files were coming out corrupt after the transfer.
I had to add "binary" before "mput *" to switch to binary mode before transferring the files. Hope this helps someone.

### Dump backup using FTP ###
#Start FTP backup using ncftp
#ncftp -u $FTPU -p $FTPP $FTPS<<EOF
# Login, run get files
ftp -inv $FTPS <<EOF
quote USER $FTPU
quote PASS $FTPP
mkdir $FTPD
mkdir $FTPD/$NOW
cd $FTPD/$NOW
lcd $BACKUP
binary
mput *
quit
EOF

Reply

62 giny8i8 October 11, 2009 at 9:25 am

Perfect script generator (and script) thanks a lot NixCraft, just what I was looking for. Saved me my half life! Respect! You rock!

some questions…

# ftp connection drop
What would happen if during the FTP transfer procedure my second server – which receives the transfer of the backed up files – goes down, or drops the FTP connection. Will the whole process be terminated, or will it try to reconnect and try to continue uploading the files? Will I get any notification about any of those? (Sorry if it is obvious, but I am a noob :)
(( My second server is a shared hosting at HostGator, with an unlimited storage plan. Just ideal fore storing the files, but the FTP connection is crappy and fluctuating, sometimes drops because of the heavy shared usage ))

# free space requirements
please correct me (orr approve) if I get it wrong:
the script copies the files directly into the tar-ed gzip-ed archive, so the required free space on the originating server is equal to the space requirement of the gzip-ed files. For instance:
If the size of all my website files (altogether) is 1 GB and i have some smaller mysql (e.g. joomla) databases lets say 100 MB altogether, than i would need approximately 2,2 GB free space to back them up successfully. Is this rule ( [occupied space] x 2 ) a good way to estimate the free space needed for the backup process?
((On my primary server i have very limited space, since it is a virtual server, and the storage enhancement is expensive, so i would like to buy as small storage space as possible ))

Reply

63 Bas Maree October 21, 2009 at 7:30 am

Script works perfect. Thanks!

I was only wondering if there is a possibility to exclude file extensions like .zip

Reply

64 James November 17, 2009 at 6:36 am

Thanks for the script – a good starting point for me. I ended up commenting out #NCFTP="$(which ncftpput)" under FTP server setup and then used the following, as the password kept failing for some reason:

### Dump backup using FTP ###
#Start FTP backup using ncftp
#ncftp -u"$FTPU" -p"$FTPP" $FTPS<<EOF
# Login, run get files
ftp -inv <<EOF
open $FTPS
user $FTPU $FTPP
mkdir $FTPD
mkdir $FTPD/$NOW
cd $FTPD/$NOW
lcd $BACKUP
binary
mput *
quit
EOF

Reply

65 mahipal solanki December 12, 2009 at 8:54 am

sir how to configure ftp server

Reply

66 Nicholas January 18, 2010 at 8:25 pm

Hi,

The backup script is great, but I have very limited space on the FTP backup. Is there a way to add to the script so that it removes old backups, keeping 7 incremental backups and 4 weekly backups?

Reply

67 Michael Peters January 22, 2010 at 5:17 pm

Ok, I am at a loss. First, thanks for the great script. I think it will do wonders for me once I have it up and running.

My fs-i tars transfer just fine. They complete, then it begins to transfer mysql.XX-XX-2010-HH:MM:SS.gz and finally I get an error "lost data connection to remote host: Broken pipe."

Then it tries to transfer the others and I get a "put *: could not send file to remote host." error.

Any tips?

Reply

68 Nnyan January 22, 2010 at 5:22 pm

This is fine, but doesn't mysqldump stop the database from being updated? What if we have a very busy site and cannot stop mysqld or allow the db to be blocked? What solutions are there for a hot backup?

Reply

69 nixCraft January 22, 2010 at 5:39 pm

Use LVM snapshots or specialized backup software such as zmanda open source or enterprise.

Reply

70 nixCraft January 22, 2010 at 5:44 pm

@Michael, a few ftp servers do not allow special characters in a file name. Update the following line to remove %T part and try again:

FILE=$BACKUP/mysql-$db.$NOW-$(date +"%T").gz

i.e. set time in hour_minute_seconds_am_OR_pm format

FILE=$BACKUP/mysql-$db.$NOW-$(date +"%l_%M_%S_%P").gz

Reply

71 Michael Peters January 22, 2010 at 6:06 pm

BRILLIANT!!! That did the trick! BTW, the server I was FTPing to was using FIleZilla server, if anyone else has this problem!

Reply

72 Wladimir January 27, 2010 at 5:39 pm

Hi,

I have a server with a lot of domains and a lot of space in use. I want to make something like this script, but I need a separate tar file per domain.

I have a vhost directory with various subdirectories (one per domain); is it possible, with tar, to make a different file per directory?

Sorry for my poor English :-(

Reply

73 savin April 7, 2010 at 8:19 am

i want to backup (otherwise ftp) the tmp directory of ubuntu server to windows machine…
(without network sharing.. it must be available in windows even ubuntu server is turned off)
help me on tis….also guide in automating the above issue..

Reply

74 Pieter April 8, 2010 at 10:26 pm

As Nicholas, I do only have limited space on the on-site backup. Is there any script that would remove backups older than a specified number of days / weeks?

Would be very grateful if you could help me out!

Reply

75 Data Do IT April 22, 2010 at 5:39 pm

What exactly is this condition checking for?

if [ “$?” == “0” ]; then

I’m finding that no matter what, it’s always true.

Reply

76 andrea April 23, 2010 at 5:30 pm

doesn’t work for me shit.
[: 47: ==: unexpected operator

put *: server said: mysql-wpsusat.23-04-2010-13:29:41.gz: The parameter is incorrect.
[: 79: ==: unexpected operator

Reply

77 Andrea April 23, 2010 at 7:25 pm

Ok, I just used = instead of == because of POSIX, and my "put error" was due to the server filesystem: my storage server unfortunately is WinNT based and doesn't support that file name format (xxx-xxx-12:23:34 Tue xxx).

I am working on a function to keep only the last month of backups.
Thanks for the script, it works like a charm.

Reply

78 Laurence July 22, 2010 at 8:19 pm

Can you help me please? I keep getting errors that the directory doesn’t exist, even though the script is trying to create it:

ProFTPD 1.3.1rc2 Server (ProFTPD Default Installation)
Logging in…
User xxx logged in
Logged in to xxxxxxx.
MKD /home/user/backup failed; [/home/user/backup: No such file or directory]
Could not mkdir /home/user/backup: server said: /home/user/backup: No such file or directory
MKD /home/user/backup/22-07-2010 failed; [/home/user/backup/22-07-2010: No such file or directory]
Could not mkdir /home/user/backup/22-07-2010: server said: /home/user/backup/22-07-2010: No such file or directory
Could not chdir to /home/user/backup/22-07-2010: server said: home: No such file or directory

Any ideas? It’s able to log in properly because it transfers the files to /home/user instead of /home/user/backup, so I don’t know why it’s not properly creating those directories.

Reply

79 jecro August 5, 2010 at 3:58 pm

Perfect tutorial and script! Works flawless on my CentOS 5.5 Server. Only had to install ncftp and gcc.

Reply

80 Michael August 7, 2010 at 8:26 pm

This is exactly what I am going to use to begin periodic backups of my web server. Thank you!!

Reply

81 Sam Critchley August 19, 2010 at 2:05 pm

Great technique and also great to have an auto-generated script of your own. When using it, I ran into the following error from mysqldump:

ERROR: Access denied for user ‘root’@’localhost’ to database ‘information_schema’ when using LOCK TABLES

This is because the 'information_schema' table in the latest versions of MySQL isn't really a table (I think). So you can modify the mysqldump command in the script to add the --skip-lock-tables option. See 'man mysqldump' for details.
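
For reference, the dump line in the script could then look something like this (just a sketch reusing the script's variable names):

$MYSQLDUMP -u $MUSER -h $MHOST -p$MPASS --skip-lock-tables $db | $GZIP -9 > $FILE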

Thanks, Sam

Reply

82 Prekshu Ajmera September 22, 2010 at 5:48 am

Thanks Vivek. Just got exactly what I was looking for.

Reply

83 ka73ka September 22, 2010 at 2:16 pm

Because if [ "$?" == "0" ]; then is always 0, I use:

ncftpput -m -u"$FTPU" -p"$FTPP" $FTPS $FTPD/$NOW/ $BACKUP/*

This returns 1 if it fails.

Reply

84 Edukatr October 1, 2010 at 5:53 am

How do I do this on amazon ec2 where I can only use sftp using ssh instead of ftp?

Reply

85 Vivek October 6, 2010 at 7:02 am

Thanks for this lovely script , I have a VPS and a reseller account & now I do regular 1 click backups from my VPS to reseller acct. with this script , Thanks again . by the way , I’m Vivek too :)

Reply

86 Hamed October 24, 2010 at 12:39 pm

Hi Vivek,
I need to use this script, but every time I put it in the crontab and check the log in /var/log/cron, it only shows me "Crond [4631] : (root) CMD (run-parts /etc/cron.hourly)". What does that mean, by the way, and how much time does it take to do the job?

regards
hamed

Reply

87 Mitchell December 30, 2010 at 6:07 pm

Hi. The page at https://bash.cyberciti.biz/backup/wizard-ftp-script.php (the SSL script generator) seems to be broken.

Reply

88 shahid parvez January 24, 2011 at 6:39 pm

i have a shared hosting. can anybody please help me that how can i set my cronjob for shared hosting.

Reply

89 mordaha March 18, 2011 at 3:39 pm

Your script makes monday-inc.tgz as an incremental backup against Saturday, not against the Sunday full backup.

When you do the FULL backup, you must first delete $INCFILE:

if [ "$DAY" == "$FULLBACKUP" ]; then
  FTPD="/home/vivek/full"
  FILE="fs-full-$NOW.tar.gz"
  rm $INCFILE
  tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
else
  i=$(date +"%Hh%Mm%Ss")
  FILE="fs-i-$NOW-$i.tar.gz"
  tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
fi

Reply

90 mordaha March 18, 2011 at 3:49 pm

And this is modification – using lftp mirroring (so you always have a copy of backup locally) and automaticaly deleting old backup files (you can have 1-2-4-… weeks of backup as you wish)

### System Setup ###
DIRS="/etc /var /home"
BACKUP=/backup/backup
NOW=$(date +"%Y%m%d")
INCFILE="/root/tar-inc-backup.dat"
DAY=$(date +"%a")
FULLBACKUP="Sun"
### FTP server Setup ###
FTPU="uu"
FTPP="ppp"
FTPS="ftps"
NCFTP="$(which ncftpput)"
### Other stuff ###
EMAILID="mail@gmail.com"
### Start Backup for file system ###
[ ! -d $BACKUP ] && mkdir -p $BACKUP || :
### See if we want to make a full backup ###
if [ "$DAY" == "$FULLBACKUP" ]; then
  FTPD="//full"
  FILE="full-$NOW.tar.gz"
  rm $INCFILE
  tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
else
  i=$(date +"%Hh%Mm%Ss")
  FILE="inc-$NOW-$i.tar.gz"
  tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
fi
### Dump backup using FTP ###
#Start FTP backup using ncftp
lftp <<EOF
open -u $FTPU,$FTPP $FTPS
mkdir //backup
mirror -c -e -R /backup/backup //backup
quit
EOF
### Find out if ftp backup failed or not ###
if [ "$?" == "0" ]; then
 find $BACKUP/*gz -ctime +8 -delete
else
 T=/tmp/backup.fail
 echo "Date: $(date)">$T
 echo "Hostname: $(hostname)" >>$T
 echo "Backup failed" >>$T
 mail  -s "BACKUP FAILED" "$EMAILID" <$T
 rm -f $T
fi

Reply

91 Josh March 30, 2011 at 11:37 pm

Is there a way to have it email you on a successful backup not just if it failed?

Thanks!!

Reply

92 Brad July 11, 2011 at 7:20 pm

I would also like to have it email me every day, successful or not. How can I make this happen?

Reply

93 Josh April 1, 2011 at 5:17 am

Also is there a way to split the files in say 50-100mb files before they go to FTP?

Thanks Again!

Reply

94 Josh April 2, 2011 at 3:19 am

Well, I have modified the script to do what I want. I used rar for compression and for splitting the files up as the archive is being made, so I don't end up with a 100 gig file and then have to call split to finish the job. Here it is; hope this helps someone.

### System Setup ###
DIRS="/home/data"
BACKUP=/olddrive/backup
NOW=$(date +"%Y%m%d")
INCFILE="/olddrive/backup/tar-inc-backup.dat"
DAY=$(date +"%a")
FULLBACKUP="FRI"

### FTP server Setup ###
FTPU="YOUR_USER"
FTPP="YOUR_PASS"
FTPS="YOUR_SERVER"
NCFTP="$(which ncftpput)"

### Other stuff ###
EMAILID="admin@barrettnetworks.com"

### Start Backup for file system ###
[ ! -d $BACKUP ] && mkdir -p $BACKUP || :

### See if we want to make a full backup ###
if [ "$DAY" == "$FULLBACKUP" ]; then
FTPD="//full"
FILE="full-$NOW.rar"
rm $INCFILE
rar a -m5 -v50m $INCFILE $BACKUP/$FILE $DIRS
else
i=$(date +"%Hh%Mm%Ss")
rar a -m5 -v50m FILE="inc-$NOW-$i.rar"
rar a -m5 -v50m $INCFILE $BACKUP/$FILE $DIRS
fi

### Dump backup using FTP ###
#Start FTP backup using ncftp
ncftp -u"$FTPU" -p"$FTPP" $FTPS<$T
echo "Hostname: $(hostname)" >>$T
echo "Backup failed" >>$T
mail -s "BACKUP FAILED" "$EMAILID" <$T
rm -f $T
fi

Reply

95 artware April 21, 2011 at 9:28 am

Hello,
… is there a way to put a rule in the script to delete backups older than, let's say, 3 months? (for limited FTP space)

Reply

96 Pat O'Brien May 16, 2011 at 1:01 am

I’m not an expert by any means, but I wanted a way to inform me of failures as well. The script above only notified on successes. So I removed the ncftp portion of the script, and replaced it with this:

ncftpput -E -u $FTPU -p $FTPP -m $FTPS $FTPD/$NOW $BACKUP/*

I had to use the -E option because I can’t use passive mode at this time. Remove -E if you can use passive mode.

I then modified the last portion of the script to include my text for when the backup failed. It looks like this

### Find out if ftp backup failed or not ###
if [ "$?" == "0" ]; then
 rm -rf $BACKUP
 T=/tmp/backup.good
 echo "Date: $(date)">$T
 echo "Hostname: $(hostname)" >>$T
 echo "" >>$T
 echo "This email is to inform you that the latest backup was a success" >>$T
 mail  -s "[host.example.com] - Backup Successful" "$EMAILID" <$T
else
 T=/tmp/backup.fail
 echo "Date: $(date)">$T
 echo "Hostname: $(hostname)" >>$T
 echo "" >>$T
 echo "This email is to inform you that the latest backup has failed" >>$T
 echo "" >>$T
 echo "Please investigate why the backup failed. The backup files are still available at $BACKUP" >>$T
 echo "" >>$T
 echo "Don't forget to delete this directory when done" >>$T
 mail  -s "[host.example.com] - Backup Failed" "$EMAILID" <$T
 rm -f $T
fi

Reply

97 Shane July 20, 2011 at 3:57 pm

If you want to delete old backups:

In my case, I only wanted to keep backups around for 1 week. So every day, when this runs, it removes the backup from 7 days ago. This is very easy to accomplish. Just add this code Before the ### Dump backup using FTP ### line:

REMOVAL_DATE=$(date -d "1 week ago" +"%d-%m-%Y")

Then add this right after you open NCFTP, like so:

ncftp -u"$FTPU" -p"$FTPP" $FTPS<<EOF
rmdir $FTPD/$REMOVAL_DATE

You can change "1 week ago" as needed. "1 day ago" or "1 month ago" or "3 months ago" etc etc etc are all fine.

Reply

98 Alex March 7, 2012 at 5:07 pm

Can you expand on this? I can’t find the resources. Linux will interpret “1 day ago” vs. “1 month ago”?

Is ncftp required? I installed VSFTP.

Reply

99 Andrea July 20, 2011 at 6:06 pm

Listen. After many (many) configurations and techniques (including the one above, which doesn't seem 100% portable to me), the best solution for me was to use duplicity along with automysqlbackup and the Amazon S3 service (totally cheap, but optional).
In this way I can back up the entire database and all files using a 100% incremental backup in a really easy and transparent way.

Just have a look here:

http://duplicity.nongnu.org/

I used rsync, rsnapshot and suff but I never managed to get something really stable for all my servers like the solution I’ve found now.
With duplicity I can automatically restore a little portion of my backup in no time and is really easy to use.

Reply

100 Sanya July 25, 2011 at 12:21 pm

Hello, your script is just what I need. Thanks! But I have a question: I have several databases on my host; how can I pick a single database that I need to back up?

Reply

101 MKZA August 17, 2011 at 10:25 am

This script is the BEES KNEES !!! I have this running on 3 servers like an absolute charm and now I can sleep at night knowing it’s all taken care of.

I tried Shane’s suggestion above for removing older backup folders but all I get is “Directory not Empty” back from ncftp. Will have to experiment a bit more with this.

What I found to be a nice addition is to include the following command in your ftpbackup.sh script just below the line ### Other Stuff ###

Add the following:

### Backup DPKG Software List ###
dpkg --get-selections > /etc/installed-software-dpkg.log

Then you also have your list of all your installed applications and you can use that to restore them too as per this article: http://www.cyberciti.biz/tips/linux-get-list-installed-software-reinstallation-restore.html#comment-173345

Reply

102 MKZA August 19, 2011 at 1:52 pm

I had a problem last night and I hope someone can help with a solution. This script went rogue on one of my servers and did not complete but instead got into some horrible loop.

This morning I started noticing errors on my web sites and eventually tracked that down to a lack of disk space. I then looked in the /tmp/ folder on the server and found about 200 backup.xxxx directories created by this script.

What could have caused this script to have gone haywire like that?? I deleted the cron job, recreated it and reset it to run at midnight and will see if it happens again but it does worry me somewhat. The same script works perfectly on my other 2 servers.

Is there some way to force this script to die and email me if it runs into an error of ANY sort ????

Any help or suggestions would be greatly appreciated.

Reply

103 Kabuwa Mbulo August 21, 2011 at 2:38 pm

I would like to back up a database running on Linux to a Windows server, where I would like to use Data Protection Manager to back up the database. How do I script this to dump my database to a Windows folder and then put it on tape? Help!

Reply

104 Andrew October 4, 2011 at 9:57 am

I am getting an issue when it goes to save the mysql dump file

NcFTP 3.2.2 (Aug 18, 2008) by Mike Gleason (http://www.NcFTP.com/contact/).
Connecting to 137.219.74.64…
FileZilla Server version 0.9.39 beta
written by Tim Kosse (Tim.Kosse@gmx.de)
Please visit http://sourceforge.net/projects/filezilla/
Logging in…
Logged on
Logged in to 137.219.74.64.
Usage: mkdir dir1 [dir2…]
CWD successful. “/04-10-2011″ is current directory.
fs-i-04-10-2011-14h39m01s.tar.gz: 160.00 B 112.98 kB/s
…mysql.04-10-2011-14:39:02.gz: ETA: 0:00 32.00/131.51 kB 133.55 MB/s Lost data connection to remote host: Broken pipe.
mysql-mysql.04-10-2011-14:39:02.gz: 128.00/131.51 kB 7.95 MB/s
put *: socket write error.
/tmp/backup.3661/fs-i-04-10-2011-14h39m01s.tar.gz /tmp/backup.3661/mysql-mysql.04-10-2011-14:39:02.gz

The first file is saved fine, the database dump is not.

Any ideas?

Reply

105 Andrew October 5, 2011 at 3:57 am

The answer is sitting right in front of me. The naming of the mysql files is different from the normal files: the time function used to create the name of the mysql files puts ":" in the time segment, which is not allowed as part of a file name on the FTP server. I'm trying to figure out how to fix this apart from just copying the way the time is formatted for the other files, but I presume that would change the way the incremental backups are done.

Reply

106 Ronsky January 17, 2012 at 7:41 am

I got an error “syntax error near unexpected token ‘do”. What could be the possible cause of this error message?

Your reply will be greatly appreciated :)

Reply

107 Seldon January 23, 2012 at 12:38 am

I keep getting the following error, I have copied the script three times now. I would also like to add that I had to add the following to the script to make the mySQL work. It also looks like it is not doing the file system backup. For the “DIRS” i have var/www

Error: “[: 79: 0: unexpected operator”

Reply

108 Seldon January 23, 2012 at 11:44 pm

Ok, It is doing the file system backup so that works. Also the “DIRS” is set to “/var/www” not “var/www” (That was a typo sorry). So it is doing the backup but still failing because of this error. Also why cant I uncompress the data? For example I downloaded the FS file to my mac and tried to extract it, but I got a permissions error.

Reply

109 George February 12, 2012 at 8:24 am

Hi,
Your scripts work great, but i need to upload Mysql backup files to server via SMB.
I have Apple TimeCapsule and i want to store all backups there.The device is in same LAN

Reply

110 Jon February 16, 2012 at 8:45 am

Spot on! Thanks for this great script.

Reply

111 Vladimir March 10, 2012 at 8:59 am

If your script doesn't work via cron, you must replace the ncftp command with the full path to ncftp. For me it is /usr/local/bin/ncftp,
so the command line looks like:
/usr/local/bin/ncftp -u$FTPU -p$FTPP $FTPS …

Reply

112 Thomas May 12, 2012 at 3:31 pm

Hi Vivek
Great site. Thanks for all the hard work.
The way I read this script, it is not actually doing what you set out to do. It will do a weekly full backup fine. However as I read your code, the increment section will be endlessly incrementing off of the first incremental backup. You must get your increment file off of the full (level 0) backup (using the -g option), and erase the increment file before each full backup to reset it. That will then do a level 1-6 increment on each incremental backup and reset for the full backup which creates a new increment file. Would this not be the actual code to do what you set out to do, or am I missing something?
`
if [ "$DAY" == "$FULLBACKUP1" ]; then
rm $INCFILE
FTPD="/home/user/backupdir/full"
FILE="fs-full-$NOW.tar.gz"
tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
else
i=$(date +"%Hh%Mm%Ss")
FILE="fs-i-$NOW-$i.tar.gz"
tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
fi
`
Regards,
Thomas

Reply

113 Thomas May 14, 2012 at 12:01 pm

Hi vivek,
The $FULLBACKUP1 variable in the above code is a red herring and should have remained $FULLBACKUP. I'm using a variation of this very helpful script to do a level 0 and then two weeks of level 1s before doing a level 0 again, and I left the extra variable in the code above by mistake when I cut and pasted to adapt it back to your script for posting.
Nonetheless, could you point out where I’m wrong, if I am, that your script will endlessly increment off of the first incremental backup never making reference to the full backups. I can’t see how tar will increment off of the first full backup when you’ve set no increment file from it and never reset the increment file each full backup, instead the first incremental backup will be a full backup and the increments will be based off of that forever (level 0,1,2,3,4,5,6,7,8,9…on to infinity) which would be a nightmare to try and reconstitute. I’ll very happily be wrong, but can’t see where I am.
Regards,
Thomas

Reply

114 nixCraft May 14, 2012 at 1:14 pm

Yes, the script needs to be fixed. After level 0, it will go to level 1… and beyond, to infinity. What you can do is modify the script and create only 7 archives (0 full and the rest incremental). On Sunday (full backup day), you can force a full backup either by removing the $INCFILE before running tar, or by supplying the '--level=0' option. On the ftp server you will have files like:

backup.full-level.0.tar.gz
backup.Mon-level.1.tar.gz
backup.Wed-level.3.tar.gz
....
....
.
backup.Sat-level.6.tar.gz

A sample script (this is just for dry run and to get you started):

#!/bin/bash
_fullday="Sun"
_dirs="/etc /var/www/html"
_snapshot="/root/snap.dat"
_now=$(date +"%d-%m-%Y")
_day=$(date +"%a")
_lpath="/backup"
_prefix="$_day"
_incarg='--listed-incremental=/root/tarbackup.dat'
# set empty to avoid compression #
_compress=".gz"
_tarargs=""
_file=""
_level=0
_init(){
	export PATH=$PATH:/usr/local/bin:/usr/local/sbin
	[ ! -d "$_lpath" ] && mkdir -p "$_lpath"
	[ "$_day" == "$_fullday" ] && _prefix="full"
	[ "$_compress" == ".gz" ] && _tarargs="-zcf" || _tarargs="-cf"
	case $_day in
		Mon) _level=1;;
		Tue) _level=2;;
		Wed) _level=3;;
		Thu) _level=4;;
		Fri) _level=5;;
		Sat) _level=6;;
	esac
	_file="$_lpath/backup.${_prefix}-level.${_level}.tar${_compress}"
	case $_day in
		Sun) _fullbackup;;
		Mon|Tue|Wed|Thu|Fri|Sat) _incbackup $_day;;
	esac
}
_fullbackup(){
	echo "Starting full backup @ level # $_level..."
	echo "tar $_incarg --level=0 $_tarargs \"$_file\" \"$_dirs\""
}
_incbackup(){
	echo "Starting incremental backup @ level # $_level..."
	echo "tar $_incarg $_tarargs \"$_file\" \"$_dirs\""
}
## main logic ##
_init

Dry run output:

Starting full backup @ level # 0...
tar --listed-incremental=/root/tarbackup.dat --level=0 -zcf "/backup/backup.full-level.0.tar.gz" "/etc /var/www/html"
Starting incremental backup @ level # 1...
tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Mon-level.1.tar.gz" "/etc /var/www/html"
Starting incremental backup @ level # 2...
tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Tue-level.2.tar.gz" "/etc /var/www/html"
Starting incremental backup @ level # 3...
tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Wed-level.3.tar.gz" "/etc /var/www/html"
Starting incremental backup @ level # 4...
tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Thu-level.4.tar.gz" "/etc /var/www/html"
Starting incremental backup @ level # 5...
tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Fri-level.5.tar.gz" "/etc /var/www/html"
Starting incremental backup @ level # 6...
tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Sat-level.6.tar.gz" "/etc /var/www/html"

Feel free to modify the script as you need to add code for mysql and ftp. Hope this helps!

Edit: I will post a tested script later on, but I'm damn sure the logic is on the correct path, as I've hardcoded tar to 7 days only.

Reply

115 forgulencia July 12, 2012 at 1:20 pm

For the people getting this kind of error: “syntax error near unexpected token ‘do””
The solution is simple. Change the first line of the script to #!/bin/bash from #!/bin/sh.

Tested on Debian Squeeze.

Reply

116 unixer November 6, 2012 at 1:48 pm

Hello, I need to back up a specific database instance, not all databases (show databases). How can I do this?

Thanks,

Reply

117 Andreas December 25, 2012 at 7:34 pm

Hey all,

maybe take a look at the backup2l script (http://backup2l.sourceforge.net/).
It is easy to set up and has great flexibility.

You only need to add the mysqldump in the pre-backup section, that's all.

Reply

118 Stuart June 25, 2013 at 11:11 pm

I've used this as the basis for backups on a few of my servers, which have had some restrictions. The first didn't support the "ncftp" command, so I had to change to "ftp". The other didn't support "ncftp" or "ftp", so I had to change to curl.

If you want to use CURL, just comment out the ftp/ncftp command lines, and add:

curl -T $BACKUP/$FILE ftp://$FTPS/$FTPD/ --user "$FTPU":"$FTPP"
curl -T $FILE2 ftp://$FTPS/$FTPD/ --user "$FTPU":"$FTPP"

And set the MySQL backup variable to $FILE2 instead of $FILE.

HTH

Reply

119 xeero July 12, 2013 at 1:50 pm

I’m getting error

————
[: 47: Fri: unexpected operator
————-
even if I use Sunday or Monday I get the same error with those days,
and the script makes an incremental backup every time.

Reply

120 shanavas July 23, 2013 at 12:52 pm

Hi,
I want to back up only the MySQL databases. I have removed the code for the web folder part, but I want the option of full backup and incremental backup. How can I do this?

Reply

121 Alim July 30, 2013 at 12:04 pm

Hi there! Personally I don't use any scripts to back up MySQL databases; I prefer a GUI tool like dbForge Studio for MySQL, but it also allows making backups through the command-line interface.

Reply

122 Max January 15, 2014 at 10:21 pm

I found your script very useful and I have given it a try.
It works, but what about restoring the compressed files?
I tried to extract them on a different OS (Mac) to see what had been saved, but I was not able to do so (permission error issues).
Any idea or help?
Thanks

Reply

123 dan February 1, 2014 at 10:38 am

How do you make the full backups an everyday thing, instead of incremental?

Reply
