HowTo: Backup MySQL Databases and Web Server Files to an FTP Server Automatically

Posted on in Categories Data recovery, Linux, MySQL, Shell scripting last updated January 22, 2010

This is a simple backup solution for people who run their own web server and MySQL database server on a dedicated or VPS server. Most dedicated hosting providers offer a backup service using NAS or FTP servers, hooking you up to their redundant centralized storage array over a private VLAN. Since I manage a couple of boxes, here is my own automated solution. If you just want a shell script, go here (you just need to provide the appropriate input and it will generate an FTP backup script for you on the fly; you can also grab my PHP script generator code).

Making Incremental Backups With tar

You can make tape backups. However, sometimes tape is not an option. GNU tar allows you to make incremental backups with the -g option. In this example, the tar command makes an incremental backup of the /var/www/html, /home, and /etc directories:
# tar -g /var/log/tar-incremental.log -zcvf /backup/today.tar.gz /var/www/html /home /etc

Where,

  • -g: Create/list/extract a new GNU-format incremental backup, storing snapshot information in the /var/log/tar-incremental.log file.
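To see what the snapshot file does, here is a small self-contained sketch you can run in a scratch directory (all paths under /tmp are made up for illustration):

```shell
#!/bin/sh
set -e
# Throwaway playground so nothing real is touched
rm -rf /tmp/inc-demo && mkdir -p /tmp/inc-demo/src
cd /tmp/inc-demo
echo "one" > src/a.txt
sleep 1

# Level-0 (full) backup: tar records file metadata in the snapshot file
tar -g snap.dat -zcf full.tar.gz src

# Change the tree, then run tar again with the SAME snapshot file:
# only new or changed files are archived
sleep 1
echo "two" > src/b.txt
tar -g snap.dat -zcf inc1.tar.gz src

# Lists src/ and src/b.txt only; unchanged a.txt is skipped
tar -tzf inc1.tar.gz
```

Keep one snapshot file per backup cycle; deleting it forces the next run to be a full backup again.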

Making MySQL Databases Backup

mysqldump is a client program for dumping or backing up MySQL databases, tables, and data. For example, the following command displays the list of databases:
$ mysql -u root -h localhost -p -Bse 'show databases'

Output:

Enter password:
brutelog
cake
faqs
mysql
phpads
snews
test
tmp
van
wp

Next, you can backup each database with the mysqldump command:
$ mysqldump -u root -h localhost -pmypassword faqs | gzip -9 > faqs-db.sql.gz
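One caveat: passing the password as -pmypassword makes it visible to other local users in ps output. The standard MySQL client alternative is a per-user option file (user name and password below are placeholders):

```ini
# ~/.my.cnf  --  protect it with: chmod 600 ~/.my.cnf
[client]
user=root
password=mypassword
```

With that file in place, plain mysqldump faqs | gzip -9 > faqs-db.sql.gz works without -u or -p on the command line.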

Creating A Simple Backup System For Your Installation

The main advantage of using FTP or NAS backup is protection from data loss. You can use various protocols to back up data:

  1. FTP
  2. SSH
  3. RSYNC
  4. Other Commercial solutions

However, I am going to describe an FTP backup solution here. The idea is as follows:

  • Make a full backup every Sunday night, i.e. back up everything every Sunday.
  • On the other days, back up only those files that have been modified since the full backup (incremental backup).
  • This gives a seven-day backup cycle.

Our Sample Setup

   Your-server     ===>       ftp/nas server
IP:202.54.1.10   ===>       208.111.2.5

Let us assume that your ftp login details are as follows:

  • FTP server IP: 208.111.2.5
  • FTP Username: nixcraft
  • FTP Password: somepassword
  • FTP Directory: /home/nixcraft (or /)

You store all data as follows:
=> /home/nixcraft/full/mm-dd-yy/files – Full backup
=> /home/nixcraft/incremental/mm-dd-yy/files – Incremental backup

Automating Backup With tar

Now you know how to back up files and MySQL databases using the tar and mysqldump commands. It is time to write a shell script that will automate the entire procedure:

  1. First, the script collects all data from both the MySQL database server and the file system into a temporary directory called /backup, using the tar command.
  2. Next, the script logs in to your FTP server and creates the directory structure discussed above.
  3. The script uploads all files from /backup to the FTP server.
  4. The script removes the temporary backup from the /backup directory.
  5. The script sends you an email notification if the FTP backup fails for any reason.

You must have the following commands installed (use the yum or apt-get package manager to install the FTP client called ncftp):

  • ncftp ftp client
  • mysqldump command
  • GNU tar command

Here is the sample script:

#!/bin/sh
# System + MySQL backup script
# Full backup day - Sun (rest of the day do incremental backup)
# Copyright (c) 2005-2006 nixCraft <http://www.cyberciti.biz/fb/>
# This script is licensed under GNU GPL version 2.0 or above
# Automatically generated by http://bash.cyberciti.biz/backup/wizard-ftp-script.php
# ---------------------------------------------------------------------
### System Setup ###
DIRS="/home /etc /var/www"
BACKUP=/tmp/backup.$$
NOW=$(date +"%d-%m-%Y")
INCFILE="/root/tar-inc-backup.dat"
DAY=$(date +"%a")
FULLBACKUP="Sun"
### MySQL Setup ###
MUSER="admin"
MPASS="mysqladminpassword"
MHOST="localhost"
MYSQL="$(which mysql)"
MYSQLDUMP="$(which mysqldump)"
GZIP="$(which gzip)"
### FTP server Setup ###
FTPD="/home/vivek/incremental"
FTPU="vivek"
FTPP="ftppassword"
FTPS="208.111.11.2"
NCFTP="$(which ncftpput)"
### Other stuff ###
EMAILID="[email protected]"
### Start Backup for file system ###
[ ! -d $BACKUP ] && mkdir -p $BACKUP || :
### See if we want to make a full backup ###
if [ "$DAY" = "$FULLBACKUP" ]; then
  FTPD="/home/vivek/full"
  FILE="fs-full-$NOW.tar.gz"
  tar -zcvf $BACKUP/$FILE $DIRS
else
  i=$(date +"%Hh%Mm%Ss")
  FILE="fs-i-$NOW-$i.tar.gz"
  tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
fi
### Start MySQL Backup ###
# Get all databases name
DBS="$($MYSQL -u $MUSER -h $MHOST -p$MPASS -Bse 'show databases')"
for db in $DBS
do
 FILE=$BACKUP/mysql-$db.$NOW-$(date +"%T").gz
 $MYSQLDUMP -u $MUSER -h $MHOST -p$MPASS $db | $GZIP -9 > $FILE
done
### Dump backup using FTP ###
#Start FTP backup using ncftp
ncftp -u"$FTPU" -p"$FTPP" $FTPS<<EOF
mkdir $FTPD
mkdir $FTPD/$NOW
cd $FTPD/$NOW
lcd $BACKUP
mput *
quit
EOF
### Find out if ftp backup failed or not ###
if [ "$?" = "0" ]; then
 rm -f $BACKUP/*
else
 T=/tmp/backup.fail
 echo "Date: $(date)">$T
 echo "Hostname: $(hostname)" >>$T
 echo "Backup failed" >>$T
 mail  -s "BACKUP FAILED" "$EMAILID" <$T
 rm -f $T
fi
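The script above only creates archives; restoring means unpacking the Sunday full backup first and then each incremental in order. A self-contained sketch of that sequence (throwaway /tmp paths, not the script's real ones):

```shell
#!/bin/sh
set -e
rm -rf /tmp/rst && mkdir -p /tmp/rst/src && cd /tmp/rst
echo alpha > src/a.txt
sleep 1
tar -g snap.dat -zcf full.tar.gz src      # the "Sunday" full backup
sleep 1
echo beta > src/b.txt
tar -g snap.dat -zcf inc.tar.gz src       # a "weekday" incremental

rm -rf src                                # simulate losing the data

# Restore full first, then incrementals in order. Using /dev/null as the
# snapshot makes GNU tar honour the incremental metadata (including
# deletions) without recording a new snapshot.
tar -zxf full.tar.gz -g /dev/null
tar -zxf inc.tar.gz -g /dev/null
```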

How Do I Setup a Cron Job To Backup Data Automatically?

Just add a cron job as per your requirements:
13 0 * * * /home/admin/bin/ftpbackup.sh >/dev/null 2>&1
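If the job behaves differently under cron than from a shell (a frequent report), keep the output instead of discarding it while you debug; a variant entry (log path is just an example):

```
13 0 * * * /home/admin/bin/ftpbackup.sh >>/var/log/ftpbackup.log 2>&1
```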

Generate FTP backup script

Since I set up many Linux boxes, here is my own FTP backup script generator. You just need to provide the appropriate input and it will generate an FTP backup script for you on the fly.

129 comments

  1. On the mysqldump, there's a -A option to do all databases at once (and --opt to make things more efficient). As you get a bigger database, look at mysqlhotcopy; it's only good for MyISAM tables but it's a lot faster than mysqldumping.

    Sean

  2. Sean,

    I agree with you, --opt is a nice option that adds locking and does extended inserts and other stuff. I will update the script with the --opt option.

    -A is a good option but I prefer to back up each database individually, as it offers the option of restoring individual databases.

    mysqldump --help says the --opt option is enabled by default :) so no need to change the script

    Appreciate your post.

  3. Why are the files created using mysqldump far larger than the database files themselves?

    And what if someone simply copies all the database files to the backup directory... is there any harm? Just a question...

  4. Nix,

    You may be right… but I am using a somewhat unusual way of backing up the data. What I have done is create a SQL file with the database structure of all the databases we are using, and I copy the data files on a regular basis.

    When it comes to retrieving the data, it's a simple copy operation; of course, you need to change the ownership to mysql:mysql.

    In case I need to trash the database and recreate it, I have the SQL file with the structures.

  5. If I also want to enter a database in this command "$ mysql -u root -h localhost -p -Bse 'show databases'", what can I do?

  6. Thanks for the script, it's brilliant and does exactly what I need it to do.

    Just one really easy question, I'm sure: how can I set the FTP port rather than using the standard port 21? If I run ncftp manually, I can specify the -P option. Is it easy to put that in the script?

    Thank you for the script.

  7. So I backed up my MySQL database. How do I restore it? I have a mybackup.sql.gzip in my home directory. What do I do to restore it?

    Can I restore it to another machine with the same MySQL Version?

  8. Type the following commands to restore the backup:
    gunzip mybackup.sql.gzip
    mysql -u USER -p dbname < mybackup.sql

    You can copy file mybackup.sql.gzip using scp to another machine:
    scp mybackup.sql.gzip [email protected]:/tmp
    Login to machinetwo:
    cd /tmp
    gunzip mybackup.sql.gzip
    mysql -u USER -p dbname < mybackup.sql
    rm mybackup.sql

    Further readings
    (a) How can I restore a backup of a MySQL database?
    (b) Copy MySQL database from one server to another remote server

    HTH

  9. Hi,
    this is very nice but i get the following errors:
    [: 47: ==: unexpected operator
    at the start of the script
    and
    [: 79: ==: unexpected operator
    at the end

    therefore I always get the email with BACKUP FAILED.

    Line 47 is “fi”
    and Line 79 is the last “fi” at the very end.

    Thanx

  10. Thanx for the reply, but I had more trouble
    with the long names from mysqldump.
    FileZilla Server did not like
    (mysql-LongDBname.$NOW-$(date +"%T").gz) but was very happy when I took out the $(date +"%T") part, which is fine with me; I don't care for the exact time so much.

    For the unexpected operator error:
    I have the feeling that the == sign is not accepted with "$var1" == "$var2".
    Therefore my system was complaining about a syntax error (Ubuntu Server 7.04).
    From what I have seen on the net:
    you can say "$var1" = "$var2" OR $var1 == $var2.
    I haven't tested the second option myself,
    but the first behaves very nicely.
    Finally, I just took out the -g option from tar because I could not restore from a Windows machine when I tried it;
    7zip could not understand the archive and could not extract the contents.

    Lastly, I added an OpenLDAP database backup too, and send a mail on success as well. I simply want to know what is happening with the backup process.

    Here it is now:

    #!/bin/sh
    ### System Setup ###
    DIRS="/etc /srv/www /var/www /home"
    BACKUP="/tmp/backup.${$}"
    NOW=$(date +%d-%m-%Y)
    INCFILE="/root/tar-inc-backup.dat"
    DAY=$(date +%a)
    FULLBACKUP="Fri"

    ### MySQL Setup ###
    MUSER="user"
    MPASS="passwd"
    MHOST="localhost"

    ### FTP server Setup ###
    FTPD="/backup/Ubuntu-Server/incremental"
    FTPU="user"
    FTPP="passwd"
    FTPS="192.168.2.50"
    NCFTP=$(which ncftpput)

    ### Other stuff ###
    EMAILID="[email protected]"

    ### Start Backup for file system ###
    [ ! -d $BACKUP ] && mkdir -p $BACKUP || :

    ### See if we want to make a full backup ###
    if [ "$DAY" = "$FULLBACKUP" ];
    then
    FTPD="/backup/Ubuntu-Server/full"
    FILE="fs-full-${NOW}.tar.gz"
    tar -zcvf $BACKUP/$FILE $DIRS
    else
    i=$(date +%Hh%Mm%Ss)
    FILE="fs-i-${NOW}-${i}.tar.gz"
    tar zcvf $BACKUP/$FILE $DIRS
    fi
    ### Start MySQL Backup ###
    # Get all databases name
    DBS="$(mysql -u ${MUSER} -h ${MHOST} -p${MPASS} -Bse 'show databases')"
    for db in $DBS
    do
    FILE=$BACKUP/mysql-$db.$NOW.gz
    mysqldump -u $MUSER -h $MHOST -p$MPASS $db | gzip -9 > $FILE
    done

    ##Backup the Ldap Directory Database
    slapcat -v -n 1 -l $BACKUP/LdapDirectory.ldif

    ## Dump backup using FTP ###
    #Start FTP backup using ncftp
    ncftp -u$FTPU -p$FTPP $FTPS <<EOF
    mkdir $FTPD
    mkdir $FTPD/$NOW
    cd $FTPD/$NOW
    lcd $BACKUP
    mput *
    quit
    EOF
    ### Find out if ftp backup failed or not ###
    if [ "$?" = "0" ];
    then
    rm -f $BACKUP/*
    T=/tmp/backup.ok
    echo "Date: $(date)">$T
    echo "Hostname: $(hostname)" >>$T
    echo "Backup successfully Completed!" >>$T
    mail -s "BACKUP COMPLETED" "$EMAILID" <$T
    else
    T=/tmp/backup.fail
    echo "Date: $(date)">$T
    echo "Hostname: $(hostname)" >>$T
    echo "Backup failed" >>$T
    mail -s "BACKUP FAILED" "$EMAILID" <$T
    fi

    Cheers A.

  11. Sorry Here is the complete script hope this time is posted clear.
    #!/bin/sh
    ### System Setup ###
    DIRS="/etc /srv/www /var/www /home"
    BACKUP="/tmp/backup.${$}"
    NOW=$(date +%d-%m-%Y)
    INCFILE="/root/tar-inc-backup.dat"
    DAY=$(date +%a)
    FULLBACKUP="Fri"

    ### MySQL Setup ###
    MUSER="user"
    MPASS="passwd"
    MHOST="localhost"

    ### FTP server Setup ###
    FTPD="/backup/Ubuntu-Server/incremental"
    FTPU="user"
    FTPP="passwd"
    FTPS="192.168.2.50"
    NCFTP=$(which ncftpput)

    ### Other stuff ###
    EMAILID="[email protected]"

    ### Start Backup for file system ###
    [ ! -d $BACKUP ] && mkdir -p $BACKUP || :

    ### See if we want to make a full backup ###
    if [ "$DAY" = "$FULLBACKUP" ];
    then
    FTPD="/backup/Ubuntu-Server/full"
    FILE="fs-full-${NOW}.tar.gz"
    tar -zcvf $BACKUP/$FILE $DIRS
    else
    i=$(date +%Hh%Mm%Ss)
    FILE="fs-i-${NOW}-${i}.tar.gz"
    tar zcvf $BACKUP/$FILE $DIRS
    fi
    ### Start MySQL Backup ###
    # Get all databases name
    DBS="$(mysql -u ${MUSER} -h ${MHOST} -p${MPASS} -Bse 'show databases')"
    for db in $DBS
    do
    FILE=$BACKUP/mysql-$db.$NOW.gz
    mysqldump -u $MUSER -h $MHOST -p$MPASS $db | gzip -9 > $FILE
    done

    ##Backup the Ldap Directory Database
    slapcat -v -n 1 -l $BACKUP/LdapDirectory.ldif

    ## Dump backup using FTP ###
    #Start FTP backup using ncftp
    ncftp -u$FTPU -p$FTPP $FTPS <<EOF
    mkdir $FTPD
    mkdir $FTPD/$NOW
    cd $FTPD/$NOW
    lcd $BACKUP
    mput *
    quit
    EOF
    ### Find out if ftp backup failed or not ###
    if [ "$?" = "0" ];
    then
    rm -f $BACKUP/*
    T=/tmp/backup.ok
    echo "Date: $(date)">$T
    echo "Hostname: $(hostname)" >>$T
    echo "Backup successfully Completed!" >>$T
    mail -s "BACKUP COMPLETED" "$EMAILID" <$T
    else
    T=/tmp/backup.fail
    echo "Date: $(date)">$T
    echo "Hostname: $(hostname)" >>$T
    echo "Backup failed" >>$T
    mail -s "BACKUP FAILED" "$EMAILID" <$T
    fi

  12. maybe cut it in pieces
    #!/bin/sh
    ### System Setup ###
    DIRS="/etc /srv/www /var/www /home"
    BACKUP="/tmp/backup.${$}"
    NOW=$(date +%d-%m-%Y)
    INCFILE="/root/tar-inc-backup.dat"
    DAY=$(date +%a)
    FULLBACKUP="Fri"

    ### MySQL Setup ###
    MUSER="user"
    MPASS="passwd"
    MHOST="localhost"

    ### FTP server Setup ###
    FTPD="/backup/Ubuntu-Server/incremental"
    FTPU="user"
    FTPP="passwd"
    FTPS="192.168.2.50"
    NCFTP=$(which ncftpput)

    ### Other stuff ###
    EMAILID="[email protected]"

    ### Start Backup for file system ###
    [ ! -d $BACKUP ] && mkdir -p $BACKUP || :

    ### See if we want to make a full backup ###
    if [ "$DAY" = "$FULLBACKUP" ];
    then
    FTPD="/backup/Ubuntu-Server/full"
    FILE="fs-full-${NOW}.tar.gz"
    tar -zcvf $BACKUP/$FILE $DIRS
    else
    i=$(date +%Hh%Mm%Ss)
    FILE="fs-i-${NOW}-${i}.tar.gz"
    tar zcvf $BACKUP/$FILE $DIRS
    fi

    And the Second part:
    ### Start MySQL Backup ###
    # Get all databases name
    DBS="$(mysql -u ${MUSER} -h ${MHOST} -p${MPASS} -Bse 'show databases')"
    for db in $DBS
    do
    FILE=$BACKUP/mysql-$db.$NOW.gz
    mysqldump -u $MUSER -h $MHOST -p$MPASS $db | gzip -9 > $FILE
    done

    ##Backup the Ldap Directory Database
    slapcat -v -n 1 -l $BACKUP/LdapDirectory.ldif

    ## Dump backup using FTP ###
    #Start FTP backup using ncftp
    ncftp -u$FTPU -p$FTPP $FTPS <<EOF
    mkdir $FTPD
    mkdir $FTPD/$NOW
    cd $FTPD/$NOW
    lcd $BACKUP
    mput *
    quit
    EOF

    And the Third Part:
    ### Find out if ftp backup failed or not ###
    if [ "$?" = "0" ];
    then
    rm -f $BACKUP/*
    T=/tmp/backup.ok
    echo "Date: $(date)">$T
    echo "Hostname: $(hostname)" >>$T
    echo "Backup successfully Completed!" >>$T
    mail -s "BACKUP COMPLETED" "$EMAILID" <$T
    else
    T=/tmp/backup.fail
    echo "Date: $(date)">$T
    echo "Hostname: $(hostname)" >>$T
    echo "Backup failed" >>$T
    mail -s "BACKUP FAILED" "$EMAILID" <$T
    fi
    Hope the whole script is visible now.

  13. About date localization

    The command DAY=$(date +%a) is sensitive to the system language,
    so I prefer to use DAY=$(date +%u) to get the number of the day of the week.
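    Applied to the script above, a locale-independent version of the day check might look like this (a sketch, not the original script):

```shell
#!/bin/sh
DAY=$(date +%u)     # 1 = Monday ... 7 = Sunday, regardless of locale
FULLBACKUP=7        # full backup on Sundays
if [ "$DAY" = "$FULLBACKUP" ]; then
  echo "full backup day"
else
  echo "incremental backup day"
fi
```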

  14. I was looking for something like this…

    just one question: to send the email on a failed backup, what program do I need to have installed on my PC?

    Thank you

  15. Thanks for the info. Just wondering, does it work with a regular ftp client? I'm on a shared hosting shell account and it doesn't look like ncftp is installed.

  16. Hi!

    Thank you for this nice tutorial!!

    I have a setup with 2 HDDs, having the second as active mysql database storage and another partition for this backup script.

    I have modified it to just move the files to the second HDD instead of transfering via FTP and made it so that folders with $NOW are created before moving.

    I am wondering how the $INCFILE works. If I want to restore the "fs" from a week before, do I have to use the initial (bigger) tar which is in the first folder, or does it work with the smaller ones that are e.g. in today's folder?

  17. Hello,

    When I run this script from the command line it works fine, but when I run it as a cron job, the FTP connection seems to drop at some point and not all files are copied over. The FTP server says "disconnected", as do the ncftp client logs.

    It happens all the time and it is a bit weird. Any clues?

    Thanks,G

  18. Hello,

    Thanks for the script.

    I have a problem when running the script with a cron job, it will do the backup but it will not upload via ftp. Any idea why?

    Best regards,
    Nuno

  19. This looks great, but in Ubuntu 7.10, ncftp isn’t working.

    the command
    ncftp -u"$FTPU" -p"$FTPP" $FTPS<

    gives an error:
    wwwbackup.sh: 53: Syntax error: newline unexpected

    If I remove the < at the end, it will log in, but then it won't send the commands to ncftp (after ncftp quits, it will try running those commands in my own shell!)

    Is there any other way to get ncftp to take commands from the shell script?

  20. OK I think I solved my problem by using ncftpput*

    I used this line instead of the ncftp lines you have:
    ncftpput -u "$FTPU" -p "$FTPP" -m "$FTPS" $FTPD/$NOW $BACKUP/*

    and commented out the ncftp command through the EOF line.

    * By the way, why do you get the path of ncftpput and never use it?

  21. Even more information on my humiliations in Ubuntu:

    They made dash the default shell, not bash!* So you need to specify #!/bin/bash (not #!/bin/sh) for this to work properly. dash also had problems with the equality (==) operator at the end.

    This will probably fix the ncftp problems, but since I already have ncftpput working, I’ll keep it.

    *From http://ubuntuforums.org/showthread.php?t=265391

  22. Firstly, Thank you so much for this script. I’ve learnt so much just from following it through, and it works perfectly on CentOS 4 after installing ncftp.

    Is there a way you can tell it to exclude files? For example, I've set it to back up the whole of a certain folder, but there is one subfolder within it that I do not want backed up… is this possible?

    Thanks!

  23. You can also use:

    /usr/bin/md5sum -b $BACKUP/* >$BACKUP/backup$NOW-$i.md5

    between backup and transfer, so you can verify that the files weren't modified during transfer.
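    On the receiving side the matching check is md5sum -c; a tiny self-contained illustration (file names made up):

```shell
#!/bin/sh
set -e
mkdir -p /tmp/md5-demo && cd /tmp/md5-demo
echo "archive payload" > today.tar.gz     # stand-in for a real backup file
md5sum -b today.tar.gz > backup.md5       # generate before the transfer
# ...after transferring both files, verify on the other side:
md5sum -c backup.md5                      # prints "today.tar.gz: OK"
```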

  24. This seems to work well, except that when I run it, it just does incremental backups. Of what, I'm not sure, as there has not been a full backup yet. Invariably it generates just a 48 KB file. The database alone should be at least 800 KB… Is there any way to force a full backup? How can I check that this script is working properly?

    Thanks,
    Adrian

  25. In this:

    if [ "$?" == "0" ]; then

    you are checking the return of ncftp, but it always returns "0".

    I even altered the password to force an error and it still returned "0".

    Has anyone confirmed that this will show anything other than “0”?

    I did this to check before the if statement

    echo "Return Code = $?"

  26. For some unknown reason, if I run the script manually it works as it should: everything is zipped up and sent to the FTP server.

    However, if I set up a cron job, all the files are zipped up and placed in the /tmp dir (so I know the script is running) but the files are never sent to the FTP server.

    I've tried this with several FTP servers etc. with no luck.

  27. Thanks heaps. Clever script. Worked fully once I gave mysql user LOCK privileges. Very helpful.

    Might I suggest putting the php.txt link at the top of the wizard page. I entered dummy info, not wanting to send IPs and logins to a PHP script. I now have the script itself and could generate it again.

    Nice work and thanks for sharing.

  28. Hi
    Well, ncftp is not available on my hosting; I am using this on shared hosting.

    I did change the bash script to

    ### Dump backup using FTP ###
    #Start FTP backup using ncftp
    #ncftp -u"$FTPU" -p"$FTPP" $FTPS<<EOF
    # Login, run get files
    ftp -inv $FTPS <<EOF
    quote USER $FTPU
    quote PASS $FTPP
    mkdir $FTPD
    mkdir $FTPD/$NOW
    cd $FTPD/$NOW
    lcd $BACKUP
    mput *
    quit
    EOF

  29. Great script, but I am having some challenges getting it to work. First I edited the file on a Windows machine, but that produced a /bin/sh^M error, so I had to change the line endings to just \n instead of \r\n.

    Then I had to install ncftp

    But now I am getting a “username and/or password was not accepted for login” error on the FTP. I know I am inputting the correct details, and have tried several.

    When I used Ali's version, using ftp instead of ncftp, I noticed that it seemed to be sending the username as 'username_' instead of just 'username'. Does the script change the details in some way? Any other ideas why I cannot log in with the script?

    I would love to get this going!

    Thanks

  30. I just use MySQL Administrator to run my daily backups. I can't think of an easier solution. I have over 30 clients on a variety of shared MySQL databases and VPS databases, and I have the task scheduled to run at midnight every night. However, I am searching for an automated web server backup solution; thus far I haven't found anything. It would be great if I could find something like Norton Ghost that works via FTP, making incremental backups every night. My hosting provider charges $25/month for FTP backup; it's cheaper to go out and buy a 2 TB hard drive and run your own backups. If anyone has a more user-friendly suggestion, let me know.

  31. Hi,

    I'm having some problems with the script; I seem to be having trouble with the ncftp usage.

    The backup files are being created in the temp folder, however the script will not upload them to my FTP server and I am receiving the "Failed backup" email.

    I run the script and here’s what I’m getting:


    /var/lib/mysql/eximstats/sends.MYI
    /var/lib/mysql/eximstats/smtp.MYD
    /var/lib/mysql/eximstats/smtp.MYI
    mysqldump: Got error: 1033: Incorrect information in file: ‘./horde/horde_sessionhandler.frm’ when using LOCK TABLES
    mysqldump: Got error: 1033: Incorrect information in file: ‘./roundcube/cache.frm’ when using LOCK TABLES
    /root/ncftpd-2.8.6/glibc2.5/ncftpd: illegal option -- i
    Usage: ncftpd [flags]

    Optional Flags (usually set in general config file):
    -p XX : Use port XX for control connection (and XX – 1 for data).
    -n XX[,YY] : Allow maximum of XX concurrent server users (max-users); keep
    at least YY processes running to serve users (min-users).
    -v : Increase logging verbosity.
    -q : Do not echo log messages to the screen.
    -Q : Force echo of log messages to the screen, even if not a tty
    (Default is to echo automatically if it is a terminal).
    -e : Print the detected hostname and exit.
    -b : Print the version information and exit.
    -d : Run as background daemon.
    Exiting.

    Any help will be greatly appreciated.

    Thanks

  32. I am using a server OS; my computer has partitions C, D and E. My C drive is only 20 GB; D and E are 120 GB. My C drive is full. How do I compare the E drive and the C drive?

  33. FYI: since my remote FTP server is a Windows box, the file name format for the SQL backups had to be modified to exclude the colons in the time format. I changed FILE=$BACKUP/mysql-$db.$NOW-$(date +"%T").gz to FILE=$BACKUP/mysql-$db.$NOW-$(date +"%Hh%Mm%Ss").gz and now it works great.

    Perhaps this will help someone out that has the same issue.

    Thanks!

  34. Thanks to Ali for modifying this to work with ftp (Ali's ftp script above).
    But I found that my files were coming out corrupt after the transfer.
    I had to add binary before mput * to switch to binary mode before transferring the files. Hope this helps someone.

    ### Dump backup using FTP ###
    #Start FTP backup using ncftp
    #ncftp -u $FTPU -p $FTPP $FTPS<<EOF
    # Login, run get files
    ftp -inv $FTPS <<EOF
    quote USER $FTPU
    quote PASS $FTPP
    mkdir $FTPD
    mkdir $FTPD/$NOW
    cd $FTPD/$NOW
    lcd $BACKUP
    binary
    mput *
    quit
    EOF
  35. Perfect script generator (and script), thanks a lot nixCraft, just what I was looking for. Saved me half my life! Respect! You rock!

    some questions…

    # ftp connection drop
    What would happen if, during the FTP transfer, my second server (which receives the backed-up files) goes down or drops the FTP connection? Will the whole process be terminated, or will it try to reconnect and continue uploading the files? Will I get any notification about either? (Sorry if it is obvious, but I am a noob :)
    (( My second server is shared hosting at HostGator, with an unlimited storage plan. Just ideal for storing the files, but the FTP connection is crappy and fluctuating, and sometimes drops because of the heavy shared usage ))

    # free space requirements
    please correct me (or confirm) if I have it right:
    the script copies the files directly into the tarred, gzipped archive, so the required free space on the originating server is equal to the space requirement of the gzipped files. For instance:
    If the size of all my website files (altogether) is 1 GB and I have some smaller MySQL (e.g. Joomla) databases, let's say 100 MB altogether, then I would need approximately 2.2 GB of free space to back them up successfully. Is this rule ([occupied space] x 2) a good way to estimate the free space needed for the backup process?
    ((On my primary server i have very limited space, since it is a virtual server, and the storage enhancement is expensive, so i would like to buy as small storage space as possible ))

  36. Thanks for the script – a good starting point for me. I ended up commenting out #NCFTP="$(which ncftpput)" under FTP server setup and then used the following, as the password kept failing for some reason:

    ### Dump backup using FTP ###
    #Start FTP backup using ncftp
    #ncftp -u"$FTPU" -p"$FTPP" $FTPS<<EOF
    # Login, run get files
    ftp -inv <<EOF
    open $FTPS
    user $FTPU $FTPP
    mkdir $FTPD
    mkdir $FTPD/$NOW
    cd $FTPD/$NOW
    lcd $BACKUP
    binary
    mput *
    quit
    EOF

  37. Hi,

    The backup script is great, but I have very limited space on the FTP backup. Is there a way to add to the script so it removes old backups and keeps 7 incremental backups and 4 weekly backups?

  38. Ok, I am at a loss. First, thanks for the great script. I think it will do wonders for me once I have it up and running.

    My fs-i tars transfer just great. They complete, and then it begins to transfer mysql.XX-XX-2010-HH:MM:SS.gz and finally I get an error "lost data connection to remote host: Broken pipe."

    Then it tries to transfer the others and I get a "put *: could not send file to remote host." error.

    Any tips?

  39. This is fine, but doesn't mysqldump stop the database from being updated? What if we have a very busy site and cannot stop mysqld or allow the db to be blocked? What solutions are there for a hot backup?

  40. @Michael, a few ftp servers do not allow special characters in a file name. Update the following line to remove %T part and try again:

    FILE=$BACKUP/mysql-$db.$NOW-$(date +"%T").gz

    i.e. set time in hour_minute_seconds_am_OR_pm format

    FILE=$BACKUP/mysql-$db.$NOW-$(date +"%l_%M_%S_%P").gz
  41. BRILLIANT!!! That did the trick! BTW, the server I was FTPing to was running FileZilla Server, if anyone else has this problem!

  42. Hi,

    I have a server with a lot of domains and a lot of space occupied. I want to make something like this script, but I need a separate tar file per domain.

    I have a vhost directory with various subdirectories (one per domain); is it possible, with tar, to make a different file per directory?

    Sorry for my poor english :-(

  43. I want to back up (via FTP) the tmp directory of an Ubuntu server to a Windows machine
    (without network sharing; it must be available on the Windows machine even when the Ubuntu server is turned off).
    Help me with this, and also guide me in automating the above.

  44. Like Nicholas, I only have limited space on the backup site. Is there any script that would remove backups older than a specified number of days/weeks?

    Would be very grateful if you could help me out!

  45. What exactly is this condition checking for?

    if [ "$?" == "0" ]; then

    I’m finding that no matter what, it’s always true.

  46. It doesn't work for me:
    [: 47: ==: unexpected operator

    put *: server said: mysql-wpsusat.23-04-2010-13:29:41.gz: The parameter is incorrect.
    [: 79: ==: unexpected operator

  47. OK, I just used = instead of == because of "POSIX somewhat",
    and my "put error" was due to the server filesystem; my storage server unfortunately is Windows NT based and doesn't support that file name format xxx-xxx-12:23:34 Tue xxx

    I'm working on a function to keep only the last month of backups.
    Thanks for the script, it works like a charm

  48. Can you help me please? I keep getting errors that the directory doesn’t exist, even though the script is trying to create it:

    ProFTPD 1.3.1rc2 Server (ProFTPD Default Installation)
    Logging in…
    User xxx logged in
    Logged in to xxxxxxx.
    MKD /home/user/backup failed; [/home/user/backup: No such file or directory]
    Could not mkdir /home/user/backup: server said: /home/user/backup: No such file or directory
    MKD /home/user/backup/22-07-2010 failed; [/home/user/backup/22-07-2010: No such file or directory]
    Could not mkdir /home/user/backup/22-07-2010: server said: /home/user/backup/22-07-2010: No such file or directory
    Could not chdir to /home/user/backup/22-07-2010: server said: home: No such file or directory

    Any ideas? It’s able to log in properly because it transfers the files to /home/user instead of /home/user/backup, so I don’t know why it’s not properly creating those directories.

  49. Great technique and also great to have an auto-generated script of your own. When using it, I ran into the following error from mysqldump:

    ERROR: Access denied for user ‘root’@’localhost’ to database ‘information_schema’ when using LOCK TABLES

    This is because the 'information_schema' database in the latest versions of MySQL isn't really a set of tables (I think). So you can modify the mysqldump command in the script to add the --skip-lock-tables option. See 'man mysqldump' for details.

    Thanks, Sam

  50. Because if [ "$?" == "0" ]; then is always 0, I use:

    ncftpput -m -u"$FTPU" -p"$FTPP" $FTPS $FTPD/$NOW/ $BACKUP/*

    This gives 1 if it fails.

  51. Thanks for this lovely script. I have a VPS and a reseller account, and now I do regular one-click backups from my VPS to the reseller account with this script. Thanks again. By the way, I'm Vivek too :)

  52. hi vivek.
    i need to use this script, but every time I put it in the crontab and check the log in /var/log/cron, it only shows me "Crond [4631]: (root) CMD (run-parts /etc/cron.hourly)". What does that mean, and how long does the job take to run?

    regards
    hamed

  53. Your script makes monday-inc.tgz as an incremental backup from Saturday, not from the Sunday full backup.

    When you are doing FULL backup, you must first delete $INCFILE

    if [ "$DAY" == "$FULLBACKUP" ]; then
      FTPD="/home/vivek/full"
      FILE="fs-full-$NOW.tar.gz"
      rm $INCFILE
      tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
    else
      i=$(date +"%Hh%Mm%Ss")
      FILE="fs-i-$NOW-$i.tar.gz"
      tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
    fi
    
  54. And this is a modification using lftp mirroring (so you always have a local copy of the backup) and automatically deleting old backup files (you can keep 1-2-4-… weeks of backups as you wish)

    
    ### System Setup ###
    DIRS="/etc /var /home"
    BACKUP=/backup/backup
    NOW=$(date +"%Y%m%d")
    INCFILE="/root/tar-inc-backup.dat"
    DAY=$(date +"%a")
    FULLBACKUP="Sun"
    
    ### FTP server Setup ###
    FTPU="uu"
    FTPP="ppp"
    FTPS="ftps"
    NCFTP="$(which ncftpput)"
    
    ### Other stuff ###
    EMAILID="[email protected]"
    
    ### Start Backup for file system ###
    [ ! -d $BACKUP ] && mkdir -p $BACKUP || :
    
    ### See if we want to make a full backup ###
    if [ "$DAY" == "$FULLBACKUP" ]; then
      FTPD="//full"
      FILE="full-$NOW.tar.gz"
      rm $INCFILE
      tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
    else
      i=$(date +"%Hh%Mm%Ss")
      FILE="inc-$NOW-$i.tar.gz"
      tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
    fi
    
    ### Dump backup using FTP ###
    #Start FTP backup using ncftp
    lftp <<EOF
    open -u $FTPU,$FTPP $FTPS
    mkdir //backup
    mirror -c -e -R /backup/backup //backup
    quit
    EOF
    
    ### Find out if ftp backup failed or not ###
    if [ "$?" == "0" ]; then
     find $BACKUP/*gz -ctime +8 -delete
    
    else
     T=/tmp/backup.fail
     echo "Date: $(date)">$T
     echo "Hostname: $(hostname)" >>$T
     echo "Backup failed" >>$T
     mail  -s "BACKUP FAILED" "$EMAILID" <$T
     rm -f $T
    fi
    
  55. Well, I have modified the script to do what I want. I used rar for compression and for splitting the archive up as it is being made, so I don't end up with a 100 GB file and then have to call split to finish the job. Here it is. Hope this helps someone.

    ### System Setup ###
    DIRS="/home/data"
    BACKUP=/olddrive/backup
    NOW=$(date +"%Y%m%d")
    INCFILE="/olddrive/backup/tar-inc-backup.dat"
    DAY=$(date +"%a")
    FULLBACKUP="Fri"

    ### FTP server Setup ###
    FTPU="YOUR_USER"
    FTPP="YOUR_PASS"
    FTPS="YOUR_SERVER"
    NCFTP="$(which ncftpput)"

    ### Other stuff ###
    EMAILID="[email protected]"

    ### Start Backup for file system ###
    [ ! -d $BACKUP ] && mkdir -p $BACKUP || :

    ### See if we want to make a full backup ###
    if [ "$DAY" == "$FULLBACKUP" ]; then
    FTPD="//full"
    FILE="full-$NOW.rar"
    rm $INCFILE
    rar a -m5 -v50m $INCFILE $BACKUP/$FILE $DIRS
    else
    i=$(date +"%Hh%Mm%Ss")
    FILE="inc-$NOW-$i.rar"
    rar a -m5 -v50m $INCFILE $BACKUP/$FILE $DIRS
    fi

    ### Dump backup using FTP ###
    #Start FTP backup using ncftpput
    ncftpput -m -u"$FTPU" -p"$FTPP" $FTPS $FTPD/$NOW/ $BACKUP/*

    ### Find out if ftp backup failed or not ###
    if [ "$?" != "0" ]; then
     T=/tmp/backup.fail
     echo "Date: $(date)" >$T
     echo "Hostname: $(hostname)" >>$T
     echo "Backup failed" >>$T
     mail -s "BACKUP FAILED" "$EMAILID" <$T
     rm -f $T
    fi

  56. Hello,
    … is there a way to put a rule in the script to delete backups older than, let's say, 3 months? …for limited ftp space!

  57. I'm not an expert by any means, but I wanted a way to be informed of failures as well. The script above only notified on successes. So I removed the ncftp portion of the script and replaced it with this:

    ncftpput -E -u $FTPU -p $FTPP -m $FTPS $FTPD/$NOW $BACKUP/*
    

    I had to use the -E option because I can’t use passive mode at this time. Remove -E if you can use passive mode.

    I then modified the last portion of the script to include my text for when the backup failed. It looks like this

    ### Find out if ftp backup failed or not ###
    if [ "$?" == "0" ]; then
     rm -rf $BACKUP
     T=/tmp/backup.good
     echo "Date: $(date)">$T
     echo "Hostname: $(hostname)" >>$T
     echo "" >>$T
     echo "This email is to inform you that the latest backup was a success" >>$T
     mail  -s "[host.example.com] - Backup Successful" "$EMAILID" <$T
    else
     T=/tmp/backup.fail
     echo "Date: $(date)">$T
     echo "Hostname: $(hostname)" >>$T
     echo "" >>$T
     echo "This email is to inform you that the latest backup has failed" >>$T
     echo "" >>$T
     echo "Please investigate why the backup failed. The backup files are still available at $BACKUP" >>$T
     echo "" >>$T
     echo "Don't forget to delete this directory when done" >>$T
     mail  -s "[host.example.com] - Backup Failed" "$EMAILID" <$T
    fi
    rm -f $T
    
  58. If you want to delete old backups:

    In my case, I only wanted to keep backups around for 1 week. So every day, when this runs, it removes the backup from 7 days ago. This is very easy to accomplish. Just add this code before the ### Dump backup using FTP ### line:

    REMOVAL_DATE=$(date -d "1 week ago" +"%d-%m-%Y")

    Then add this right after you open NCFTP, like so:

    ncftp -u"$FTPU" -p"$FTPP" $FTPS<<EOF
    rmdir $FTPD/$REMOVAL_DATE

    You can change "1 week ago" as needed. "1 day ago" or "1 month ago" or "3 months ago" etc etc etc are all fine.
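
    (Editor's note: the relative phrases are a GNU coreutils date feature, interpreted by the -d option; they are unrelated to the FTP client used. A quick check:)

    ```shell
    # GNU date (Linux coreutils) parses relative English date strings;
    # BSD/macOS date does not -- use e.g. `date -v-1w` there instead
    date -d "1 week ago"   +"%d-%m-%Y"
    date -d "1 month ago"  +"%d-%m-%Y"
    date -d "3 months ago" +"%d-%m-%Y"
    ```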

    1. Can you expand on this? I can't find the resources. Will Linux interpret "1 day ago" vs. "1 month ago"?

      Is ncftp required? I installed VSFTP.

  59. Listen. After many (many) configurations and techniques (including the one above, which does not seem 100% portable to me),
    the best solution for me was to use duplicity along with automysqlbackup and the Amazon S3 service (totally cheap, but optional).
    In this way I can backup the entire database and all files using a 100% incremental backup in a really easy and transparent way.

    Just have a look here:

    http://duplicity.nongnu.org/

    I used rsync, rsnapshot and other stuff, but I never managed to get something really stable for all my servers like the solution I've found now.
    With duplicity I can automatically restore a little portion of my backup in no time and is really easy to use.

  60. Hello, your script is just what I need. Thanks! But I have a question: I have several databases on my host; how can I specify the single database that I need to back up?

  61. This script is the BEES KNEES !!! I have this running on 3 servers like an absolute charm and now I can sleep at night knowing it’s all taken care of.

    I tried Shane's suggestion above for removing older backup folders, but all I get is "Directory not Empty" back from ncftp. Will have to experiment a bit more with this.

    What I found to be a nice addition is to include the following command in your ftpbackup.sh script, just below the line ### Other Stuff ###

    Add the following:

    ### Backup DPKG Software List ###
    dpkg --get-selections > /etc/installed-software-dpkg.log

    Then you also have your list of all your installed applications and you can use that to restore them too as per this article: http://www.cyberciti.biz/tips/linux-get-list-installed-software-reinstallation-restore.html#comment-173345

  62. I had a problem last night and I hope someone can help with a solution. This script went rogue on one of my servers and did not complete but instead got into some horrible loop.

    This morning I started noticing errors on my web sites and eventually tracked that down to a lack of disk space. I then looked in the /tmp/ folder on the server and found about 200 backup.xxxx directories created by this script.

    What could have caused this script to have gone haywire like that?? I deleted the cron job, recreated it and reset it to run at midnight and will see if it happens again but it does worry me somewhat. The same script works perfectly on my other 2 servers.

    Is there some way to force this script to die and email me if it runs into an error of ANY sort ????

    Any help or suggestions would be greatly appreciated.

  63. I would like to back up a database running on Linux to a Windows server, where I would like to use Data Protection Manager for backing up the database. How do I script this to dump my database to a Windows folder and then put it on tape? Help

  64. I am getting an issue when it goes to save the mysql dump file

    NcFTP 3.2.2 (Aug 18, 2008) by Mike Gleason (http://www.NcFTP.com/contact/).
    Connecting to 137.219.74.64…
    FileZilla Server version 0.9.39 beta
    written by Tim Kosse ([email protected])
    Please visit http://sourceforge.net/projects/filezilla/
    Logging in…
    Logged on
    Logged in to 137.219.74.64.
    Usage: mkdir dir1 [dir2…]
    CWD successful. “/04-10-2011” is current directory.
    fs-i-04-10-2011-14h39m01s.tar.gz: 160.00 B 112.98 kB/s
    …mysql.04-10-2011-14:39:02.gz: ETA: 0:00 32.00/131.51 kB 133.55 MB/s Lost data connection to remote host: Broken pipe.
    mysql-mysql.04-10-2011-14:39:02.gz: 128.00/131.51 kB 7.95 MB/s
    put *: socket write error.
    /tmp/backup.3661/fs-i-04-10-2011-14h39m01s.tar.gz /tmp/backup.3661/mysql-mysql.04-10-2011-14:39:02.gz

    The first file is saved fine, the database dump is not.

    Any ideas?

  65. The answer is sitting right in front of me. The naming of the mysql files is different from the normal files: the time function used to build the mysql file names puts ":" in the time segment, which is not allowed as part of a file name on the server. I'm trying to figure out how to fix this apart from just copying the way the time is formatted for the other files, but I presume that would change the way the incremental backups are done.
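
    (Editor's note: the fix is to use a colon-free time format when building the MySQL dump file name; FileZilla Server runs on Windows, which rejects ":" in file names. A sketch, using the same style of stamp as the file-system archives -- the file name here is a hypothetical example:)

    ```shell
    # colon-free timestamp, safe in file names on Windows-hosted FTP servers
    NOW=$(date +"%d-%m-%Y-%Hh%Mm%Ss")
    FILE="mysql-mydb.$NOW.gz"    # hypothetical dump name
    echo "$FILE"
    ```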

  66. I got an error "syntax error near unexpected token 'do'". What could be the possible cause of this error message?

    Your reply will be greatly appreciated :)

  67. I keep getting the following error; I have copied the script three times now. I would also like to add that I had to add the following to the script to make the MySQL part work. It also looks like it is not doing the file system backup. For "DIRS" I have var/www

    Error: "[: 79: 0: unexpected operator"

    1. Ok, it is doing the file system backup, so that works. Also, "DIRS" is set to "/var/www", not "var/www" (that was a typo, sorry). So it is doing the backup but still failing because of this error. Also, why can't I uncompress the data? For example, I downloaded the FS file to my Mac and tried to extract it, but I got a permissions error.

  68. Hi,
    Your scripts work great, but i need to upload Mysql backup files to server via SMB.
    I have Apple TimeCapsule and i want to store all backups there.The device is in same LAN

  69. If your script doesn't work via cron, you may need to replace the ncftp command with the full path to ncftp. For me it is /usr/local/bin/ncftp,
    so the command line looks like:
    /usr/local/bin/ncftp -u$FTPU -p$FTPP $FTPS ….
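
    (Editor's note: an alternative to hardcoding the path is to extend PATH near the top of the backup script, since cron runs with a minimal environment. A sketch:)

    ```shell
    # cron's default PATH is usually just /usr/bin:/bin; extend it so
    # which(1) can find ncftpput wherever it was installed
    export PATH="$PATH:/usr/local/bin:/usr/local/sbin"
    NCFTP="$(which ncftpput)"
    echo "ncftpput resolved to: ${NCFTP:-not found}"
    ```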

  70. Hi Vivek
    Great site. Thanks for all the hard work.
    The way I read this script, it is not actually doing what you set out to do. It will do a weekly full backup fine. However, as I read your code, the increment section will endlessly increment off of the first incremental backup. You must base your increment file on the full (level 0) backup (using the -g option), and erase the increment file before each full backup to reset it. That will then do a level 1-6 increment on each incremental backup, and reset for the full backup, which creates a new increment file. Would this not be the actual code to do what you set out to do, or am I missing something?
    `
    if [ "$DAY" == "$FULLBACKUP1" ]; then
    rm $INCFILE
    FTPD="/home/user/backupdir/full"
    FILE="fs-full-$NOW.tar.gz"
    tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
    else
    i=$(date +"%Hh%Mm%Ss")
    FILE="fs-i-$NOW-$i.tar.gz"
    tar -g $INCFILE -zcvf $BACKUP/$FILE $DIRS
    fi
    `
    Regards,
    Thomas

  71. Hi vivek,
    The $FULLBACKUP1 variable in the above code is a red herring and should have remained $FULLBACKUP. I'm using a variation of this very helpful script to do a level 0 and then two weeks of level 1s before doing a level 0 again, and I left the extra variable in the code above by mistake when I cut and pasted to adapt it back to your script to post.
    Nonetheless, could you point out where I'm wrong, if I am, in saying that your script will endlessly increment off of the first incremental backup, never making reference to the full backups? I can't see how tar will increment off of the first full backup when you've set no increment file from it and never reset the increment file at each full backup; instead, the first incremental backup will be a full backup and the increments will be based off of that forever (level 0,1,2,3,4,5,6,7,8,9… on to infinity), which would be a nightmare to try to reconstitute. I'll very happily be wrong, but I can't see where I am.
    Regards,
    Thomas

    1. Yes, the script needs to be fixed. After level 0, it will go to level 1… and beyond, to infinity. What you can do is modify the script and create only 7 archives (one full and the rest incremental). On Sunday (full backup day), you can force a full backup either by removing $INCFILE before running tar, or by supplying the '--level=0' option. On the ftp server you will have files like:

      backup.full-level.0.tar.gz 
      backup.Mon-level.1.tar.gz  
      backup.Wed-level.3.tar.gz 
      ....
      ....
      .
      backup.Sat-level.6.tar.gz
      

      A sample script (this is just for dry run and to get you started):

      #!/bin/bash
      
      _fullday="Sun"
      _dirs="/etc /var/www/html"
      _snapshot="/root/snap.dat"
      _now=$(date +"%d-%m-%Y")
      _day=$(date +"%a")
      _lpath="/backup"
      
      _prefix="$_day"
      _incarg='--listed-incremental=/root/tarbackup.dat'
      
      # set empty to avoid compression #
      _compress=".gz"
      _tarargs=""
      _file=""
      _level=0
      
      _init(){
      	export PATH=$PATH:/usr/local/bin:/usr/local/sbin
      	[ ! -d "$_lpath" ] && mkdir -p "$_lpath"
      	[ "$_day" == "$_fullday" ] && _prefix="full" 
      	[ "$_compress" == ".gz" ] && _tarargs="-zcf" || _tarargs="-cf"
      
      	case $_day in
      		Mon) _level=1;;
      		Tue) _level=2;;
      		Wed) _level=3;;
      		Thu) _level=4;;
      		Fri) _level=5;;
      		Sat) _level=6;;
      	esac 
      
      	_file="$_lpath/backup.${_prefix}-level.${_level}.tar${_compress}"
      
      	case $_day in
      		Sun) _fullbackup;;
      		Mon|Tue|Wed|Thu|Fri|Sat) _incbackup $_day;;
      	esac 
      }
      _fullbackup(){
      	echo "Starting full backup @ level # $_level..."
      	echo "tar $_incarg --level=0 $_tarargs \"$_file\" \"$_dirs\""
      }
      
      _incbackup(){
      	echo "Starting incremental backup @ level # $_level..."
      	echo "tar $_incarg $_tarargs \"$_file\" \"$_dirs\""	
      }
      
      
      ## main logic ##
      _init
      

      Dry run output:

      Starting full backup @ level # 0...
      tar --listed-incremental=/root/tarbackup.dat --level=0 -zcf "/backup/backup.full-level.0.tar.gz" "/etc /var/www/html"
      Starting incremental backup @ level # 1...
      tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Mon-level.1.tar.gz" "/etc /var/www/html"
      Starting incremental backup @ level # 2...
      tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Tue-level.2.tar.gz" "/etc /var/www/html"
      Starting incremental backup @ level # 3...
      tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Wed-level.3.tar.gz" "/etc /var/www/html"
      Starting incremental backup @ level # 4...
      tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Thu-level.4.tar.gz" "/etc /var/www/html"
      Starting incremental backup @ level # 5...
      tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Fri-level.5.tar.gz" "/etc /var/www/html"
      Starting incremental backup @ level # 6...
      tar --listed-incremental=/root/tarbackup.dat -zcf "/backup/backup.Sat-level.6.tar.gz" "/etc /var/www/html"
      

      Feel free to modify the script as you need to add code for mysql and ftp. Hope this helps!

      Edit: I will post a tested script later on, but I'm damn sure the logic is on the correct path, as I've hardcoded tar to 7 days only.

  72. For the people getting this kind of error: "syntax error near unexpected token 'do'"
    The solution is simple. Change the first line of the script from #!/bin/sh to #!/bin/bash.

    Tested on Debian Squeeze.
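
    (Editor's note: the underlying cause is that '==' inside [ ] is a bash extension; Debian's /bin/sh is dash, which only supports the POSIX '='. Besides changing the shebang, the script can be made portable by using '=' for comparisons, as in this sketch:)

    ```shell
    # POSIX-compliant string comparison -- works under both bash and dash
    DAY="Sun"
    FULLBACKUP="Sun"
    if [ "$DAY" = "$FULLBACKUP" ]; then
        echo "full backup day"
    fi
    ```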

  73. I've used this as the basis for backups on a few of my servers, which had some restrictions. The first didn't support the "ncftp" command, so I had to change to "ftp". The other didn't support "ncftp" or "ftp", so I had to change to CURL.

    If you want to use CURL, just comment out the ftp/ncftp command lines, and add:

    curl -T $BACKUP/$FILE ftp://$FTPS/$FTPD/ --user "$FTPU":"$FTPP"
    curl -T $FILE2 ftp://$FTPS/$FTPD/ --user "$FTPU":"$FTPP"

    And set the MySQL backup variable to $FILE2 instead of $FILE

    HTH

  74. I’m getting error

    ————
    [: 47: Fri: unexpected operator
    ————-
    Even if I use Sunday or Monday, I get the same error with those days,
    and the script makes an incremental backup all the time.

  75. Hi,
    I want to back up only the mysql database, so I have removed the code for the web folder part. But I want to keep the full backup and incremental backup options. How can I do this?

  76. Hi there! Personally I don't use any scripts to back up mysql databases; I prefer a GUI tool like dbForge Studio for MySQL, but it also allows making backups through a command-line interface.

  77. I found your script very useful and I have given it a try.
    It works, but what about restoring the compressed file?
    I tried to extract it on a different OS (Mac) to see what had been saved, but I was not able to do so (permission error issues).
    Any idea or help?
    Thanks

  78. Thank you for the script. Can you please suggest how to delete the older backups on the FTP drive? I want to keep only 15 days of backups on the FTP server, and I want to delete the backups daily on my db server once they are backed up to the FTP server.
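
    (Editor's sketch, untested against a live server: combining Shane's relative-date trick above with lftp's recursive rm. The 15-day cutoff and the dated directory layout ($FTPD/dd-mm-yyyy/) are assumptions matching the article's script; this dry run only prints the command it would run:)

    ```shell
    # compute the name of the dated backup directory that is now 15 days old
    # (GNU date; the directory layout matches the article's $FTPD/$NOW scheme)
    OLD=$(date -d "15 days ago" +"%d-%m-%Y")

    # lftp can remove a remote directory tree with rm -r;
    # dry run: echo the command instead of executing it
    echo lftp -e "rm -r $FTPD/$OLD; bye" -u "$FTPU,$FTPP" "$FTPS"
    ```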

  79. If you have a secure FTP (SFTP) server, put files with this line:

    lftp -d -e "mirror -R $BACKUP $FTPD; bye" -u$FTPU,$FTPP sftp://serverName:22

  80. It doesn't work? Why?

    ftp-backup-script.sh: line 46: syntax error near unexpected token `$'do\r''
    'tp-backup-script.sh: line 46: `do
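
    (Editor's note: the $'do\r' in that message means the script was saved with DOS/Windows line endings; the shell is seeing a carriage return at the end of each line. A quick reproduction and fix, using a throwaway demo file:)

    ```shell
    # recreate the problem: a script saved with CRLF (DOS) line endings
    printf 'if true\r\nthen\r\n  echo ok\r\nfi\r\n' > /tmp/crlf-demo.sh

    # strip the trailing carriage returns (dos2unix does the same job)
    sed -i 's/\r$//' /tmp/crlf-demo.sh

    sh /tmp/crlf-demo.sh    # now runs cleanly and prints: ok
    ```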

Leave a Comment