Extending Perfect Server - Debian Squeeze [ISPConfig 3] - Page 6

Submitted by 8omas (Contact Author) (Forums) on Fri, 2011-03-11 00:19. ::

10. Clients' BackUps

The contents of the script were changed on 29-03-2011. Please update.

This script WILL NOT work correctly with ISPConfig 3.0.5 and above. It would require extensive changes there and is not recommended. Please use ISPConfig's new built-in way of backing up instead.

The following script is an easy way to back up your clients' data and databases into their website folders. As you may know, in ISPConfig 3 each client has a folder of the form /var/www/clients/clientXY that contains all of his web sites. The script backs up each website into its web folder, along with the client's databases, so that the client can download them. If a client has more than one database, all of his databases are backed up into his first site (based on web ID). The script also keeps the last 3 days of those files, and the last 3 Sundays, for admin use in a directory of your choice (the default is /var/backup/sites).
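With the default settings, after a few runs the result looks roughly like this (the client, site, and database names here are made up for illustration):

```
/var/www/clients/client1/web1/c1mydbBU.gz           <- latest DB dump (client-readable)
/var/www/clients/client1/web1/example.comBU.tar.gz  <- latest site archive
/var/backup/sites/c1mydb.20110311.gz                <- dated admin copies
/var/backup/sites/example.com.20110311.tar.gz
```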

This is a very simple script. For a more advanced solution, look at this post.

Create the script, make it executable and edit it:

cd /root/scripts/
touch mybackup.sh
chmod 0700 mybackup.sh
nano mybackup.sh

The contents must be the following (change the variables ispUSER, ispPASS, ispHOST, DEST and SITES to fit your needs):

#!/bin/bash
# Shell script to back up MySQL databases and clients' websites
#
# Last updated: March - 2011
# --------------------------------------------------------------------
# This is a free shell script under GNU GPL version 2.0 or above
# Copyright (C) 2011 iopen.gr
# Feedback/comment/suggestions : http://iopen.gr
# --------------------------------------------------------------------
#
# INTENDED for ISPConfig 3.0.x and above
#
# This script will back up every web folder (web, stats, cgi, etc.)
# of every client, along with all the client's DBs
# The backups will be placed in the website client's folder
# The script will keep the current and the 2 previous backups
# It will also keep the last 3 Sundays
# --------------------------------------------------------------------

# Database credentials. Use a DB user with full read access or use the root user
ispUSER="root" # DB user
ispPASS="---yourpass---" # user's password
ispHOST="localhost" # Hostname

CURDIR="$(pwd)"
# Variables with full path to binaries
MYSQL="$(which mysql)"
MYSQLDUMP="$(which mysqldump)"
CHOWN="$(which chown)"
CHMOD="$(which chmod)"
GZIP="$(which gzip)"
TAR="$(which tar)"

# Your Server's Main Backup Directory
DEST="/var/backup"

# Sites (ONLY) backup directory in your Main Backup Directory
SITES="$DEST/sites"


# Variables with dates in yyyymmdd format
TODAY=$(date +%Y%m%d)
YESTERDAY=$(date -d '1 day ago' +%Y%m%d)
BACK2=$(date -d '2 days ago' +%Y%m%d)
BACK3=$(date -d '3 days ago' +%Y%m%d)
BACK22=$(date -d '22 days ago' +%Y%m%d)

[ -d "$SITES" ] || mkdir -p "$SITES"

# Give only root access to the backups in this script's folder
$CHOWN 0:0 -R "$SITES"
$CHMOD 0700 "$SITES"

# --------------------------------------------------------------------------
# Remove previous (current) backups of the client directory
# The backups are in the form :
# *BU*gz
# -------- CAUTION ---------
# Do not store any other file in this form in the clients directory
# --------------------------------------------------------------------------
echo "-------------------------------------------------------------"

QRY="use dbispconfig; SELECT web_domain.system_user, web_domain.system_group, \
web_domain.document_root, web_domain.domain FROM web_domain WHERE \
web_domain.type!='alias' AND web_domain.system_user IS NOT NULL AND (LENGTH(web_domain.redirect_path)<5 OR web_domain.redirect_path IS NULL) ;"

echo "$QRY" | mysql -u $ispUSER -h $ispHOST -p$ispPASS | while read -r line
do # the first line read is the column header; the inner loop reads the records:
while read -a col # ${col[0]}=system user / folder name, ${col[1]}=system group,
do # ${col[2]}=path to website, ${col[3]}=domain name
echo " CLEANING OLD BACKUPS in ${col[2]} folder "
for delfile in ${col[2]}/*BU*gz ;
do [ -f "$delfile" ] && rm "$delfile";
done
done
done

# --------------------------------------------------------------------------
# Remove anything that is 22 days old and has the form:
# *[date 22 days old]*gz
# from server's $SITES directory
# --------------------------------------------------------------------------
echo "-------------------------------------------------------------"
echo " CLEANING OLD BACKUPS in SITES folder "
for delfile in $SITES/*$BACK22*gz ;
do [ -f "$delfile" ] && rm "$delfile";
done
echo "-------------------------------------------------------------"
echo " "
echo " "

# --------------------------------------------------------------------------
# For each client, backup his database in his website folder
# For client with multiple sites backup all dbs in his first site
# Furthermore copy today's backup in the server's $SITES directory
# Remove the backup that is older than 3 days from server's $SITES directory
# Keep the last 3 Sundays
# --------------------------------------------------------------------------

QRY="use dbispconfig; SELECT web_database.database_name , web_database.database_user ,\
 min(web_domain.system_user) as muser, web_domain.system_group, min(web_domain.document_root) as mpath, \
web_domain.domain FROM web_database, web_domain WHERE web_database.sys_userid=web_domain.sys_userid \
AND web_database.sys_groupid=web_domain.sys_groupid AND web_domain.type='vhost' \
AND web_domain.system_user IS NOT NULL AND (LENGTH(web_domain.redirect_path)<5 OR web_domain.redirect_path IS NULL) \
GROUP BY web_database.database_name , web_database.database_user, web_domain.system_group;"

echo "$QRY" | mysql -u $ispUSER -h $ispHOST -p$ispPASS | while read -r line
do # the first line read is the column header; the inner loop reads the records:
while read -a col # ${col[0]}=dbname, ${col[1]}=dbuser, ${col[2]}=system user / folder name,
do # ${col[3]}=system group, ${col[4]}=path to website
echo " DB: "${col[0]}
echo "-------------------------------------------------------------"
echo "Backing Up DB:" ${col[0]} "in :" ${col[4]}/${col[0]}BU.gz
$MYSQLDUMP -u $ispUSER -h $ispHOST -p$ispPASS -c --add-drop-table --add-locks \
--quick --lock-tables ${col[0]} | $GZIP -9 > ${col[4]}/${col[0]}BU.gz
cp ${col[4]}/${col[0]}BU.gz $SITES/${col[0]}.$TODAY.gz
if [ `date -d '3 day ago' +%u` -ne 7 ] # if 3 days ago is not Sunday
then #remove the backup
[ -f $SITES/${col[0]}.$BACK3.gz ] && rm $SITES/${col[0]}.$BACK3.gz
fi
$CHOWN ${col[2]}:${col[3]} ${col[4]}/${col[0]}BU.gz
$CHMOD 0660 ${col[4]}/${col[0]}BU.gz
echo "-------------------------------------------------------------"
echo " "

done
done

# --------------------------------------------------------------------------
# For each client, backup his sites in his website folder
# Furthermore copy today's backup in the server's $SITES directory
# Remove the backup that is older than 3 days from server's $SITES directory
# Keep the last 3 Sundays
# --------------------------------------------------------------------------



QRY="use dbispconfig; SELECT web_domain.system_user, web_domain.system_group,\
 web_domain.document_root, web_domain.domain FROM web_domain WHERE \
web_domain.type!='alias' AND web_domain.system_user \
IS NOT NULL AND (LENGTH(web_domain.redirect_path)<5 OR web_domain.redirect_path IS NULL) ;"

echo "$QRY" | mysql -u $ispUSER -h $ispHOST -p$ispPASS | while read -r line
do # the first line read is the column header; the inner loop reads the records:
while read -a col # ${col[0]}=system user / folder name, ${col[1]}=system group,
do # ${col[2]}=path to website, ${col[3]}=domain name
echo " "
echo " Site:" ${col[3]}
echo "-------------------------------------------------------------"
echo "Backing Up site: " ${col[2]}/ "in :" ${col[2]}/${col[3]}BU.tar.gz
cd ${col[2]}
sudo -u ${col[0]} $TAR -czf ${col[2]}/${col[3]}BU.tar.gz .
cp ${col[2]}/${col[3]}BU.tar.gz $SITES/${col[3]}.$TODAY.tar.gz
if [ `date -d '3 day ago' +%u` -ne 7 ] # if 3 days ago is not Sunday
then #remove the backup
[ -f $SITES/${col[3]}.$BACK3.tar.gz ] && rm $SITES/${col[3]}.$BACK3.tar.gz
fi
$CHOWN ${col[0]}:${col[1]} ${col[2]}/${col[3]}BU.tar.gz
$CHMOD 0660 ${col[2]}/${col[3]}BU.tar.gz

echo "-------------------------------------------------------------"
echo " "
done
done
cd $CURDIR
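The Sunday check in the script relies on GNU date's %u format (ISO weekday: Monday = 1 ... Sunday = 7). A quick way to convince yourself the retention logic works is to test it against known dates (GNU date is assumed, as shipped with Debian):

```shell
#!/bin/sh
# %u prints the ISO weekday: 1 = Monday ... 7 = Sunday.
# 2011-03-06 was a Sunday, so a backup dated that day survives the
# "-ne 7" check; 2011-03-07 (a Monday) is removed once it is 3 days old.
echo "2011-03-06 -> weekday $(date -d '2011-03-06' +%u)"
echo "2011-03-07 -> weekday $(date -d '2011-03-07' +%u)"
```

When `date -d '3 day ago' +%u` prints anything other than 7, the dated copy from 3 days ago is deleted; Sunday copies survive until the 22-day cleanup removes them.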

You can run the backup script by executing:

/root/scripts/mybackup.sh

or you can add it as a cron job (e.g. every day at 22:30):

crontab -e

and append the following line:

30 22 * * * /root/scripts/mybackup.sh > /dev/null 2>> /var/log/backup.log

 

Final Notes

Please feel free to comment on anything in this tutorial in an appropriate thread in the HowtoForge forums. Useful comments will be included in a future updated version.

This is the first version and, as careful as I was, the tutorial may still contain errors. Please let me know about them, so I can correct them as soon as possible.


Submitted by AceLine (registered user) on Sun, 2013-04-07 15:35.

Hi,

at first: thnx. for this wonderful tutorial!!!

The only problem I've got at this point is that when I execute the mybackup.sh file I get this error:

ERROR 1054 (42S22) at line 1: Unknown column 'web_database.database_user' in 'field list'

I looked into the database and the database_user field really is missing... Is this because I'm using a newer version of ISPConfig 3? Can you adapt the script to the newer version?

Thnx. again for your good work.

Best regards, Ingmar

Submitted by Jonas Lateur (not registered) on Mon, 2012-04-23 15:51.
when i run /root/scripts/mybackup.sh, i get follow error

       Site: server.ttb-ltd.eu
-------------------------------------------------------------
Backing Up site:  /var/www/clients/client1/web1/ in : /var/www/clients/client1/web1/server.ttb-ltd.euBU.tar.gz
tar (child): /var/www/clients/client1/web1/server.ttb-ltd.euBU.tar.gz: Cannot open: Permission denied
tar (child): Error is not recoverable: exiting now
/bin/tar: /var/www/clients/client1/web1/server.ttb-ltd.euBU.tar.gz: Cannot write: Broken pipe
/bin/tar: Error is not recoverable: exiting now
cp: cannot stat `/var/www/clients/client1/web1/server.ttb-ltd.euBU.tar.gz': No such file or directory
/bin/chown: cannot access `/var/www/clients/client1/web1/server.ttb-ltd.euBU.tar.gz': No such file or directory
/bin/chmod: cannot access `/var/www/clients/client1/web1/server.ttb-ltd.euBU.tar.gz': No such file or directory
-------------------------------------------------------------

Submitted by bodri (registered user) on Sat, 2012-01-21 02:19.
In the backup script, mysqldump's '--all' option is deprecated and restoring from the backup won't work! Use '--create-options' instead.
Submitted by webmaster eddie (not registered) on Sat, 2011-12-17 07:50.

I tried following your instructions - just to harden the server using iptables and the DDoS protection - and the backup scripts - and I could no longer FTP. I only have a dynamic-IP wifi connection to the net, and it was blocking me, disconnecting me from FTPing files after 1 second, it seems... So I reversed every single thing I did following your instructions, and now I cannot FTP with any program at all - I can connect, but not a single file is allowed to be transferred - I get a "553 Permission denied" error. Can you help me? I have checked everything - the ports in the ISPConfig 3 panel are fine, etc.

 

Also, I never got the backup scripts to work at all - so I removed them. I do thank you for the 2 MySQL tuning scripts, which work and seem to help.

Submitted by erosbk (registered user) on Thu, 2011-05-05 06:17.

After downloading the file, you must:

chmod +x /usr/share/roundcube/plugins/fail2ban.php
touch /var/log/roundcube/userlogins
chown www-data:www.data /var/log/roundcube/userlogins

 

I don't know if chmod +x is necessary for fail2ban.php (I was trying to make it work, I am too tired to test it lol xD) but you MUST chown userlogins; if not, apache2 (or roundcube through apache2) will not be able to write to the file (access denied).

 Regards

 

PD: Perfect howto xD

Submitted by 8omas (registered user) on Sun, 2011-06-26 19:04.

You just need the following command:

 chown www-data:www-data /var/log/roundcube/userlogins

I updated the tutorial. Thx

Submitted by tuxic (registered user) on Thu, 2011-04-14 14:07.

I followed the client backup tutorial, however I'm getting an error after running it as root.

The backups seem to run fine, but I receive the following error, and this keeps repeating in the console:

 /bin/tar: ./tmp/sess_5jrh2r2d5m8lhtaq7sf04mo3n7: Cannot open: Permission denied