HowtoForge Forums | HowtoForge - Linux Howtos and Tutorials


almere 18th March 2013 14:49

Backup script ( no question, just script )
Hey guys.

I have made a backup script for my machines. Maybe you will like it :)

It creates a compressed archive for each directory and puts them all into a single *.tar archive. Then it transfers that *.tar file to an FTP server.

I have 3 servers, each around 1.5 TB. A full backup and transfer takes about 1.5 hours per server. I also limited the tar and gzip commands to 20% CPU usage, because otherwise they loaded my CPU to 99% :D
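The 20% cap itself is not shown in the script below; one way to do it is with the cpulimit utility (an assumption on my part, not something named in the post). A minimal sketch with a fallback to nice, using throwaway /tmp paths:

```shell
#!/bin/sh
# Sketch: cap an archiving run's CPU usage. cpulimit (if installed) holds the
# process at roughly the given percentage of one core; nice only lowers its
# scheduling priority. Paths here are throwaway demo paths, not from the post.
BACKUP=/tmp/backup-demo
SRC=/tmp/backup-demo-src
mkdir -p "$BACKUP" "$SRC"
echo "sample data" > "$SRC/file.txt"

if command -v cpulimit >/dev/null 2>&1; then
    cpulimit -l 20 -- tar -cf "$BACKUP/src.tar" -C "$SRC" .
else
    nice -n 19 tar -cf "$BACKUP/src.tar" -C "$SRC" .
fi
```
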

As security of my hosting comes first, I back up almost everything. If you are not as paranoid as I am, you can drop directories like root, sys, boot, lib, lib64, bin, etc.

Here it is:

### System Setup ###
### Backup directory for temporary file storage (set it to your own path).
BACKUP=/backups

### FTP ###
### Your FTP server and credentials (replace with your own values).
FTPS=ftp.example.com
FTPU=username
FTPP=password
FTPD=/

### Binaries ###
TAR="$(which tar)"
GZIP="$(which gzip)"
FTP="$(which ftp)"

### Today in YYYYMMDD format ###
NOW=$(date +%Y%m%d)

### Create tmp dir ###
mkdir -p "$BACKUP/$NOW"
### You can add or remove directories here; each one is packed into its own archive.

for DIR in home var etc root boot opt usr sys sbin lib lib64 bin; do
    $TAR -cf "$BACKUP/$NOW/$DIR.tar" "/$DIR"
done


### Pack the per-directory archives into one file and gzip it ###
ARCHIVED="$BACKUP/backup-$NOW.tar"
$TAR -cf "$ARCHIVED" -C "$BACKUP" "$NOW"
$GZIP "$ARCHIVED"
DUMPFILE="$ARCHIVED.gz"

### ftp ###
$FTP -n "$FTPS" <<EOF
quote USER $FTPU
quote PASS $FTPP
cd $FTPD
binary
lcd $BACKUP
put backup-$NOW.tar.gz
quit
EOF

### deleting temp files ###
rm -rf "$BACKUP/$NOW"
rm -f "$DUMPFILE"
echo "Backup finished and transferred"

Have fun.

UPD: see the post below for the full and complete backup structure.

almere 19th March 2013 09:17

So, a small update... Now I also want to delete files that are older than 7 days, because my 8 TB backup server is full :D :D
Let's start:

Open server.php (/usr/local/ispconfig/server/server.php)
and find this line:

$server_config = $app->getconf->get_server_config($conf['server_id'], 'server');
(somewhere around line 745)

Before that line, paste the following code (of course replacing some values with your own: the FTP credentials, the number of days, etc.):


//delete old server backups
/**
 * Delete old backups from a remote FTP server.
 * resource $connect - FTP connection
 * int $ttl - age in days; backups older than $ttl days are removed
 * string $source - directory where all the backups are stored
 * Make sure the server id is available in $conf.
 */
function deleteOldFiles($connect, $ttl, $source) {
    global $conf;
    $source = rtrim($source, '/');
    $files = ftp_nlist($connect, $source == '' ? '/' : $source);

    foreach ($files as $f) {
        $server_id = $conf['server_id'];
        if ($server_id < 10) { $server_id = '0'.$conf['server_id']; }
        if (preg_match('#server'.$server_id.'-files-(.*?)\.tar\.gz#', $f, $m)) {
            $t = $m[1];
            if (date('Ymd') - $t > $ttl) {
                ftp_delete($connect, $source.'/'.$f);
            }
        }
    }
}

$server = 'hostname';
$ftp_user_name = 'username';
$ftp_user_pass = 'password';
$mode = FTP_BINARY;//do not touch it

$connection = ftp_connect($server);
if(!$connection) { $app->log('Connection attempt failed for the remote FTP backup server!', LOGLEVEL_ERROR); }

$login = ftp_login($connection, $ftp_user_name, $ftp_user_pass);
if (!$login) { ftp_close($connection); $app->log('Login attempt failed for the remote FTP backup server!', LOGLEVEL_ERROR); }

ftp_pasv($connection, true); //enable passive mode

deleteOldFiles($connection, 7, '/'); // delete files older than 7 days

$app->log('Deleting old backups from the remote FTP backup server is done!', LOGLEVEL_DEBUG);

Then log in via SSH and run:

crontab -e

and add a line like this:

03 00 * * * /backups/ > /dev/null 2>> /var/log/ispconfig/cron.log
/backups/ - replace this with the directory and file name where you saved the FTP backup script from the previous post.

Done. Now your server will be fully backed up to a remote FTP server, and all files older than N days will be deleted.

Why do I use both bash and PHP?
- I do not know bash that well, especially for regular expressions etc. It would be great if somebody could translate my PHP to bash.
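Since the author asks for a bash translation: here is a rough sketch of the deletion logic in plain shell, under the assumption that the backups sit in a local directory (for a remote FTP server you would emit delete commands into the ftp heredoc instead). File names and paths are illustrative, and the cutoff uses GNU date syntax:

```shell
#!/bin/sh
# Delete backups whose YYYYMMDD stamp in the file name is older than TTL days.
# Demo setup: one old and one current file in a throwaway directory.
TTL=7
SOURCE=/tmp/backup-rotation-demo
mkdir -p "$SOURCE"
touch "$SOURCE/server01-files-20130310.tar.gz"
touch "$SOURCE/server01-files-$(date +%Y%m%d).tar.gz"

CUTOFF=$(date -d "$TTL days ago" +%Y%m%d)   # GNU date syntax

for f in "$SOURCE"/server*-files-*.tar.gz; do
    stamp=$(basename "$f" | sed 's/.*-files-\([0-9]\{8\}\)\.tar\.gz/\1/')
    # Calendar-correct comparison (the PHP version subtracts the raw numbers,
    # which misbehaves across month boundaries).
    if [ "$stamp" -lt "$CUTOFF" ]; then
        rm -f "$f"
    fi
done
```
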

Good luck & have fun ;)

florian030 19th March 2013 10:09


Quote: "Done. Now your server will be fully backed up to a remote FTP server and all files older than N days will be deleted."

I prefer numbered backups instead of adding the date to each backup.

Instead of server01-files-$NOW.tar.gz I use something like server01-files-$DOY.tar.gz:


DOY1=`date +%j`
DOY=`expr $DOY1 % 10` # for 10 Backups (0..9).

No need to delete old files, they are just overwritten.
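The two lines above expand to a rotating slot; a quick sanity check of that arithmetic (the server01 name is just an example from the post):

```shell
#!/bin/sh
# Day-of-year modulo 10 yields a slot 0..9 that repeats every 10 days,
# so each upload simply overwrites the slot from 10 days earlier.
DOY1=$(date +%j)           # day of year, 001..366
DOY=$(expr "$DOY1" % 10)   # slot 0..9
echo "server01-files-$DOY.tar.gz"
```
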

almere 19th March 2013 10:15

Thank you for your post :p

In my case I need to store a backup every day and delete it once it is older than 7 days. I think it is more readable, for the system and for the user, to have it like I did: you have the server ID and the day the backup was made. If you run into a problem with your server after an update, it is easier to restore files with the date in the file name than to guess which date a backup was made on (yeah, I know about the date in the file metadata, but it is always possible that the file gets rewritten or changed... you never know).

florian030 19th March 2013 10:23

You're welcome. ;)

I'm running my script each day too, and I never have more than 10 backups in the backup space.

Personally, I don't like too much code in a simple backup script. Instead of delete-and-upload I just prefer upload-with-overwrite. :)

tahunasky 27th April 2013 03:18

I do backups every weekday, plus a weekly backup kept for 4 weeks. That way I only ever have 8 backup files, but my backups go back a month, and I can even do a monthly backup if needed. Everything gets overwritten, so there is no need to delete anything, and the weekly backups are stored off-site.

backup-daily-monday.tar to backup-daily-thursday.tar, backup-weekly-01.tar to backup-weekly-04.tar
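The slot names described above can be generated mechanically; a hedged sketch (file names are illustrative, and the weekly slot here cycles by week-of-year rather than week-of-month):

```shell
#!/bin/sh
# Derive a daily and a weekly slot name. The daily name overwrites itself
# every week; the weekly slot cycles over 4 values via week-of-year modulo 4.
DAY=$(date +%A | tr '[:upper:]' '[:lower:]')   # monday, tuesday, ...
WEEK=$(expr "$(date +%U)" % 4 + 1)             # 1..4
echo "backup-daily-$DAY.tar"
echo "backup-weekly-0$WEEK.tar"
```
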

I also run a script on a backup server that has a wake-on-BIOS function. Every morning at 1am the server turns on and runs through the list of other servers to check whether they are up. If they are, it syncs with them (emails/websites/cloud storage area), copies over the backup file from that night, and turns itself off. If it finds a server that is down, it configures itself to replace it (changing IP/hostname etc.) and then emails me to say what has happened. Then I just have to send my mum or sister an email to swap out the downed server with a pre-configured spare, or they can swap out a faulty HD with a pre-configured spare. I can then start the replacement server (with wake-on-LAN) when I am ready, configure it remotely, and restore files etc. from the backup server that does the syncing.

By doing this I have only had to do one restore from backup in the last couple of years. It is particularly useful for me, as I spend a lot of the year travelling and living in other countries and can be away from home for 9 months or more at a time.

Fulike 26th May 2013 11:18

incremental backup
1 Attachment(s)
I downloaded Ioannis Sannos's backup script and rewrote it.
My version does weekly incremental backups of websites and e-mail accounts.

My script is in Hungarian :)
(It is a gzipped tar.)
