ISPConfig 3 - back-res, a Backup and Restore script

Discussion in 'Plugins/Modules/Addons' started by go0ogl3, Dec 6, 2009.

  1. go0ogl3

    go0ogl3 New Member

    Dots...

    I've removed the space character after "/root" in the downloadable script, so nobody will run into this issue any more.
    Newer versions of mc display trailing spaces after your last edited content as dots.
    If you want to see this, just add one or more spaces after that "/root" entry, save the file, then run
    Code:
    cat back-res | grep "/lib /lib64 /root"
    and see if there are dots... :)

    Thank you for your time!
     
  2. skoena

    skoena Member

    Is it a lot of work to change the script so it can also be used as a remote backup script?
     
  3. go0ogl3

    go0ogl3 New Member

    Remote Backup

    Hello. If you mount a remote location before running the backup script, and the backup dir is the remote mounted one, then you can use it as a remote backup script.
    You can use samba (cifs) for remote windows shares, ssh, or anything you can think of and can be mounted, even remote internet drives.

    I'm in the process of testing with remote FTP, but I'm not sure whether it would be better to make a separate script for this or to integrate it into the backup script.
    For now I'm using 2 separate scripts, partly because the script which does the FTP transfer compares the files first and then does something like a sync, which is not needed here... so for now I'm not sure.

    The simplest way for now is to use a remotely mounted dir for backup. For more info see: Linux Userland Filesystem
     
  4. vaio1

    vaio1 ISPConfig Developer

    Hi go0ogl3,

    First of all thanks for your script. I have 3 production server boxes and one backup server.

    I have installed ISPConfig in the Multiserver setup system. So I have different services for each server.

    1. Server A
    --->Web Server
    --->SVN Server

    2. Server B
    ---> Database Server

    3. Server C
    ---> Mail Server

    4. Server D
    ---> Backup Server

    Is there a way to send the backup request from the backup server to every other server? I would like to run the script from the backup server and store all the backup files and directories on the backup server.

    Thanks
     
  5. Toucan

    Toucan New Member

    Cron runs the backup on each of my servers and writes it to the NAS I have mounted on each.

    All you would need to do is mount your backup server instead.

     
  6. vaio1

    vaio1 ISPConfig Developer

    So, each server has a copy of the script, right?

    I don't understand how I have to set this variable on each server:

    Code:
    BACKUPDIR="/bck/$COMPUTER" 
    
     
  7. Toucan

    Toucan New Member

    Yes, the same script runs on each of the servers, at different times.

    The BACKUPDIR variable dictates where the backup is saved to.

    On my server I have a directory that mounts to the NAS elsewhere on the network.

    So my BACKUPDIR = /mnt/backup/$COMPUTER

    The $COMPUTER variable holds the name of the computer being backed up.
     
  8. go0ogl3

    go0ogl3 New Member

    Multiple servers backing up to a dedicated backup server

    Hello,

    The way I do it is to have a copy of the script on each of the servers and to create a dir for every server on the backup server. For example:
    1. on Server D I create /backup/Server_A, /backup/Server_B, /backup/Server_C
    2. mount the dir /backup/Server_A/ on Server A to the dir /backup
    3. mount the dir /backup/Server_B/ on Server B to the dir /backup
    4. mount the dir /backup/Server_C/ on Server C to the dir /backup
    5. define BACKUPDIR="/backup/$COMPUTER" on servers A, B and C (you could use BACKUPDIR="/backup", but it's useful to give your backups a good name)
    6. set up the cron jobs on the servers
    6. set up the cron jobs on the servers
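    The mount-and-set steps above might be sketched like this (a sketch only; the NFS export, host names and paths are assumptions, and any mountable filesystem works just as well):

```shell
# On the backup server (Server D) -- one directory per client:
#   mkdir -p /backup/Server_A /backup/Server_B /backup/Server_C
#   (export them via NFS, samba, sshfs -- whatever you prefer)
#
# On each client, e.g. Server A, mount its directory from Server D:
#   mount -t nfs server_d:/backup/Server_A /backup
#
# Then point the script's BACKUPDIR at the mount on every client:
COMPUTER=$(hostname -s)            # short host name of this server
BACKUPDIR="/backup/$COMPUTER"      # lands on Server D through the mount
echo "$BACKUPDIR"
```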

    Good luck!
     
  9. fordwrench

    fordwrench New Member

    I have a server running Debian Lenny and ISPConfig, installed via the Perfect Server howto.

    I am trying to migrate to a new server I have created with Debian Squeeze via the Perfect Server howto.

    I make my backup and when I try to restore I get the following:



    root@srv1:~/scripts# sh /root/scripts/backres.sh db all 2011-07-16
    date: invalid date `07/'
    /root/scripts/backres.sh: line 448: [: : integer expression expected

    You want to restore db all to date 2011-07-16.

    Please input "yes" if the above is ok with you and press [ENTER]: yes

    Restoring all mysql databases from date 2011-07-16 to local server:
    touch: cannot touch `/tmp/tmpbck/datestart': No such file or directory
    touch: cannot touch `/tmp/tmpbck/dateend': No such file or directory
    mysql: unknown option '-x'
    mysql: unknown option '-x'
    mysql: unknown option '-x'
    mysql: unknown option '-x'
    mysql: unknown option '-x'
    mysql: unknown option '-x'
    mysql: unknown option '-x'
    mysql: unknown option '-x'
    mysql: unknown option '-x'
    mysql: unknown option '-x'
    All restore jobs done!
    Database all restored to date 2011-07-16!
    /root/scripts/backres.sh: line 573: /tmp/tmpbck/maildata: No such file or directory


    Can you please give me some insight as to what the problem might be.
     
  10. Toucan

    Toucan New Member

    Hi go0ogl3,

    I've been using the script for a long time on one server and now also have it running on a remote VM server.

    I'm not that great with bash. Is there a simple line I could add to scp whatever files have just been written over to my remote server?

    Like:
    Code:
     scp full_var-2011-08-16.tar.bz2 root@myip:/tmp
    
    but obviously a bit more dynamic!
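    A slightly more dynamic version might look like this (a sketch; the destination "root@myip:/tmp" comes from the one-liner above, while the helper name and BACKUPDIR value are assumptions -- it also assumes key-based ssh login is already set up):

```shell
#!/bin/bash
# Hypothetical add-on, not part of back-res itself: after the backup run,
# push every archive stamped with today's date to a remote host.

# print the archives under $1 whose file name contains the date stamp $2
todays_archives() {
    find "$1" -name "*$2*.tar.bz2" 2>/dev/null
}

BACKUPDIR="/backup/$(hostname -s)"   # wherever back-res writes its files
for f in $(todays_archives "$BACKUPDIR" "$(date +%F)"); do
    scp "$f" root@myip:/tmp          # same destination as the example above
done
```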



    This script has saved my skin before!
     
  11. il.manuel

    il.manuel New Member

    hi go0ogl3, and thanks for your great job!

    I have a few problems, related to the server rather than the script.

    My scenario: a VPS and an FTP backup server on a remote machine with a 10GB quota on it.

    I have mounted the remote FTP server using curlftpfs (http://curlftpfs.sourceforge.net/) and everything seems to be OK.

    First problem: df shows me the entire disk space on the remote server (7.5TB!) and not my quota (10GB). The max percentage calculation has no way to work correctly, and the FTP space will fill up with no way to check it.

    Second problem: I think curlftpfs does not support appending to a file ('>>'). I get a lot of these errors

    Code:
    ./backupres: line 236: /backup/<<HOSTNAME>>/log/backup.log: Operation not supported
    when it tries to update the log file.

    I'm thinking about an alternative, but I can't find a solution (other than manually deleting old backups!!!)

    Third problem :)

    It does not send me an email at the end of the script. /tmp/tmpbck/maildata exists and contains every log entry.
     
    Last edited: Aug 23, 2011
  12. flyingscubakev

    flyingscubakev New Member HowtoForge Supporter

    Just a quick question from a relative newbie.
    I've followed The Perfect Server - ISPConfig 3 on openSUSE and everything is working. I've now loaded this backup script and it seems to work OK. My question: since the Perfect Server install has me disable mail, is there a way to mail the logs to me using whatever the server is configured with (Postfix, Dovecot, ...)?

    Thanks
     
  13. erosbk

    erosbk New Member

    Code:
                # If it's not the first day of the month we make incremental backup
                if [ ! -e $tmpdir/full-backup$XX.lck ] ; then
                    log "Starting daily backup for: $YX"
                    NEWER="--newer $FDATE"
                    $TAR $NEWER $ARG $BACKUPDIR/$MDATE/i$XX-$FDATE.tar.bz2 $YX -X $tmpdir/excluded
                    log "Daily backup for $YX done."
                else
                    log "Lock file for $YX full backup exists!"
                fi
    
    Hi, I am running the backup at 1:35 each night, so today it will run at 25/08/2011 01:35 AM.

    NEWER="--newer $FDATE"

    with that line, it checks whether a file is newer than 25/08/2011, so if a file was created/modified between 24/08/2011 1:35 and 24/08/2011 23:59, it will be ignored.

    I think the solution would be to work with yesterday's date instead of the current date. It will copy files created between 25/08/2011 00:00 and 25/08/2011 1:35 twice (at the next backup), but right now the incremental backup is doing almost nothing.

    Is this correct, or did I forget to check something?

    I will try this night with line value:
    NEWER="--newer `date --date=yesterday +%F`"

    instead of:
    NEWER="--newer $FDATE"

    But I don't know if --newer checks only the creation date or the modification date too... we will see tonight :p

    Best regards

    Edit: working OK; the incremental backup is working now. I think this should become a modification to the script (or a custom value to adjust depending on the script's run time).
     
    Last edited: Aug 25, 2011
  14. go0ogl3

    go0ogl3 New Member

    If you have an SMTP mail account you can configure one of the many mail programs (mail, mutt, etc.) to send mail at the end of the script using that account.
     
  15. go0ogl3

    go0ogl3 New Member


    Hello,

    is the <<HOSTNAME>> changed by you? If not, that's the problem.

    curlftpfs does not support appending to files. The solution is to move all the appended files elsewhere (the log file; is there another one?)
    An alternative for the free space function is to get the free space by another method (repquota?) and use that value in the script...
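    One possible sketch of such a workaround, assuming you simply know the remote quota (the 10GB figure and the function name are made up for illustration):

```shell
QUOTA_KB=$((10 * 1024 * 1024))   # the known FTP quota in KB (10GB here)

# percentage of the quota used by everything under directory $1,
# measured with du instead of trusting df on the curlftpfs mount
used_percent() {
    local used
    used=$(du -sk "$1" 2>/dev/null | awk '{print $1}')
    echo $(( ${used:-0} * 100 / QUOTA_KB ))
}
# the script could then compare "$(used_percent $BACKUPDIR)" against maxp
```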

    The mail problem:
    You need to have some mail server running (sendmail, postfix, etc.) on your Linux host for the back-res script to send mail...
    I'll try to post a little script that sends mail using SMTP AUTH via your ISP's mail server (or another one), and I'll try to modify the back-res script later this month.
     
  16. go0ogl3

    go0ogl3 New Member


    Hello,

    From the tar manual at http://www.gnu.org/software/tar/manual/tar.html#SEC114:

    The FDATE in the script is the current date, like
    Code:
    2011-09-12
    which means 2011-09-12 at time 00:00
    In my tests I ran the script at 00:01 on 8 September, for example (the first minute of the 8th), and the script archived some files modified on
    Code:
    Sep  7 23:25
    and on
    Code:
    Sep  8 00:00
    so I can say it's doing its job...

    The way you use the script, the backed up files are a day older, so you'll end up archiving more files in the incremental backup.

    Use a simple test:
    create a file from cron (touch) just before the actual run of the script... or run the tar command from the command line with the required arguments.
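    Such a throwaway test might look like this (a sketch; note that GNU tar's --newer compares the status change time as well as the modification time, so a freshly touched file always counts as changed):

```shell
d=$(mktemp -d)
touch "$d/recent"

# cutoff of yesterday 00:00 -- "recent" changed after it, so it is archived
tar --newer "$(date --date=yesterday +%F)" -cjf "$d/inc.tar.bz2" -C "$d" recent 2>/dev/null
tar -tjf "$d/inc.tar.bz2"

# cutoff in the future -- nothing qualifies, so the archive has no members
tar --newer "$(date --date=tomorrow +%F)" -cjf "$d/empty.tar.bz2" -C "$d" recent 2>/dev/null
```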
     
  17. go0ogl3

    go0ogl3 New Member

    Sorry for seeing your post so late, but I'll respond anyway; maybe others will need to restore too.

    You can restore all of your files by unpacking all the full_ archives, then all the incremental archives from that month OVER the full_ archives, up to the date needed. This way you'll end up with all your files, including the ones deleted between one incremental backup and the next. You can use find -newer to delete unwanted files.
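    The restore sequence described above could be scripted roughly like this (a sketch; the full_* and i* patterns follow the back-res naming scheme, while the function name is made up):

```shell
# unpack all full archives first, then every incremental (oldest first)
# over them, so later versions of a file overwrite earlier ones
restore_month() {   # usage: restore_month <dir with archives> <target dir>
    local src="$1" dst="$2"
    mkdir -p "$dst"
    for a in "$src"/full_*.tar.bz2; do
        [ -e "$a" ] && tar -xjf "$a" -C "$dst"
    done
    for a in $(ls "$src"/i*.tar.bz2 2>/dev/null | sort); do
        tar -xjf "$a" -C "$dst"
    done
}
# e.g.: restore_month /backup/srv1/2011-07 /mnt/restore
```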

    Hope this helps others too.

    In your case it's possible that you have a problem with the date command, or your /tmp dir is not writable, or something similar.

    The way I do a restore is to use an empty partition, drive or loop device mounted in a running system; I add the needed /proc and other mounts, make it bootable (grub, lilo, etc.), then remove the drive and run the restored system.
     
  18. go0ogl3

    go0ogl3 New Member

    Mail function / script for non-mail systems

    Hello

    here is a simple mail function / script for systems with the mail server disabled:

    Code:
    #!/bin/bash
    #
    # $email, $password and $tmpdir must be set by the caller
    mailuser=$(echo -n "$email" | base64)
    mailpass=$(echo -n "$password" | base64)
    
    # read the message body collected by the backup script
    maildata=$(cat "$tmpdir/maildata")
    
    nc mail.server.com 25 << EOF
    EHLO test
    auth login
    $mailuser
    $mailpass
    mail from: $email
    rcpt to: $email
    DATA
    Subject: test mail telnet
    
    $maildata
    .
    quit
    EOF
    
    I'll try to modify the back-res script to add an option for this function. The code only works with mail servers that support SMTP AUTH LOGIN, but I hope that will be enough...
     
  19. dar_ek

    dar_ek New Member

    Small FIX for the NFS disk usage problem

    I noticed that if you store backups on NFS disks, deleting old backups does NOT work :confused:

    I have the standard maxp="85" but the disks were full after a few days.
    The problem is in "function check_space", on this line:
    Code:
      pfs=`df -h $BACKUPDIR | awk 'NR==2{print $5}' | cut -d% -f 1`
    On normal disks it works because all the info from df is on one line, but for NFS mounted disks it spans TWO LINES, look:

    Code:
    # df -h
    Filesystem            Size  Used Avail Use% Mounted on
    /dev/sda1              14G  1,7G   12G  13% /
    10.246.120.212:/home/BACKUP/POCZTA
                           65G   65G     0 100% /home/BACKUP
    in this case you can use the "P" switch of df, which tells df not to do this (2-line output for one disk).
    The fix will look like this:

    Code:
     pfs=`df -Ph $BACKUPDIR | awk 'NR==2{print $5}' | cut -d% -f 1`

    please check that "df -Ph" works on other Linux distros too, and if it does, change it.
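    A quick way to check on any given box (a sketch; -P is POSIX-mandated output, so it should behave the same on every distro):

```shell
# -P forces one line per filesystem, so NR==2 is always the data line
# even when the device or NFS path is long
pfs=$(df -P / | awk 'NR==2{print $5}' | cut -d% -f 1)
echo "$pfs"   # used space of the root filesystem, in percent
```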

    regards
     
  20. go0ogl3

    go0ogl3 New Member

    Thanks for pointing this out!

    Thank you for the help; this happens because your NFS path is long :) but it's a small fix which I'll include in the next version, sometime in December, if my timing is OK.
     
