Database backup script

Discussion in 'Programming/Scripts' started by Poliman, May 14, 2018.

  1. Poliman

    Poliman Member

    I wrote a script for database backups (it will be extended with other functions for mail and www, and the backup function will change):
    Code:
    #!/bin/bash
    
    #Script for backing up the databases stored on the server.
    #Database credentials are stored in a chmod 400 .password file in the form user,password
    #Needs to be added to cron, e.g. to run nightly -> 0 2 * * * /root/skrypty/script_name
    #Databases are backed up as tar.gz compressed archives.
    
    function backupDB()
    {
            DATE=`date +"%y-%m-%d-%H:%M:%S"`
            DIRECTORY="mysql_backup-$DATE"
            MYSQLDUMP=/usr/bin/mysqldump
           
            #the .password file must exist in the proper directory and contain data in the form -> user,password
            MYSQL_USER="`awk 'BEGIN {FS=","} {print $1}' /root/.password`"
            MYSQL_PASSWORD="`awk 'BEGIN {FS=","} {print $2}' /root/.password`"
    
            #file will be stored in this place
            cd /mnt/disk/backups || exit 1
            mkdir "$DIRECTORY"
            cd "$DIRECTORY" || exit 1
    
            # back up the databases, each in its own file
            databases=`mysql --user=$MYSQL_USER -p$MYSQL_PASSWORD -e "SHOW DATABASES;" | grep -Ev 'Database|information_schema|performance_schema'`
            for database in $databases
            do
                    #$MYSQLDUMP --force --log-error=error_file --user=$MYSQL_USER -p$MYSQL_PASSWORD --databases $database | gzip > "$database.gz"
                    echo -e "\e[38;5;40mI am doing backup of $database.\e[39m"
                    $MYSQLDUMP --force --log-error=error_file.log --user=$MYSQL_USER -p$MYSQL_PASSWORD --databases $database > "$database.sql"
            done
    }
    
    function backup()
    {
            #run the database backup
            backupDB 2>/dev/null
    
            cd "/mnt/disk/backups/$DIRECTORY" || exit 1
    
            #archive and compress each dump file
            db="*.sql"
            for archive in $db
            do
                    tar -czvf "$archive.tar.gz" "$archive" >/dev/null 2>&1
            done
    
            #make the backup files readable only by root
            /bin/chmod 600 *.tar.gz
    
            #delete unnecessary dumped db files
            rm *.sql
           
            #mail the admin that everything went properly + attach the log file using the -a parameter with the path to the attached file
            echo "All databases from server $(hostname -f) have been backed up. You can check the log file /path/to/directory/with/error_file.log" | mutt -s "Report - database backup from day $(date +%y-%m-%d-%H:%M:%S)" [email protected] -a "/path/to/directory/with/error_file.log"
    }
    
    #start doing backup - call function
    backup
    
    echo
    echo -e "\e[38;5;45mFinished doing databases backup.\e[39m"
     
  2. florian030

    florian030 ISPConfig Developer ISPConfig Developer

    you can dump all databases into separate files with a bit less code ;)
    Code:
    #!/bin/bash
    DOW1=`date +%j`
    DOW=`expr $DOW1 % 3` # keeps backups for 3 days
    HOUR=`date +%H`
    BPATH="/var/backup/sql"
    SQLDUMP=`which mysqldump`
    SQLBIN=`which mysql`
    MYSQL_USER=root
    MYSQL_PASSWORD=123
    DATABASES=`$SQLBIN --user=$MYSQL_USER -p$MYSQL_PASSWORD -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema)"`
    for db in $DATABASES; do
            $SQLDUMP --force --opt --user=$MYSQL_USER -p$MYSQL_PASSWORD --databases $db | gzip > "$BPATH/$db-$DOW-$HOUR.gz"
    done
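One advantage of per-database dumps like these is that restoring a single database is just decompressing one file into the mysql client. A sketch; the filename is illustrative and the pipe into mysql is commented out so the example runs standalone:

```shell
# Simulate one of the per-database gzipped dumps, then show the restore pipeline.
printf 'CREATE TABLE t (id INT);\n' | gzip > /tmp/mydb-1-02.gz

# A real restore would pipe straight into the client:
#   gzip -dc /tmp/mydb-1-02.gz | mysql --user=root -p mydb
gzip -dc /tmp/mydb-1-02.gz
```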
    
    
     
  3. Taleman

    Taleman Active Member HowtoForge Supporter

    Debian GNU/Linux has https://packages.debian.org/stretch/automysqlbackup

     
  4. ztk.me

    ztk.me ISPConfig Developer ISPConfig Developer

    For big databases, when you need to restore just part of one database ... a file per database is better. Compress them with a multithreaded tool like pigz.
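The pigz suggestion drops into the dump loop above almost unchanged; a sketch that falls back to plain gzip when pigz isn't installed (the mysqldump line is shown as a comment since it needs a live MySQL server):

```shell
# Pick a parallel compressor when available, otherwise plain gzip.
GZ=$(command -v pigz || command -v gzip)
echo "using: $GZ"

# In the loop from the script above this would become:
#   $SQLDUMP --force --opt --user=$MYSQL_USER -p$MYSQL_PASSWORD \
#       --databases $db | "$GZ" > "$BPATH/$db-$DOW-$HOUR.gz"

# Round-trip check that the chosen compressor works:
printf 'demo\n' | "$GZ" > /tmp/demo.gz
"$GZ" -dc /tmp/demo.gz
```

pigz produces standard gzip-format output, so existing `gzip -dc` restore commands keep working.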
     
  5. ahrasis

    ahrasis Active Member

  6. Poliman

    Poliman Member

    Yes, but it generates one file with all the databases. I need separate files. ;)

    All. I tested it (around half a year ago).
     
  7. ahrasis

    ahrasis Active Member

    Is your backup script by chance incremental?
     
  8. Taleman

    Taleman Active Member HowtoForge Supporter

    Each database separately. The backups look like this:
    [email protected]:/# ls /var/lib/automysqlbackup/*/custname_ski/
    /var/lib/automysqlbackup/daily/custname_ski/:
    custname_ski_2018-05-09_06h28m.keskiviikko.sql.gz custname_ski_2018-05-13_06h28m.sunnuntai.sql.gz
    custname_ski_2018-05-10_06h27m.torstai.sql.gz custname_ski_2018-05-14_06h27m.maanantai.sql.gz
    custname_ski_2018-05-11_06h29m.perjantai.sql.gz custname_ski_2018-05-15_06h27m.tiistai.sql.gz

    /var/lib/automysqlbackup/monthly/custname_ski/:
    custname_ski_2017-12-01_06h25m.joulukuu.custname_ski.sql.gz custname_ski_2018-03-01_06h26m.maaliskuu.custname_ski.sql.gz
    custname_ski_2018-01-01_06h26m.tammikuu.custname_ski.sql.gz custname_ski_2018-04-01_06h28m.huhtikuu.custname_ski.sql.gz
    custname_ski_2018-02-01_06h27m.helmikuu.custname_ski.sql.gz custname_ski_2018-05-01_06h26m.toukokuu.custname_ski.sql.gz

    /var/lib/automysqlbackup/weekly/custname_ski/:
    custname_ski_week.15.2018-04-14_06h27m.sql.gz custname_ski_week.18.2018-05-05_06h26m.sql.gz
    custname_ski_week.16.2018-04-21_06h27m.sql.gz custname_ski_week.19.2018-05-12_06h28m.sql.gz
    custname_ski_week.17.2018-04-28_06h26m.sql.gz
    [email protected]:/#
     
    Last edited: May 15, 2018
  9. Poliman

    Poliman Member

    Something has changed. That's nice. When I tried it, I got one sql.gz file with all tables from every backed-up database. ;)

    No, not at the moment, but I want to implement this. Still learning. ;)
     
  10. ahrasis

    ahrasis Active Member

    Great. Thanks for the info @Taleman. It is very useful.
     
  11. Poliman

    Poliman Member

    The newest version of the script:
    Code:
    #!/bin/bash
    
    #Script for backing up the databases stored on the server.
    #Needs to be added to cron, e.g. to run at midnight.
    #Databases are backed up as tar.gz compressed archives.
    
    function backupDB()
    {
        DATE=`date +"%y-%m-%d-%H:%M:%S"`
        DIRECTORY="mysql_backup-$DATE"
        MYSQLDUMP=/usr/bin/mysqldump
            
        #the .credentials file must exist in the proper directory and contain data in the form -> user,password
        MYSQL_USER="`awk 'BEGIN {FS=","} {print $1}' /root/.credentials`"
        MYSQL_PASSWORD="`awk 'BEGIN {FS=","} {print $2}' /root/.credentials`"
    
        #file will be stored in this place
        cd /mnt/disk/backups || exit 1
        mkdir "$DIRECTORY"
        cd "$DIRECTORY" || exit 1
    
        #back up the databases, each in its own file
        databases=`mysql --user=$MYSQL_USER -p$MYSQL_PASSWORD -e "SHOW DATABASES;" | grep -Ev 'Database|information_schema|performance_schema'`
        for database in $databases
        do
            echo -e "\e[38;5;40mI am doing backup of $database.\e[39m"
            $MYSQLDUMP --force --log-error=error_file.log --user=$MYSQL_USER -p$MYSQL_PASSWORD --databases $database > "$database.sql"
        done
            
        echo
        echo "Open path /mnt/disk/backups to see details."
    }
    
    function backup()
    {
        #run the database backup
        backupDB 2>/dev/null #> /dev/null 2>&1
    
        cd "/mnt/disk/backups/$DIRECTORY" || exit 1
    
        #archive and compress each dump file
        echo -e "\e[38;5;45mNow I am going to create zipped archives to save some disk space and remove unnecessary files. Please give me a moment. :)\e[39m"
        echo
        db="*.sql"
        for archive in $db
        do
            tar -czvf "$archive.tar.gz" "$archive" >/dev/null 2>&1
        done
    
        #make the backup files readable only by root
        /bin/chmod 600 *.tar.gz
    
        #delete unnecessary dumped db files
        rm *.sql
            
        #mail the admin that everything went properly
        echo "All databases from server $(hostname -f) have been backed up. You can check the log file attached to this message." | mutt -s "Report - backup from day $(date +%y-%m-%d-%H:%M:%S)" [email protected] -a "/mnt/disk/backups/$DIRECTORY/error_file.log"
    }
    
    function remove()
    {
        cd /mnt/disk/backups/
    
        #array contains all mysql backup directories
        var=($(ls))
        
        #remove the oldest directory (first element of the array) - we keep only the most recent backups
        echo -e "\e[38;5;196mRemoving the oldest backup directory --> ${var[0]}.\e[39m"
        rm -r "${var[0]}"
        echo -e "\e[38;5;40mRemoved with success.\e[39m"
        
        #release the memory allocated for the array variable
        unset var
    }
    
    #start doing backup - call function
    backup
    
    #remove redundant backup directories
    remove
    
    echo
    echo -e "\e[38;5;45mFinished doing databases backup.\e[39m"
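Note that remove() deletes exactly one directory per run and relies on the ordering of `ls`, so it only keeps seven backups if the script runs exactly once per day from day one. A sketch of an explicit keep-the-newest-seven rotation; it uses a /tmp demo directory instead of /mnt/disk/backups, and `head -n -7` is GNU coreutils:

```shell
BACKUP_ROOT=/tmp/demo_backups          # stand-in for /mnt/disk/backups
mkdir -p "$BACKUP_ROOT"

# Create nine fake dated backup directories for the demo.
for i in 01 02 03 04 05 06 07 08 09; do
    mkdir -p "$BACKUP_ROOT/mysql_backup-18-05-$i"
done

cd "$BACKUP_ROOT" || exit 1
# The names start with a sortable date, so lexical order == age order;
# delete everything except the newest seven entries.
ls -d mysql_backup-* | head -n -7 | while read -r old; do
    rm -r "$old"
done

ls -d mysql_backup-* | wc -l
```

Because the deletion is "all but the newest seven" rather than "the single oldest", the rotation self-heals even if the cron job was skipped for a few days.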
     
  12. sjau

    sjau Local Meanie Moderator

    Backticks shouldn't be used anymore. According to #bash on freenode it's better to use the var=$(command ....) syntax instead.

    Also they say not to use echo... Meanwhile I use printf everywhere. For easy coloring I usually use something like:

    Code:
    #!/usr/bin/env bash
    
    # Set info colors
    _Info () {
        printf '\033[1;30mInfo:\033[0m %s\n' "${1}"
    }
    _Success () {
        printf '\033[1;32mSuccess:\033[0m %s\n' "${1}"
    }
    _Error () {
        printf '\033[1;31mError:\033[0m %s\n' "${1}"
        exit 1
    }
    _Important () {
        printf '\033[1;33mNotice:\033[0m %s\n' "${1}"
    }
    
    _Info "some info"
    _Success "something was successful"
    _Important "you really should read this"
    _Error "noh..... this wasn't supposed to happen."   # exits 1, so call it last
    
     
  13. Poliman

    Poliman Member

    Thank you for the advice. Should it be done in functions, or just inline, e.g. printf '\033[1;33mNotice:\033[0m %s\n'?
     
  14. sjau

    sjau Local Meanie Moderator
