Backup Solution on Linux Ubuntu for Smartermail
Idea shared by George Rauscher - 8/15/2024 at 7:54 AM
Proposed

Hello everyone,


I recently transitioned to Linux and started thinking about a cost-effective and efficient backup solution. After some experimentation, I developed a method that perfectly meets my needs, and I hope it will be useful for you as well.


My Solution to the Backup Problem


This script is designed to back up my SmarterMail server directories to an external hard drive. The script is structured to stop the SmarterMail service during the backup process to ensure data consistency, perform the backup with a real-time progress display, and then restart the service once the backup is complete.


What the Script Does


    •    Stops the SmarterMail service to ensure no data is modified during the backup.

    •    Backs up the directories /var/lib/smartermail and /etc/smartermail/ to an external hard drive located at /mnt/backup_vipmail/backup.

    •    Displays the backup progress in real-time, allowing you to monitor how far along the process is.

    •    Restarts the SmarterMail service after the backup is completed.

    •    Logs all steps and status messages in a log file located at /root/smartermail_backup.log.


Installing the Tool for Progress Display


To display the backup progress as a percentage, the script uses the pv (Pipe Viewer) tool. You can easily install pv on Ubuntu with the following command:

sudo apt-get install pv


Creating and Setting Up the Script


    1.    Create the Script: Create a new script file in the /root directory:


sudo nano /root/smartermail_backup.sh

    2.    Add the Script Content: Insert the following content into the script:


#!/bin/bash

# Directories to back up, the destination, and the log file
SOURCE_DIR1="/var/lib/smartermail"
SOURCE_DIR2="/etc/smartermail"
DEST_DIR="/mnt/backup_vipmail/backup"
LOGFILE="/root/smartermail_backup.log"

# Log messages to both the console and the log file
log_and_echo() {
    echo "$1" | tee -a "$LOGFILE"
}

# Create the destination directory if it doesn't exist
mkdir -p "$DEST_DIR"

log_and_echo "$(date): Stopping SmarterMail service..."
sudo service smartermail stop
log_and_echo "$(date): SmarterMail service stopped."

# Stream each directory through pv (sized via du) for a live progress display
log_and_echo "$(date): Starting backup of $SOURCE_DIR1..."
tar -cf - "$SOURCE_DIR1" | pv -s "$(du -sb "$SOURCE_DIR1" | awk '{print $1}')" | tar -xf - -C "$DEST_DIR"
log_and_echo "$(date): Backup of $SOURCE_DIR1 completed."

log_and_echo "$(date): Starting backup of $SOURCE_DIR2..."
tar -cf - "$SOURCE_DIR2" | pv -s "$(du -sb "$SOURCE_DIR2" | awk '{print $1}')" | tar -xf - -C "$DEST_DIR"
log_and_echo "$(date): Backup of $SOURCE_DIR2 completed."

log_and_echo "$(date): Starting SmarterMail service..."
sudo service smartermail start
log_and_echo "$(date): SmarterMail service started."

log_and_echo "$(date): Backup successfully completed."

    3.    Set the Permissions: Make the script executable:


sudo chmod +x /root/smartermail_backup.sh

    4.    Test the Script: Run the script manually to ensure everything works correctly:


sudo /root/smartermail_backup.sh

After running the script, check the log file at /root/smartermail_backup.log to confirm there were no errors and that the backup completed successfully.
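For a quick spot-check of the log, a couple of commands suffice. (Sketched here against a throwaway sample log so the snippet is self-contained; on the server, point LOG at /root/smartermail_backup.log.)

```shell
#!/bin/sh
# Self-contained demo: a sample log stands in for /root/smartermail_backup.log
LOG=$(mktemp)
printf '%s\n' \
    "Thu: Backup of /var/lib/smartermail completed." \
    "Thu: Backup successfully completed." > "$LOG"

# Show the most recent entries
tail -n 5 "$LOG"

# Count lines mentioning errors; 0 means a clean run
ERRORS=$(grep -ci "error" "$LOG" || true)
echo "error lines: $ERRORS"
```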


Setting Up a Cron Job


To automate the script execution at 2 AM every night, you can set up a cron job as follows:


    1.    Open the Crontab File:


sudo crontab -e

    2.    Add the Cron Job:

Add the following line to schedule the script to run at 2 AM daily:


0 2 * * * /root/smartermail_backup.sh

Save the file and exit the editor.
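One optional tweak (not part of the steps above, just a habit of mine): cron silently discards anything a job prints unless you capture it, so appending a redirect to the same log file the script already uses keeps stray output in one place:

```
0 2 * * * /root/smartermail_backup.sh >> /root/smartermail_backup.log 2>&1
```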


I hope you find this script helpful and that it meets your backup needs. With this method, I’ve been able to avoid costly backup solutions like Acronis, and it delivers a fast and reliable result. For me, backing up 50 GB takes about 4 minutes – pretty impressive!


Best of luck with setting up your own backup solution, and I’m looking forward to your feedback!


George
George A. Rauscher
Member of the German Society for Criminology (Deutsche Gesellschaft für Kriminalistik e. V.)
Member of "LEVA" Law Enforcement and Emergency Services Video Association, Inc.
intelligent piXel GmbH - Experts in forensic criminology
Enzianstr. 4a, 82319 Starnberg
0800 - 999 8 99 88 (free*)
Website: www.intelligent-pixel.com
Managing Director: George A. Rauscher
Authorized Representative: Dr. Louise Morgott
Tax Number: 143 / 150 / 31010
HRB 207 679 / Munich Local Court

8 Replies

A little suggestion -

You might consider adding an LVM snapshot step to the backup (assuming your data is on an LVM volume and the volume group still has free space for a snapshot).

For example:
  • Stop the SmarterMail service.
  • Create LVM snapshot(s), e.g. one for data and one for config/logs.
  • Start the SmarterMail service again (it then runs as usual).
  • For each snapshot: mount it, tar (or even rsync) the files, unmount it, and remove the snapshot.
This can cut the noticeable downtime from minutes to seconds (especially when you have hundreds of GB or even TB of data).
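A rough sketch of those steps as a script. The volume-group and logical-volume names (vg0/smartermail), snapshot size, and mount point are hypothetical placeholders; adjust them to your layout before use.

```shell
#!/bin/bash
# Hypothetical sketch of the LVM-snapshot approach described above.
# Assumes volume group "vg0" and logical volume "smartermail"; run as root.
lvm_snapshot_backup() {
    service smartermail stop

    # The snapshot only needs room for blocks that change while it exists
    lvcreate --size 5G --snapshot --name sm_snap /dev/vg0/smartermail

    # Service downtime ends here - seconds instead of minutes
    service smartermail start

    # Mount the frozen snapshot read-only and copy from it at leisure
    mkdir -p /mnt/sm_snap
    mount -o ro /dev/vg0/sm_snap /mnt/sm_snap
    rsync -a /mnt/sm_snap/ /mnt/backup_vipmail/backup/

    # Tear the snapshot down again
    umount /mnt/sm_snap
    lvremove -f /dev/vg0/sm_snap
}
```

Calling a function like this from cron keeps the stop/start window tiny, since the actual copying happens after the service is already back up.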
From my experience Veeam is the best solution, even in its Free and/or Community Edition incarnations.

For those who want a stand-alone solution, in particular, there is a free version of Veeam Agent for Linux.
Gabriele Maoret - Head of SysAdmins at SERSIS
Currently manages 6 SmarterMail installations (1 in the cloud for SERSIS, which provides services to a few hundred third-party email domains, + 5 on-premise for customers who prefer to have their mail server in-house)

@georgeto, Thank you very much - that is a good approach for servers that would otherwise be offline for a longer period of time. I will adjust the script accordingly and then post it here.

@Gabriele, 

Veeam is a great program and would also be an option, but I am reluctant to install third-party software because, in the end, it runs as a service, and I can't be sure it won't create an open door for attacks.


Hello everyone,


I wanted to share an update on my backup script for SmarterMail. After some great suggestions, I’ve now rewritten it to use rsync, which allows me to perform backups without stopping the SmarterMail service. Here’s how it works:


The script targets two key directories: /var/lib/smartermail and /etc/smartermail. It uses rsync to efficiently copy these directories to an external hard drive. What’s great about rsync is that it only copies changed files, making the process much faster and less resource-intensive.


I’ve also added a retry mechanism to the script. If something goes wrong during the backup (like a temporary network glitch), the script will automatically try again—up to two more times. This gives me extra peace of mind, knowing that minor issues won’t cause the entire backup to fail.


Here’s a quick overview of the script:


    •    It logs all actions and errors to a log file, so I can easily check what happened during the backup.

    •    It creates the backup directory if it doesn’t exist yet.

    •    The script performs the backup for each directory separately and logs the success or failure of each attempt.

    •    If all retries fail for a directory, it logs a critical error, so I know to look into it.


This approach has worked really well for me so far, and I’m happy with how smoothly it runs. If anyone is dealing with similar issues, I hope this helps! Feel free to ask if you have any questions or need further details.

---- snippet -----


#!/bin/bash

# Paths to the directories that need to be backed up
SOURCE_DIR1="/var/lib/smartermail"
SOURCE_DIR2="/etc/smartermail"

# Destination directory on the external hard drive
DEST_DIR="/mnt/backup_vipmail/backup"

# Path to the log file
LOGFILE="/root/smartermail_backup.log"

# Maximum number of retries if the backup fails
MAX_RETRIES=2

# Function to log messages to the console and the log file
log_and_echo() {
    echo "$1" | tee -a "$LOGFILE"
}

# Function to perform the backup with rsync
perform_backup() {
    local SOURCE_DIR=$1
    local RETRY_COUNT=0
    local SUCCESS=0

    while [[ $RETRY_COUNT -le $MAX_RETRIES ]]; do
        log_and_echo "$(date): Starting rsync backup of $SOURCE_DIR (Attempt $(($RETRY_COUNT + 1)))..."

        # Perform the rsync backup
        rsync -avh --delete "$SOURCE_DIR" "$DEST_DIR"
        
        # Check if the rsync command was successful
        if [[ $? -eq 0 ]]; then
            log_and_echo "$(date): Rsync backup of $SOURCE_DIR completed successfully."
            SUCCESS=1
            break
        else
            log_and_echo "$(date): Rsync backup of $SOURCE_DIR failed. Attempt $(($RETRY_COUNT + 1)) of $(($MAX_RETRIES + 1))."
            RETRY_COUNT=$(($RETRY_COUNT + 1))
        fi
    done

    # If the backup failed after all retries, log a critical error
    if [[ $SUCCESS -eq 0 ]]; then
        log_and_echo "$(date): Critical Error: Rsync backup of $SOURCE_DIR failed after $(($MAX_RETRIES + 1)) attempts."
    fi
}

# Create the destination directory if it doesn't exist
mkdir -p "$DEST_DIR"

# Backup for the first directory
perform_backup "$SOURCE_DIR1"

# Backup for the second directory
perform_backup "$SOURCE_DIR2"

# Log completion message
log_and_echo "$(date): Backup script completed."
This is a good "cheap" method!

But I have some questions about how your procedure works:

- How do you manage retention?
Suppose, for example, you want to keep 2 backups per day (1 every 12 hours) and 30 days of retention (this is my actual backup plan...)

- How do you manage compression?
For example, with my Veeam-based backup I have backup compression/deduplication which saves me around 70% to 80% of backup space...

- Can you have "differential" backups, or are you limited to maintaining full backups?

- How do you get success/error/warning notifications that are easily understandable at a glance?
Jereming Chen Replied
Employee Post
Rsync has the capability of copying/backing up differentially. Simply use the -azP switches.
The -z switch should handle compression, and there are other switches to adjust the compression behavior.
As for your other questions, George will have to take over. 
Hope that helps.
Jereming Chen
System/Network Administrator, SmarterTools Inc.
www.smartertools.com

Jereming, thanks for the tip, and Gabriele, here’s the adjustment we’ve made. I think we’ve now developed a solid solution. However, maintaining backups for 30 days requires a considerable amount of disk space. I believe 7 days will be sufficient.


This enhanced backup script not only performs two backups per day with a retention period of 7 days, but it also includes comprehensive logging and email notifications. Every step of the process—from starting the script to completing the backups and handling errors—is logged in detail to a file (/var/log/backup_script.log) and also sent to your email for easy monitoring.


The script uses rsync with advanced options like compression during transfer, deletion of obsolete files, and partial file retention to ensure that your backups are efficient and reliable. By retaining only the last 7 days of backups, the script automatically manages disk space, making sure you don’t run out.


The flexibility to adapt the retention period, paths, and email notifications ensures that this script can be easily tailored to different needs, providing a simple yet powerful way to manage regular backups. The peace of mind knowing that your data is securely stored and any issues are promptly communicated makes this an effective solution.


If you have any more questions or need further adjustments, just let me know!

George



------- script ------------------


#!/bin/bash

# Configuration
BACKUP_SOURCE1="/var/lib/smartermail"
BACKUP_SOURCE2="/etc/smartermail"
BACKUP_DEST="/mnt/backup_vipmail/backups"
RETENTION_DAYS=7
MAIL_TO="your-email@example.com"
LOG_FILE="/var/log/backup_script.log"

# Function to log to both console and log file
log_and_notify() {
    echo "$(date): $1" | tee -a "$LOG_FILE"
    echo "$1" | mail -s "Backup Script Notification" "$MAIL_TO"
}

# Ensure backup destination exists
mkdir -p "$BACKUP_DEST"

# Perform backup with rsync, using additional options for compression and progress
perform_backup() {
    local source=$1
    local dest=$2
    local label=$3

    log_and_notify "Starting backup for $label: $source"
    
    rsync -azP --delete --partial --backup --backup-dir="$dest/$(date +%Y%m%d%H%M)" "$source" "$dest/current" >> "$LOG_FILE" 2>&1
    
    if [ $? -eq 0 ]; then
        log_and_notify "Backup for $label completed successfully."
    else
        log_and_notify "Error during backup for $label."
        exit 1
    fi
}

# Cleanup old backups
cleanup_old_backups() {
    log_and_notify "Cleaning up backups older than $RETENTION_DAYS days in $BACKUP_DEST"
    
    # -mindepth 1 protects $BACKUP_DEST itself; exclude the live "current" copy
    find "$BACKUP_DEST" -mindepth 1 -maxdepth 1 -type d ! -name current -mtime +"$RETENTION_DAYS" -exec rm -rf {} \; >> "$LOG_FILE" 2>&1
    
    if [ $? -eq 0 ]; then
        log_and_notify "Cleanup of old backups completed successfully."
    else
        log_and_notify "Error during cleanup of old backups."
        exit 1
    fi
}

# Start the backup process
log_and_notify "==========================================="
log_and_notify "Backup script started."

# Perform cleanup first to ensure enough space for new backups
cleanup_old_backups

# Perform the backups for the specified sources
perform_backup "$BACKUP_SOURCE1" "$BACKUP_DEST" "SmarterMail Data"
perform_backup "$BACKUP_SOURCE2" "$BACKUP_DEST" "SmarterMail Configuration"

# Final message
log_and_notify "Backup script completed successfully."
log_and_notify "==========================================="
Very impressive scripting solution, George, and kudos for all your hard work!

Particularly in a virtual environment, Veeam is, bar none, the most amazing backup software I have ever had the pleasure to use and rely upon. From restoring individual files/folders to full server restores, it is absolutely reliable. Newer versions of Veeam can do instant restores. I ran into a situation recently where this was invaluable, and my jaw dropped when I realized that instant truly meant instant. A customer "accidentally" deleted the wrong Hyper-V VM in their server farm. On their Veeam backup server, I right-clicked the backed-up VM instance, followed the "instant restore" process, and the virtual machine booted up instantly from the backup files and restored to its original Hyper-V host WHILE the files were being copied back. Downtime was less than 10 minutes (really only a minute or two once I got into the console), even though the VM had nearly a TB of data to restore to the host server! Using volume shadow copy, Veeam will make full/differential/incremental and fully restorable backups with zero downtime.

Incremental backups, ability to save multiple full backups of varying ages, instantly restore from any full or incremental backup, de-duplication, encryption, off-site backup copying, 1st or secondary backup to AWS/Azure/SFTP/Backblaze, sandboxing. I know this sounds like some paid sales pitch but it's just my enthusiastic recommendation for the product. Nevertheless, having good scripting solutions using tools built into the OS are always great to have available, as you've done!

Matt
