Automate Daily MySQL Backups on AlmaLinux 10

By Anurag Singh

Updated on Jun 14, 2025


Learn how to automate daily MySQL database backups on AlmaLinux with a secure bash script. This guide also covers syncing backups to a remote server using rsync.

Backing up our MySQL databases daily is one of the most important tasks we can automate to protect data from accidental loss, corruption, or hardware failure. In this guide, we’ll walk through the steps to set up automatic daily backups on an AlmaLinux server using a custom bash script. This setup ensures that we always have a recent copy of our databases safely stored and ready to be restored when needed.

Prerequisites

Before we begin, let’s ensure we have the following in place:

  • An AlmaLinux 10 server with sudo privileges
  • MySQL (or MariaDB) installed and running, with root credentials at hand
  • Optionally, a remote server reachable over SSH for syncing backups offsite

How to Automate Daily MySQL Backups on AlmaLinux and Sync to a Remote Server Using a Bash Script

Let’s break it down step by step.

Step 1: Update Our System

Before doing anything else, we should make sure our AlmaLinux system is up to date.

sudo dnf update -y

Step 2: Create a Backup Directory

We’ll create a dedicated directory where all our backups will be stored.

sudo mkdir -p /var/backups/mysql
sudo chown $USER:$USER /var/backups/mysql

This ensures proper ownership and permissions so the backup script can write files there.
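We can confirm the ownership change with a quick listing:

ls -ld /var/backups/mysql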

Step 3: Create a MySQL Backup Script

Now, let’s create a simple yet powerful bash script to automate the backup process.

nano ~/mysql_backup.sh

Paste the following script:

#!/bin/bash

# Variables
DATE=$(date +%F)
BACKUP_DIR="/var/backups/mysql"
MYSQL_USER="root"
MYSQL_PASSWORD="your_mysql_root_password"
MYSQL_HOST="localhost"

# Create backup
mkdir -p "$BACKUP_DIR/$DATE"
databases=$(mysql -u"$MYSQL_USER" -p"$MYSQL_PASSWORD" -h"$MYSQL_HOST" -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema|mysql|sys)")

for db in $databases; do
    mysqldump -u"$MYSQL_USER" -p"$MYSQL_PASSWORD" -h"$MYSQL_HOST" --databases "$db" > "$BACKUP_DIR/$DATE/$db.sql"
done

# Delete backup folders older than 7 days
find "$BACKUP_DIR" -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} \;

Important: Replace your_mysql_root_password with your actual MySQL root password. To secure it better, consider storing credentials in a .my.cnf file (explained below).

Step 4: Make the Script Executable

chmod +x ~/mysql_backup.sh
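Before automating anything, it’s worth running the script once by hand and confirming that dump files appear under today’s date (this assumes the backup directory created in Step 2):

~/mysql_backup.sh
ls -lh /var/backups/mysql/$(date +%F)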

Step 5: Secure MySQL Credentials (Optional but Recommended)

Instead of exposing our password in the script, we can create a hidden file:

nano ~/.my.cnf

Add the following content:

[client]
user=root
password=your_mysql_root_password

Save and secure the file:

chmod 600 ~/.my.cnf

Then modify the script like this:

databases=$(mysql -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema|mysql|sys)")

for db in $databases; do
    mysqldump --databases "$db" > "$BACKUP_DIR/$DATE/$db.sql"
done

Now the script will automatically use credentials from .my.cnf.
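A quick way to confirm that the client is reading ~/.my.cnf is to run a statement without passing any credentials on the command line; if it returns a result without prompting for a password, the file is being used:

mysql -e "SELECT VERSION();"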

Step 6: Schedule Daily Backups Using Cron

Let’s automate the script to run daily at 2:00 AM.

crontab -e

Add this line:

0 2 * * * /bin/bash /home/your-username/mysql_backup.sh >> /var/log/mysql_backup.log 2>&1

Replace /home/your-username with the actual path to your home directory. Because cron runs the script as your user, make sure /var/log/mysql_backup.log exists and is writable by that user (create it once with sudo touch and chown it), or point the redirect at a log file inside your home directory.

This schedules the backup every day at 2:00 AM and logs the output.
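To confirm the job is registered, and to check its output after the first scheduled run, we can list the crontab and tail the log (this assumes the log path from the cron entry above):

crontab -l
tail -n 20 /var/log/mysql_backup.log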

Step 7: Test the Backup and Restore Process

It’s important to test both backup creation and restoration regularly. Because each dump was created with the --databases flag, it already contains the CREATE DATABASE and USE statements, so restoring it recreates the database under its original name. Here’s how we restore a database:

mysql -u root -p < /var/backups/mysql/2025-06-14/my_database.sql
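For a quick sanity check before restoring, we can inspect the top of a dump file; the date and database name below are just the example values from the command above:

head -n 20 /var/backups/mysql/2025-06-14/my_database.sql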

Updated Bash Script with Remote Sync Support

Edit your backup script again:

nano ~/mysql_backup.sh

And replace it with the full script below (this version includes remote backup support and optional .my.cnf credentials handling):

#!/bin/bash

# === Configuration ===
DATE=$(date +%F)
BACKUP_DIR="/var/backups/mysql"
REMOTE_USER="backupuser"
REMOTE_HOST="192.168.1.100"          # Change to your remote server IP or hostname
REMOTE_DIR="/home/backupuser/mysql"  # Remote backup path

USE_MY_CNF=true                      # Set to false if you don’t want to use ~/.my.cnf

# === Create Backup Directory ===
mkdir -p "$BACKUP_DIR/$DATE"

# === Generate MySQL Database Dumps ===
if $USE_MY_CNF; then
    databases=$(mysql -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema|mysql|sys)")
    for db in $databases; do
        mysqldump --databases "$db" > "$BACKUP_DIR/$DATE/$db.sql"
    done
else
    MYSQL_USER="root"
    MYSQL_PASSWORD="your_mysql_root_password"
    MYSQL_HOST="localhost"
    databases=$(mysql -u"$MYSQL_USER" -p"$MYSQL_PASSWORD" -h"$MYSQL_HOST" -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema|mysql|sys)")
    for db in $databases; do
        mysqldump -u"$MYSQL_USER" -p"$MYSQL_PASSWORD" -h"$MYSQL_HOST" --databases "$db" > "$BACKUP_DIR/$DATE/$db.sql"
    done
fi

# === Delete Local Backups Older Than 7 Days ===
find "$BACKUP_DIR"/* -type d -mtime +7 -exec rm -rf {} \;

# === Sync to Remote Server ===
# Option 1: Using rsync (recommended)
rsync -azP "$BACKUP_DIR/$DATE" "$REMOTE_USER@$REMOTE_HOST:$REMOTE_DIR"

# Option 2: Using scp (alternative)
# scp -r "$BACKUP_DIR/$DATE" "$REMOTE_USER@$REMOTE_HOST:$REMOTE_DIR"

# === Logging (optional) ===
echo "Backup completed and synced to $REMOTE_HOST on $(date)" >> /var/log/mysql_backup.log

SSH Key Authentication (Recommended for rsync/scp)

To avoid entering the password each time:

Generate an SSH key (if not already created):

ssh-keygen -t rsa -b 4096

Copy it to the remote server:

ssh-copy-id backupuser@192.168.1.100

Now the script can sync files without password prompts.
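To verify the whole chain end to end, we can confirm passwordless access and list the remote backup directory; the user, host, and path here are the example values from the script’s configuration block:

ssh backupuser@192.168.1.100 "ls -lh /home/backupuser/mysql"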

Final Thoughts

  • rsync is preferred over scp because it only transfers changes and can resume interrupted transfers.
  • Always test the script manually before automating with cron.
  • Ensure both local and remote directories have the correct permissions.

By following this guide, we’ve created a complete, hands-off system to back up our MySQL databases on AlmaLinux every day. This setup not only ensures data safety but also gives us peace of mind. We strongly recommend integrating remote storage and regular testing to make our backup strategy even more resilient.

Let’s keep our data safe, stable, and secure, automatically.