Why Backups Are Needed¶
On Linux servers, data is everything—configuration files, user data, application logs… Once lost, the consequences can be severe: accidental file deletion, hard drive failure, hacker attacks, or even server system crashes can all cause data to “disappear.” The essence of backup is to create an extra “copy” of your data, allowing you to recover quickly when disasters strike.
Basic Backup Concepts¶
The core of a backup strategy is efficient data replication. Backups are commonly classified as follows:
- By Backup Content:
- Full Backup: Copies all data (e.g., once a week, compressing an entire directory).
- Incremental Backup: Copies only newly added or modified data since the last backup (saves space, ideal for long-term backups).
- Differential Backup: Copies everything added or modified since the last full backup (each differential is larger than the corresponding incremental, but recovery needs only the full backup plus the latest differential, which is simpler than replaying a chain of incrementals).
Example: A full backup on Monday copies 100% of the data; Tuesday's incremental copies the 10% that changed since Monday; Wednesday's incremental copies the 5% that changed since Tuesday. To restore Wednesday's state, apply the Monday full backup, then the Tuesday and Wednesday incrementals in order. (With differentials, Wednesday's backup would instead contain everything changed since Monday, and restoring would need only Monday + Wednesday.)
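The full + incremental scheme above can be sketched with GNU tar's --listed-incremental snapshot file, which records what has already been backed up. This is a minimal runnable demo; the paths under /tmp are illustrative stand-ins for real data and backup locations:

```shell
#!/bin/bash
set -e
DATA=/tmp/demo_data     # stand-in for the directory being backed up
SNAR=/tmp/backup.snar   # snapshot file tar uses to track what has changed
mkdir -p "$DATA"
echo "monday" > "$DATA/a.txt"

# Level-0 (full) backup: stores every file and initializes the snapshot file
tar -czf /tmp/full.tar.gz --listed-incremental="$SNAR" -C /tmp demo_data

# New data arrives after the full backup...
echo "tuesday" > "$DATA/b.txt"

# Level-1 (incremental) backup: stores only what changed since the snapshot
tar -czf /tmp/incr.tar.gz --listed-incremental="$SNAR" -C /tmp demo_data

# To restore: extract the full archive first, then each incremental in order.
```

Restoring in order matters: each incremental only makes sense on top of the state the previous archive produced.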
Common Backup Tools (Beginner-Friendly)¶
Linux offers many backup tools; here are two of the most basic and widely used:
tar: Archiving and Compression Tool
tar packages multiple files/directories into one archive and can compress it (e.g., into .gz format via gzip).
Basic Usage:
```shell
# Package and compress the /data directory into backup.tar.gz (-c create, -z gzip compress, -v verbose, -f filename)
tar -czvf backup.tar.gz /data
# List the contents of the archive (-t list files)
tar -tzvf backup.tar.gz
# Restore files from the archive (-x extract, -C specify restore path)
tar -xzvf backup.tar.gz -C /restore/path
```
rsync: Synchronization Tool (More Flexible)
rsync synchronizes files locally or across servers and copies only changed data (incremental sync).
Basic Usage:
```shell
# Local sync: copy /data to /backup (-a archive mode, preserves permissions and timestamps; -v verbose output)
# Note: the trailing slash on /data/ means "the contents of /data", not the directory itself
rsync -av /data/ /backup/
# Cross-server sync (requires SSH key-based authentication): copy /data to remote server user@ip:/backup
rsync -av /data/ user@192.168.1.100:/backup/
```
Simple and Practical Backup Strategies¶
Based on data importance and server scenarios, here are recommended strategies (from simple to advanced):
- Scenario 1: Personal/Small Server (Small Data Volume)
- Strategy: Weekly full backup + daily incremental backup.
- Actions: Use tar for weekly full backups (e.g., 8 PM every Sunday) and rsync for daily incremental backups (e.g., 8 AM daily).
- Command Examples:
```shell
#!/bin/bash
# Weekly full backup script (/backup/weekly.sh)
BACKUP_DIR="/backup/weekly"
mkdir -p "$BACKUP_DIR"
tar -czvf "$BACKUP_DIR/$(date +%Y%m%d).tar.gz" /data
```

```shell
#!/bin/bash
# Daily backup script (/backup/daily.sh)
BACKUP_DIR="/backup/daily"
mkdir -p "$BACKUP_DIR"
# --delete removes files from the target that no longer exist in the source,
# so each dated directory is an exact mirror of /data at that point in time
rsync -av --delete /data/ "$BACKUP_DIR/$(date +%Y%m%d)/"
- Scenario 2: Enterprise/Important Data Servers
- Strategy: Daily full backup + offsite backup (e.g., local + cloud storage).
- Key: Use tar for full backups, and push copies offsite with rsync or the cloud provider's CLI (e.g., Aliyun OSS, AWS S3); encrypt backup files before uploading.
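A minimal sketch of the encrypt-before-upload step, assuming GnuPG 2.x is installed; the archive path, passphrase, and bucket name are all illustrative:

```shell
#!/bin/bash
set -e
ARCHIVE=/tmp/demo_backup.tar.gz   # illustrative archive path
echo "demo" > /tmp/demo_file
tar -czf "$ARCHIVE" -C /tmp demo_file

# Symmetric encryption: anyone with the passphrase can decrypt, so store it safely.
# --pinentry-mode loopback lets GnuPG 2.x read the passphrase non-interactively.
gpg --batch --yes --pinentry-mode loopback --passphrase "change-me" \
    --symmetric --cipher-algo AES256 -o "$ARCHIVE.gpg" "$ARCHIVE"

# Then upload only the encrypted file offsite, e.g. (hypothetical bucket name):
#   aws s3 cp "$ARCHIVE.gpg" s3://my-backup-bucket/
```

In practice the passphrase would come from a protected file or secret store, never hard-coded in the script.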
After Backup: Verify to Ensure It Works¶
Backing up without verification is useless! Verification methods:
- Local Verification: Restore to a temporary directory and check the result, e.g.:

```shell
tar -xzvf backup.tar.gz -C /tmp/test  # then check that the extracted files are complete
```
- Dry-Run Verification: Simulate a sync with rsync --dry-run (nothing is actually copied):

```shell
rsync -av --dry-run /data/ /backup/  # shows what would change, without copying anything
```
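Beyond spot-checking restored files, you can verify that an archive is readable and unchanged since it was written. A sketch combining tar's listing mode with a recorded checksum (paths under /tmp are illustrative):

```shell
#!/bin/bash
set -e
mkdir -p /tmp/verify_data
echo "important" > /tmp/verify_data/doc.txt
tar -czf /tmp/verify.tar.gz -C /tmp verify_data

# 1. Integrity check: listing the archive fails if it is truncated or corrupt
tar -tzf /tmp/verify.tar.gz > /dev/null

# 2. Record a checksum at backup time; re-check it before relying on the copy
sha256sum /tmp/verify.tar.gz > /tmp/verify.sha256
sha256sum -c /tmp/verify.sha256
```

Storing the .sha256 file alongside (or separately from) the backup lets you detect silent corruption long after the backup was made.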
Automate Backups (Eliminate Manual Operations)¶
Use crontab to automate backups for daily/weekly recurring tasks.
Steps:
1. Write a backup script (e.g., backup.sh) based on the above strategies.
2. Make the script executable:

```shell
chmod +x backup.sh
```
3. Schedule it with crontab:

```shell
# Edit crontab tasks (-e edits the current user's cron jobs)
crontab -e
# Add a task: run the backup script daily at 3 AM (format: minute hour day month weekday command)
0 3 * * * /path/to/backup.sh
```
Note: cron jobs run with a minimal environment (PATH may differ from your interactive shell), so use absolute paths in scripts (e.g., /usr/bin/tar instead of tar).
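Automated backups accumulate until the disk fills up, so scheduled jobs are usually paired with a retention rule. A runnable sketch using find's -mtime filter; the /tmp directory stands in for a real backup directory, and the old timestamp is simulated with touch:

```shell
#!/bin/bash
set -e
BACKUP_DIR=/tmp/demo_retention   # stand-in for e.g. /backup/weekly
mkdir -p "$BACKUP_DIR"
touch -d "10 days ago" "$BACKUP_DIR/20240101.tar.gz"  # simulate an old backup
touch "$BACKUP_DIR/today.tar.gz"                       # today's backup

# Delete backups whose last modification is more than 7 days old
find "$BACKUP_DIR" -name '*.tar.gz' -mtime +7 -delete
```

A line like the find command can go at the end of the backup script itself, so every scheduled run also prunes expired archives.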
Important Considerations¶
- Backup Encryption: Encrypt sensitive data (e.g., files containing user passwords) with gpg before storing it.
- Offsite Backup: Maintain at least two copies (local + cloud) so a physical server failure cannot destroy all backups at once.
- Regular Restore Testing: Restore to a test environment once a month to confirm the backups actually work.
- Permission Control: Set backup directory permissions to 700 (only the owner, typically root, can access it) to prevent unauthorized reads.
Summary¶
The core of Linux server backup is “simplicity, practicality, and automation.” Beginners can start with tar + crontab, choose full/incremental strategies based on data volume, and ensure security with regular verification and offsite storage. Remember: there’s no “best” backup, only the “right” solution for your needs!