1. First, create a local git repository root at /git. Set its permissions to 700, so it is unreadable to everyone except the owner.
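That setup can be sketched as follows (run as a user allowed to create directories at /, or create /git as root and chown it to yourself):

```shell
# Create the backup root and lock it down to the owner.
mkdir -p /git
chmod 700 /git   # 700: the owner has full access, everyone else has none
ls -ld /git      # should show drwx------
```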
2. Put each site under version control. A note on security: the .git folder must not be reachable over HTTP, so the domain's document root should point at a subfolder such as /srv/web-a/public. Otherwise, you need to configure the web server to deny access to .git.
cd /srv/web-a # your site lives here
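If the document root cannot be moved into a subfolder, the deny rule mentioned above can look like this (an nginx sketch; Apache can do the equivalent with a DirectoryMatch block and Require all denied):

```nginx
# Refuse every request that targets the .git directory or anything inside it.
location ~ /\.git {
    deny all;
}
```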
git init # this working copy is the real, production site
git init --bare /git/web-a # this is the local backup repository
git add .
git remote add backup /git/web-a # register the backup path under the name "backup"
git commit -m "Initial"
git push backup master
3. Whenever you make changes, commit them and push to the backup repository to keep it up to date. You can even schedule a daily git run with cron (set up in step 6). This way your uploads and data are always current in the repository!
4. Database dumps will be stored in /git/db (although it's not a repository :D ).
Create a dedicated user for dumping, with only the privileges it needs:
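Preparing that directory is a one-liner (a sketch, matching the owner-only permissions used for /git):

```shell
# /git/db holds the compressed dumps; it is a plain directory, not a repo.
mkdir -p /git/db
chmod 700 /git/db   # owner-only, like the rest of /git
```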
CREATE USER 'dump'@'localhost' IDENTIFIED BY 'your-password';
GRANT LOCK TABLES, REFERENCES, SELECT, SHOW VIEW ON *.* TO 'dump'@'localhost';
5. Create a script at /home/you/backup.sh and chmod it to 700 (readable, writable, and executable only by you):
#!/bin/sh
mysqldump --lock-tables=false -u dump -p"your-password" --all-databases \
| bzip2 > /git/db/all-$(date +%F).bz2
find /git/db -type f -mtime +15 -exec rm {} \;
cd /srv/web-a
git add .
git commit -m "Auto-commit at $(date +%F)"
git push backup master
You can customize the dump to cover only the databases you care about.
The most important option is --lock-tables=false: it lets the sites keep running while the dump is in progress.
The find line removes backups older than 15 days.
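The retention rule can be rehearsed on throw-away files (hypothetical names; requires GNU touch for the -d option):

```shell
dir=$(mktemp -d)
touch "$dir/fresh.bz2"                   # made just now
touch -d '20 days ago' "$dir/stale.bz2"  # pretend this dump is 20 days old
find "$dir" -type f -mtime +15 -exec rm {} \;
ls "$dir"   # prints only fresh.bz2
```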
6. Finally, add a cron entry to run it:
0 23 * * * /home/you/backup.sh >/dev/null 2>&1
This runs every day at 23:00. Enjoy your night without worry! :)
7. When trouble strikes, powerful git lets us restore whichever version we want. The database dump is not yet optimized, because each run is a full backup; an incremental scheme would save space and allow a longer archive window.
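A restore with git can be rehearsed end to end on a throw-away repository (the temporary paths below are stand-ins for /srv/web-a and /git/web-a):

```shell
work=$(mktemp -d)
git -c init.defaultBranch=master init -q "$work/site"   # stand-in for the site
cd "$work/site"
git config user.email you@example.com
git config user.name "You"
echo "v1" > index.html
git add . && git commit -qm "Initial"
git -c init.defaultBranch=master init -q --bare "$work/backup.git"  # stand-in for /git/web-a
git remote add backup "$work/backup.git"
git push -q backup master
echo "corrupted" > index.html        # simulate damage to the live site
git fetch -q backup
git checkout backup/master -- index.html   # pull the good copy back out
cat index.html                       # prints v1
```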