Backing up data is really easy (with Linux). I use three different methods:
- For a database:
mysqldump [cnx infos] |gzip > backupbdd.sql.gz
- For a small number of files whose data matters, like configuration files:
tar czvf yoyobackup.tar.gz dafiles/
- For numerous files, and/or files I want to access easily:
rsync -av me/ othercomputer:me/
In the first two cases, I append the current date to the filename, so I keep many backups, one per day. That's useful when a bad change was made a week ago; a method that kept only yesterday's backup would leave me looking stupid.
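The date-in-the-filename trick can be sketched like this (the names and paths are only illustrative, and the real commands are left as comments):

```shell
#!/bin/sh
# Sketch: one date-stamped backup file per day.
today=$(date +"%Y-%m-%d")
dbfile="backupbdd-$today.sql.gz"
tarfile="yoyobackup-$today.tar.gz"

# A real run would then do, for example:
#   mysqldump [cnx infos] | gzip > "$dbfile"
#   tar czvf "$tarfile" dafiles/

echo "$dbfile"
```

Running it every day from cron produces a new file each day instead of overwriting yesterday's.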
Network administrators know this problem well, and so keep backups by date. Of course, one per day adds up after a year, and we don't need that level of detail, so there's no need to keep everything.
There are different tools to manage these dated backups, among them
dirvish, which backs up a whole directory tree, and good old
logrotate, which is perfect for log files but a pain for my gzipped files.
As I didn’t find a good solution to thin out my files, so that I don’t end up with 58 files after 58 days, I coded it myself.
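For the idea, here is a hedged sketch of one common thinning scheme (this is not the author’s actual script, just an illustration): keep every dump from the last week, and before that keep only the dumps from the 1st of each month. Filenames are assumed to be `YYYY-MM-DD.sql.gz`.

```shell
#!/bin/sh
# Sketch of a retention ("thinning") pass -- an illustrative scheme only.
# Keep everything on/after $cutoff; before that, keep only 1st-of-month dumps.
thin_backups() {
  dir=$1
  cutoff=$2   # a YYYY-MM-DD date; everything older gets thinned
  for f in "$dir"/*.sql.gz; do
    [ -e "$f" ] || continue
    day=$(basename "$f" .sql.gz)
    # Lexicographic comparison works because YYYY-MM-DD sorts chronologically.
    if expr "$day" \< "$cutoff" >/dev/null; then
      case "$day" in
        *-01) : ;;                     # monthly keeper, leave it alone
        *) echo "would delete: $f" ;;  # dry run: report instead of rm
      esac
    fi
  done
}
```

The `echo` is the dry-run behaviour; a real pass would `rm` the file instead.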
You can download the code:
There is a « harmless » mode, in which the program tells you what it would delete and rename instead of actually doing it.
An example of use for my database. Thanks to these backups, when my server died, only a few visitor comments from that same day were lost (yes, I could have used SQL replication, but I’m no admin):
#!/bin/sh
# cronned every day early in the morning
dirdate=`date +"%Y-%m-%d"`
path=/home/salagir
mysqldump -u... -p... --skip-extended-insert --all-databases \
  | gzip > $path/backupsbdd/$dirdate.sql.gz
php organize_backups.php $path/backupsbdd/'*'