It weighs on my conscience that even though I already make backups, I'm not doing it properly.
The data on my computer falls into two groups: big and unimportant (films, installation disk images, etc.) and small but important (documents, scripts, source code, etc.). Fortunately, I don't do photography or video, so there are no files that are both large and important.
At the moment the important directories are copied automatically once a day with rdiff-backup to another partition of the same hard drive. Occasionally (I haven't managed to automate this) the contents of the archive partition are moved to DVD.
The obvious disadvantages: copying to external media happens rarely (a couple of times a year), and a copy on another partition of the same drive won't help if the disk itself fails.
The important data totals about 8 GB; most of it changes infrequently.
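The daily copy step boils down to "find what changed since last time and copy only that". Here is a minimal Python sketch of that idea — not what rdiff-backup actually does (it also stores reverse diffs so older versions stay recoverable); the function name and manifest format are made up for illustration:

```python
import json
import os
import shutil

def sync_changed(src, dst, manifest_path):
    """Copy to dst only the files under src whose size or mtime
    changed since the last run, as recorded in a JSON manifest."""
    try:
        with open(manifest_path) as f:
            manifest = json.load(f)
    except FileNotFoundError:
        manifest = {}  # first run: everything counts as changed

    copied = []
    for root, _dirs, files in os.walk(src):
        for name in files:
            path = os.path.join(root, name)
            rel = os.path.relpath(path, src)
            st = os.stat(path)
            stamp = [st.st_size, st.st_mtime]
            if manifest.get(rel) != stamp:
                target = os.path.join(dst, rel)
                os.makedirs(os.path.dirname(target), exist_ok=True)
                shutil.copy2(path, target)  # copy2 preserves mtime
                manifest[rel] = stamp
                copied.append(rel)

    with open(manifest_path, "w") as f:
        json.dump(manifest, f)
    return sorted(copied)
```

Run from a scheduler (cron, nnCron) once a day, this gives the "automation" part; the real tools add deltas, retention, and error handling on top.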
I'm looking for a product (or combination of products) with the following properties:
- Automation — no human involvement required while copying
- Incrementality — copy only changed files, or better yet, only the changes within files. At some longer interval, do a full backup just in case. UPD: store not only the latest version but all previous ones (as differentials), the way rdiff-backup does
- Backups stored in the cloud. External media won't do, since they can't be automated.
- Client-side encryption. Nothing meaningful should be stored on the server side. The encryption key must exist only on the client and (as a fallback) on a durable medium, but never on the server.
- Monitoring. Copying should not require a human, but failures must not go unnoticed. For example: a weekly report of the results (success/failure, time, size) of each copy.
- The ability to work under Windows
- The ability to work under "laptop" conditions (internet is usually, but not always, available; the laptop is not always on; when on battery power — postpone/pause)
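The monitoring requirement could be met with a small script run weekly from the scheduler. A hypothetical sketch, under the assumption that each backup run appends one log line of the form `timestamp status duration_s size_mb` (this log format is my invention, not something any of the tools below produce):

```python
from datetime import datetime, timedelta

def weekly_report(log_lines, now=None):
    """Summarize the last 7 days of backup runs.
    Each line: 'YYYY-MM-DDTHH:MM:SS status duration_s size_mb'."""
    if isinstance(now, str):
        now = datetime.fromisoformat(now)
    now = now or datetime.now()
    cutoff = now - timedelta(days=7)

    runs = ok = 0
    total_mb = 0.0
    for line in log_lines:
        ts, status, _duration, size = line.split()
        if datetime.fromisoformat(ts) < cutoff:
            continue  # older than one week — skip
        runs += 1
        if status == "ok":
            ok += 1
        total_mb += float(size)

    return (f"backup runs: {runs}, succeeded: {ok}, "
            f"failed: {runs - ok}, data copied: {total_mb:.1f} MB")
```

Mailing or displaying that one line once a week is enough to notice when something silently breaks.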
The first thing that comes to mind is nnCron + rdiff-backup + TrueCrypt + Dropbox; here are the disadvantages of that combination.
- Synchronization happens only when the TrueCrypt container is unmounted, so a script would have to mount and unmount the container every time
- A path-length restriction in some of the APIs makes rdiff-backup crash on certain files
- Some files may be opened exclusively by other programs, which causes problems when copying
- Copying is not atomic, so some files may end up inconsistent (though this rarely causes real problems)
- There is no error monitoring. One day something may go wrong and I'll never hear about it
On the plus side: with Pack-Rat enabled, rdiff-backup could be replaced with plain rsync (Dropbox would then provide the incrementality and version history), saving space. Although there was (or is?) a way in Dropbox to view any directory and all its files as they were at some point in time, without restoring each file individually. And with heavy use of such a feature, wouldn't that count as abuse and get me banned?
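The "previous versions as differentials" idea — keep the latest version in full and reconstruct older ones from stored deltas, which is roughly what rdiff-backup does with reverse diffs — can be illustrated with a toy sketch using the standard `difflib` module. This is line-based text diffing only; the real tools use binary deltas:

```python
import difflib

def make_delta(old_lines, new_lines):
    """Record an ndiff between two versions of a text file.
    Storing this delta plus the NEW version is enough to get
    the old version back without keeping it in full."""
    return list(difflib.ndiff(old_lines, new_lines))

def restore_old(delta):
    """Reconstruct the older version from the stored delta."""
    return list(difflib.restore(delta, 1))

def restore_new(delta):
    """Reconstruct the newer version from the same delta."""
    return list(difflib.restore(delta, 2))
```

Chaining such deltas backwards from the current full copy gives point-in-time views of any earlier state, at the cost of replaying diffs.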
UPD: Found an article comparing such services: Backup online. For now I'm leaning toward CrashPlan.
UPD2: Eventually I settled on Duplicati — it supports everything listed above, and quite a lot more.
For other OSes there is an alternative (or rather, the original): duplicity, by the author of rdiff-backup.