"I am curious as to what other Slashdotters use for backing up of home machines," asks long-time Slashdot reader serviscope_minor: I moved away from the "bunch of disks with some off site" method. I found most of the methods generally had one or more of the following problems: poor Linux support, weak security (e.g. leaking file names), outrageously expensive, hard to set up, tied to a single storage supplier I don't fully trust, entirely proprietary (which makes me doubt long-term stability), lack of file history, reputation for slowness, and so on. My current solution is Unixy: separate tools for separate jobs. Rclone for uploading to business cloud storage; versioned cloud storage to provide resistance against bitrot and other corruption.

They're interested in "what other Slashdotters use," as well as "why and what your experience has been given more than superficial testing." So share your own thoughts in the comments. Some of the answers so far:

One reader writes: My primary laptop has a Time Machine drive attached to it (which uses rsync under the hood). I replace the drive every two years or so; the removed drive goes into cold storage as long-term incremental backups. I also have a sophisticated custom bash script running rsync to an onsite backup server. That server also backs up its own boot drive, as well as several other computers in the building, in addition to all the server volumes. The backup server also uses rsync over the internet to back up a few other computers. Last I counted, I had in excess of 25 TB of total online storage under the roof. So far I've needed to dig into Time Machine on a few occasions for something that got changed or corrupted on my laptop, and I've had to restore a 23 GB photo library container to a computer 300 miles away due to operator error (at the time, that took almost three days to copy back). I have yet to need to do a full system restore. And I still need to improve my offsite storage.

Two incidents made me this careful. The first, almost two decades ago, was when a server drive thoroughly trashed its directory AND the backup drive somehow completely wiped its partition table. I had to rebuild the partition table by hand to restore it, and lost one file. The other time was before I had my laptop backing up properly, and I had no backup to restore when I rm'd a link using the wrong syntax, leaving the link and deleting the original (d'oh!). That was an amazing picture of a wild rose on the bike path, which I paid a lot of blood to the mosquitoes to get; I was upset about it, and it's the reason I have such good backups now. I'm willing to wager that 80% of the good backups in use today are due to people losing something important and not wanting to lose something again.

Another reader writes: I use ZFS for my user and data directories: striped two-way mirrors, one each for the home (7.25T) and data (3.62T) volumes. These are backed up continuously via Crashplan/Code42 with a small business plan. As a secondary measure I have a backup server sitting in the garage, to which I send a monthly snapshot. The backup server retains all monthly snapshots, while my primary machine releases them. With Crashplan I can go back several months and years, depending on the latest change of layout. It is there primarily for peace of mind against the big disasters, and it's cheaper than S3 for my data volume. All my other machines run rsync continuously against a Raspberry Pi with a JBOD (3.6T) in my utility closet; the Raspberry Pi syncs with the ZFS data volume once every 24 hours. My backup server has triple mirrored stripes to store all the snapshots from all my machines (11T), and my first ZFS snapshot is from 2009. I did stop with minute- and hourly-based snapshots; restoring a file is a chore, but not impossible. ZFS provides protection against bitrot and other nastiness, and I have just recently had a disk die on me, which I easily replaced. I might be a little paranoid (or just paranoid enough), but I've killed so many disks over the years that this seems reasonable.

And a third: For large, mostly-static data like my music collection: frequent backup to my second internal hard drive, and less frequent backup to Backblaze's B2 cloud. The music collection is also backed up to two different external drives, mostly so I could listen to music at work, back when we lived in the office. One of them lives with my work machine at home now, because my local network used to be too flakey/slow for streaming music sanely. When I was on Windows, I used FreeFileSync, Bvckup and Backblaze, all excellent products (not affiliated, just a happy customer). Now that I'm on Linux, I use FreeFileSync, Duplicati and rclone. I use rclone for my music collection because it's much simpler and my music doesn't need to be encrypted or packed up into archives or anything. If you're on Mac/Windows and want simplicity, Backblaze is an excellent choice.
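A custom rsync-to-an-onsite-server script like the one mentioned above is commonly built around `--link-dest`, which gives dated, browsable snapshots at roughly incremental cost. This is a minimal sketch under invented assumptions (host `backup.local`, a per-machine dated-directory layout, a `latest` symlink) — not the commenter's actual script. It prints the commands by default; set `DRY_RUN=0` on a machine that can actually reach the backup host:

```shell
#!/bin/sh
# Sketch of an rsync push to an onsite backup server.
# backup.local, the paths, and the dated-directory layout are assumptions.
set -eu

DRY_RUN="${DRY_RUN:-1}"   # default: only print what would run
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

HOST="backup.local"
SRC="$HOME/"
DEST="backups/$(hostname)"
TODAY="$(date +%F)"

# -a preserves permissions/times/links; --delete mirrors deletions;
# --link-dest hard-links files unchanged since the last run, so each
# dated directory looks like a full copy but only changes use space.
run rsync -a --delete \
    --link-dest="../latest" \
    "$SRC" "$HOST:$DEST/$TODAY/"

# Repoint "latest" at the newest snapshot for the next run.
run ssh "$HOST" "ln -sfn $TODAY $DEST/latest"
```

Each dated directory can then be browsed or restored from directly, which is what makes this scheme pleasant compared with opaque archive formats.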
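The monthly snapshot-to-the-garage scheme described above maps onto the standard ZFS replication idiom: take a snapshot, send the delta since the previous one over ssh, then release the old snapshot locally while the receiver keeps everything. Dataset, snapshot, and host names below (`tank/home`, `garage`, `backup/home`) are invented for illustration; the script prints the commands by default, since running them needs a real pool:

```shell
#!/bin/sh
# Sketch of a monthly ZFS snapshot replicated to a backup server.
# Pool, dataset, snapshot and host names are placeholders.
set -eu

DRY_RUN="${DRY_RUN:-1}"   # default: only print what would run
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

PREV="tank/home@monthly-2024-05"   # last month's snapshot (example name)
CUR="tank/home@monthly-2024-06"    # this month's snapshot (example name)

# 1. Take this month's snapshot on the primary machine.
run zfs snapshot "$CUR"

# 2. Send only the delta since last month to the backup server,
#    which retains every monthly snapshot it receives.
run sh -c "zfs send -i $PREV $CUR | ssh garage zfs receive backup/home"

# 3. The primary machine releases the old snapshot; the backup keeps it.
run zfs destroy "$PREV"
```

Incremental `zfs send -i` is what keeps the monthly transfer small: only blocks changed between the two snapshots cross the wire.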
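The rclone music job above needs no encryption or archiving, so a plain `rclone sync` to a B2 remote is the whole thing. The remote name and bucket here are invented (a real remote would be set up once with `rclone config`); as with the other sketches, it prints the command unless `DRY_RUN=0`:

```shell
#!/bin/sh
# Sketch of a simple rclone sync of a music collection to B2.
# "b2remote" and the bucket name are placeholders for your own remote.
set -eu

DRY_RUN="${DRY_RUN:-1}"   # default: only print what would run
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

SRC="$HOME/Music"
DEST="b2remote:music-backup"

# sync makes DEST match SRC exactly; --checksum compares hashes rather
# than size+mtime, which also catches silent corruption on either side.
run rclone sync --checksum "$SRC" "$DEST"
```

Because sync mirrors deletions, it suits a curated, mostly-static collection; for data where accidental deletion matters, `rclone copy` or a versioned bucket is the safer choice.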