
First, I'd recommend thinning the collection out - multiple terabytes sounds very extensive and can be reduced by removing duplicates, re-compressing with more efficient formats like WebP or x265, removing unnecessary raw files, etc.
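A quick way to find the duplicates, assuming fdupes (or the faster jdupes) is available - rough sketch, adjust the paths to your own library:

  # list duplicate files across the library for review first
  fdupes -r ~/photos ~/videos > duplicates.txt

  # then delete interactively once you trust the results
  fdupes -rd ~/photos ~/videos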

My personal backup is the usual 3-2-1: 3 backups, 2 places, 1 offline. I have one copy on my local hard drive (that I work with), one automatically synced copy via Seafile on one of my dedicated servers (which also keeps a few months of history in case I accidentally delete something), and one external, offline hard drive at a relative's house that I sync to every half a year or so. Since I'm paranoid, my dedicated server is backed up to external storage every night as well via borgbackup. If you don't want to spend a few bucks a month on Backblaze or another service, just use a local NAS - as long as you have one hard drive offline and external as well (in case of a ransomware attack that encrypts all your files).
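The nightly borg job is roughly the following (repo path, source paths and retention are illustrative, not my exact config; the repo has to be initialized with borg init first):

  #!/bin/sh
  # nightly cron job: push an encrypted, deduplicated snapshot
  # to the external storage
  export BORG_REPO=ssh://user@storagebox.example/./backups/server
  export BORG_PASSPHRASE="$(cat /root/.borg-passphrase)"

  borg create --compression zstd \
      ::'{hostname}-{now:%Y-%m-%d}' /srv/seafile /etc

  # keep a few months of history, mirroring the seafile-side versioning
  borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6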

Important: my files and backups are fully encrypted, and it's imperative(!) that you back up all documentation, config files, settings, cron jobs, and executables that have anything to do with the backup and restoration process, unencrypted, with every backup - in the disaster case, nothing sucks more than trying to find the right settings again.

Case in point: I originally used a custom shell script and encrypted the files with openssl. However, the default digest used for key derivation changed between openssl 1.0 and 1.1 (from MD5 to SHA-256), and when it came to restoring after a hard drive failure, it took me about a weekend to sort out.
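The lesson: pin every parameter explicitly instead of relying on the defaults of whatever openssl version happens to be installed. A rough sketch (the -pbkdf2/-iter options need OpenSSL 1.1.1+; file names and the key file are placeholders):

  # encrypt with all parameters spelled out
  openssl enc -aes-256-cbc -md sha256 -pbkdf2 -iter 100000 -salt \
      -in photos.tar -out photos.tar.enc -pass file:/root/.backup-pass

  # restore with the same pinned parameters
  openssl enc -d -aes-256-cbc -md sha256 -pbkdf2 -iter 100000 \
      -in photos.tar.enc -out photos.tar -pass file:/root/.backup-pass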

As for posterity: it's up to you whether you encrypt the external drive at the relative's house - if you're fine with a burglar having the images and you cannot be ransomed with them (e.g. due to nudes), just label clearly what is on the hard drive and you're fine.




Having to archive environments and toolchains along with backups is unpleasant.

What is the plan when decryption fails (and before you've identified that it's a versioning issue with openssl, in your case)? Reinstall an old Linux on a random computer and work from there? How many config files and settings are even involved in your backup process, and how can you be sure you haven't missed anything?

I hope there are dependency-free solutions for this - a WinZip-style encrypted .zip file that asks for a password should work everywhere, even in the future?
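For example (a rough sketch; whether any given tool survives decades is another question):

  # plain zip with a password works almost everywhere, but note that
  # "zip -e" uses the weak legacy ZipCrypto scheme
  zip -er photos.zip photos/

  # 7-Zip produces AES-256 encrypted archives (-mhe=on also hides the
  # file list); extractors exist for every major platform
  7z a -p -mhe=on photos.7z photos/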


  > if you're fine with a burglar having the images and you cannot be ransomed with them (e.g. due to nudes), just write what is on the harddrive clearly and you're fine.
For the most part, just using an unusual filesystem, e.g. ZFS, will foil the vast, vast majority of attempts to read the data from a drive stolen in a home burglary (i.e. where the data was not the target).


That's security by obscurity - either do it right (if your data demands it) or don't put any effort into it at all, IMHO.


  > That's security by obscurity
Exactly. And when securing physical objects, depending on the threat model, obscurity can be an effective barrier.


Yeah, obscurity seems to help in the real world. The Presidential motorcade has a bunch of identical limos so attackers don't know which one the President is in. They, of course, have armor too. But the decoys add to the security, even if it's "by obscurity".


Sure, and don't lock your door since a smashed window is always an option.


The content gets stolen either way. You would save the money on window repair at the cost of losing a small deterrent (maybe a thief would refrain from making noise).


I have a similar setup, and have set up my backup infrastructure configuration as a git project, mirrored on both GitHub and GitLab.

I can thus check out the project on a new machine and just initialize it (giving it the right API keys, etc.) without issue.
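Something along these lines (repo name, file names and the bootstrap script are hypothetical placeholders, not the actual project):

  # fresh machine bootstrap (all names are hypothetical)
  git clone https://github.com/example/backup-infra.git
  cd backup-infra
  cp secrets.env.example secrets.env   # fill in API keys / passphrases
  ./bootstrap.sh                       # install tools, set up cron jobs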


> better compression like webp or x265, removing unnecessary raw-files, etc.

OP is talking about digitized videos, so asking to re-compress the videos is a https://xkcd.com/1683/ in the making.


The source for most movie rips today is Blu-ray, which is already an encoded medium. Yet Blu-ray remuxes are not the common distribution format.

Yeah, you are technically correct, but if the distance from the original is just a handful of encodes, good luck noticing any quality loss that isn't simply due to poor encoding settings. And when a proper encode can be a tenth of the size with hardly any drop in quality, for a video file you might view fewer than a dozen more times in your life, does it really matter?
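For reference, a "proper encode" would be something along the lines of (settings are illustrative; tune CRF/preset to taste):

  # re-encode the video track to x265, keep the audio untouched
  ffmpeg -i input.mkv -c:v libx265 -crf 22 -preset slow -c:a copy output.mkv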


Are you encrypting every file, or creating some virtual encrypted volume and copying all the files over?





