Hacker News
Ask HN: What is your system for backing up family photos and video?
321 points by reisr3 on Jan 18, 2022 | 375 comments
My extended family has several terabytes of family photos and videos from over the years. We've mostly digitized everything, but some segments are just sitting on external hard drives in closets - waiting to eventually break or become corrupted.

My current methodology for our immediate family is aligned with the common backup advice: one local copy, one off-site copy (at grandma's house), and one in cloud storage. We're using Google Photos for cloud storage. The easy integration with Nest Hubs makes for nice digital picture frames around the family homes.

What is your system for backing up family photos and videos to stand the test of time? Is it adequate to put everything in cloud storage and forget about it? Do you reassess every couple years and adjust to the new landscape of storage services? Is it unavoidable that we'll be paying $100+/year forever for a [presumably increasing] few terabytes of cloud storage?

Is there a good solution for posterity? For example, once I die, and if my family were to become unable to pay the hosting bill, is there any way to guarantee these heirlooms remain intact and available?




I use Google Photos, and once a year I do a Google Takeout export to a hard drive and put that drive somewhere safe.

I tweeted about this a while back and several people had good replies: https://twitter.com/aarondfrancis/status/1445408384472211464

For context, this was after Facebook was down for a day. I think the odds of Google deleting all my photos are pretty low, but the odds of me losing access to my Google account, for legitimate or non-legitimate reasons, are much higher.


Since we're talking about Takeout, it's always a good idea to export your Gmail inbox once a year, encrypt it with GPG (or something similar), and store it offline somewhere on a drive.
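A minimal sketch of the encrypt step, assuming the `gpg` CLI is installed (the archive filename and key ID are just examples):

```python
# Sketch: wrap a Takeout/mbox export in GPG encryption before storing it
# offline. Assumes the `gpg` binary is installed; paths/key IDs are examples.
import subprocess

def gpg_encrypt_cmd(archive_path, recipient=None):
    """Build the gpg command line: public-key encryption when a recipient
    key is given, otherwise symmetric (passphrase-based) encryption."""
    cmd = ["gpg", "--output", archive_path + ".gpg"]
    if recipient:
        cmd += ["--encrypt", "--recipient", recipient]
    else:
        cmd += ["--symmetric", "--cipher-algo", "AES256"]
    return cmd + [archive_path]

def encrypt_archive(archive_path, recipient=None):
    # Runs gpg; in the symmetric case it will prompt for a passphrase.
    subprocess.run(gpg_encrypt_cmd(archive_path, recipient), check=True)
```

Symmetric mode keeps things simple (one passphrase, no key files to lose), which matters more than key hygiene for a once-a-year offline copy.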


There's also a fantastic tool by Jay Lee that will do it in an automated fashion

https://github.com/jay0lee/got-your-back


I use a Google Apps Script with a time trigger and filters to download each message (30 days or older, so that I have time to delete unwanted mail) as .eml files, along with your suggested mbox files.


Are there any tools to automate this process? I know you can schedule Google Takeout, but I want the backups to be automatically downloaded and saved somewhere, e.g. S3 or my local NAS


The problem I had is that the takeouts always failed with "some data could not be exported, please retry" no matter how many times I retried.


If you're in the EU, you can start annoying them with “this is my legal right, pls fix”-type complaints. If you're polite, persistent, and keep escalating, you might even get all your data.


Exact same problem here. Takeout just does not work for me.


I do this and I put the takeout zips into AWS as well. The odds of losing access to both at the same time are sufficiently low.


1. copy to hard disk

2. buy a new drive every year and copy it forward

3. store a copy with relatives

Note that a cloud provider:

1. can shut off your access at any time, by accident or on purpose, and you have no recourse

2. will sell your data to anyone at any time

3. will give it to any government for the asking

4. will leak it to any hacker

5. will allow employees of said cloud provider to browse your data

6. will scan it to use it to sell you ads

7. will scan it looking for anything they don't approve of this week, and take "corrective action"

8. if you've got terabytes, good luck trying to download it


1. Bitflip.

2. Copying errors (not sure if that is still a thing on Windows)

3. HDD failure.

A cheap, consumer-friendly, dual 2.5" HDD ZFS DAS would have been great. Unfortunately no one is making one. Maybe there is no market; maybe they just don't care.

Or burn it to an Archival Disc. Apple could have made an iOS Time Capsule, but they are too busy pushing their services revenue.


>A cheap, ZFS, consumer friendly, Dual 2.5" HDD DAS would have been great. Unfortunately no one is making one. May be there is no market, May be they just dont care.

IIRC ZFS is more trouble than it's worth on tiny systems due to the RAM requirements... generally speaking, if you care enough to want ZFS specifically, you'll get a slightly larger QNAP or similar NAS (I think the smallest ZFS-capable QNAPs are 4-bay), or just DIY the hardware and install FreeNAS (or similar) on top.

It might work its way down to the consumer segment eventually but given cloud is pushed heavily (and often you get a decent chunk of storage for free with M365 / google / whatever), needs no end-user know-how, and provides a touch of geographic separation in case of a house fire or other minor disaster it's hard to see tiny ZFS boxes competing vs. cheap cloud storage.


Please stop talking about ZFS having excessive RAM requirements: that only applies if you turn on deduplication. ZFS will happily run on a box with 2GB of RAM.


Why on earth did no one say this on the TrueNAS forum? I have 96GB of RAM in my home server.


Wow. Do you have any pointers to this information? You can easily get 2GB with an SBC for less than $60. The expensive part is the case and power supply.


You can google it; there should be plenty of discussions, even on HN, of running ZFS with 1GB of memory. [1] The two memory-intensive ZFS features are the ARC (cache) and deduplication.

The second thing is that you are not using the NAS / DAS frequently in this scenario; the memory only gets you extra performance. The only reason you want ZFS here is file integrity.

[1] https://news.ycombinator.com/item?id=11897571


Points 2, 3, 4, 5, 6, and 7 can be fairly easily mitigated by just encrypting your backups, especially if you're using something like Duplicati, which will do it automatically.

For 8, cloud storage services like Backblaze B2 can mail you a hard drive with all your data on it. I believe with B2 you can also return the drive afterwards and get your deposit on it back.


Maybe I am nitpicking but there are some very important steps missing:

Encrypt everything

Check drives (health etc)

Etc


I don't encrypt my family photos. I don't understand why anyone would.


Multiple reasons:

1) Someone can use those to train DeepFake models and scam your family

2) There could be private photos and videos there that you aren't aware of

3) Why not?


1 and 2 will probably make sense to some. It's not something that would overcome my "why not?".

For 3, the biggest problem with encryption is that you introduce a single point of failure: if you lose access to the keys, your photos are gone. On a simpler note, I'm a big fan of keeping things as simple as I can. Encryption is not simple. It can be fun to work with, and I don't want to ignore that, but unless you want to do it for fun I don't think it's worth the complexity.


If you use restic for backups, there is not even an option to do it without encryption. Just use something simple you will never forget. Everybody has that one password.


I don't encrypt them because then if I lose the encryption key or the encryption program or the encryption program won't run on my future machine, I lose it all.


Perhaps some family photos would include sexy time memories? Ideally those are password protected though or not included with general family photo storage..


It's easy to say "Encrypt everything," but how do you go about it? For folks looking for an "easy" cross-platform solution to encrypt your sensitive bits in the cloud, check out Cryptomator [^1].

[^1]: https://cryptomator.org/

(no affiliation)


Restic. When you create a restic backup it asks you for an encryption password; you enter it, configure the folders, etc.

Sadly it has no GUI (that I know of), but something like runtestic can help with automated, incremental, encrypted backups where you only want to keep the last n versions (and, for example, a weekly snapshot).


On Gnome, DejaDup recently rolled out restic support, and I must say, it's looking really good!

If you use Gnome, I would encourage you to use it and give feedback here [1]

[1] https://gitlab.gnome.org/World/deja-dup/-/issues/192


Thanks for the hint, I am a KDE plasma user, but I will keep this in mind for Gnome users in my circle.


What encryption are you using?


> 1. copy to hard disk
> 2. buy a new drive every year and copy it forward
> 3. store a copy with relatives

Yep, always good to have a local copy.

> Note that a cloud provider:
> 1. can shut off your access at any time, by accident or on purpose, and you have no recourse

Absolutely, which is why steps 1-3 help here.

> 2. will sell your data to anyone at any time

Not true. Neither Google, Amazon, nor Microsoft does this; their current privacy policies say so. If you want to be extra cautious, you can create a cloud storage account or an account with a personal domain, and their privacy policies are even stronger. Imagine Google or Microsoft "selling" a business's images in Photos or Drive.

> 3. will give it to any government for the asking

Unfortunately, true.

> 4. will leak it to any hacker

Will leak it is a bit too strong and harsh; "could be leaked" is better. Although I have a basic trust in their security, as opposed to not even using their services out of this fear. Of course, I would never upload any 'private' or 'confidential' images there.

> 5. will allow employees of said cloud provider to browse your data

Again, false. 'Will' here is too strong. None of the employees have direct access to your data, neither emails nor images. And if you want to be even more confident, go for a Google Suite or Microsoft 365 account, where this is guaranteed to not happen.

> 6. will scan it to use it to sell you ads

True, but you could mitigate it by using the paid tier accounts.

> 7. will scan it looking for anything they don't approve of this week, and take "corrective action"

No arguments here -- although the paid tier accounts should guarantee this does not happen.

> 8. if you've got terabytes, good luck trying to download it

The better solution is to keep a local copy using steps 1-3 and also upload them to "your chosen service".


> None of the employees have direct access to your data

I've seen enough reports of employees charged with viewing customers' private data to be skeptical that this can't be done. Also, people keep finding security flaws. If outsiders can penetrate the system and download terabytes, it ought to be even easier for insiders.

> Will leak it is a bit too strong and harsh

Not at all. I used to design airplane parts. The principle is not "could it fail" but "when it fails, how do we ensure the plane won't crash?" It's a completely different mindset.

> guaranteed to not happen

If you say that in the airplane design business, you'll get your knuckles thwacked with a ruler. It is always when it happens, ...

> current privacy policies

Which can change at any time.

I clearly judge risk differently than you do. My approach comes from 3 years of designing airplane parts. I wish this approach were taught more often.

I know that if my data is not in the cloud, it will not be compromised by the inevitable failure of the cloud to protect it.


> Not at all. I used to design airplane parts. The principle is not "could it fail" but "when it fails, how do we ensure the plane won't crash?" It's a completely different mindset.

Right, but your analogy is akin to saying "I won't travel by any Boeing plane because the plane could crash". Different risks are fine as long as the mitigation plan isn't "I will never travel by planes". Just my 2 cents since I don't design airplane parts.

> If you say that in the airplane design business, you'll get your knuckles thwacked with a ruler. It is always when it happens, ...

Sure, different industries, different risks. The risk in this context isn't death. I agree "guaranteed" was the wrong choice of word here; however, I have basic trust in their enterprise products and believe my data is safer and more secure there than if I did my own thing.

> Which can change at any time.

Which is when you can decide if you still want to continue -- or purge your data (since I assume there is a backup).


I do something similar.

Recent photos live on grandma-/normie-friendly services like Google Photos, and once every year or so I download everything and store it on platter drives locally, with par2 archives to guard against bit rot.
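The par2 step could be scripted along these lines: a sketch assuming the `par2` CLI (par2cmdline) is installed, with the `recovery.par2` name and 10% redundancy as illustrative choices, not the commenter's actual setup:

```python
# Sketch: create PAR2 recovery data for the photos on a drive so that a few
# percent of bit rot can be repaired later. Assumes the `par2` CLI
# (par2cmdline) is installed; names and redundancy level are illustrative.
import subprocess
from pathlib import Path

def par2_create_cmd(directory, redundancy_pct=10):
    # Collect the top-level files to protect, in a stable order.
    files = sorted(str(p) for p in Path(directory).iterdir() if p.is_file())
    parity = str(Path(directory) / "recovery.par2")
    return ["par2", "create", f"-r{redundancy_pct}", parity] + files

def protect(directory):
    subprocess.run(par2_create_cmd(directory), check=True)
```

Later, `par2 verify recovery.par2` checks the files and `par2 repair` fixes them, up to the chosen redundancy level.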


What redundancy level do you use for par2?


I did this, NAS to NAS as storage density improved, with rsync checksums on. Years into this I noticed a thumbnail that didn't render, found I couldn't open the original. Looked around, there were quite a few.

Ran a RAW + JPEG error-detection script: on the order of 10^1-10^2 photos were no longer valid, out of on the order of 10^5-10^6 photos.

Turns out rsync verifies on the way, not after write.
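A verify-after-write pass, re-reading the destination bytes rather than trusting the in-flight check, can be sketched like this (paths and function names are illustrative):

```python
# Sketch: copy a file, then re-read the bytes actually written to the
# destination and compare checksums, instead of trusting an in-flight check.
import hashlib
import shutil

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def copy_and_verify(src, dst):
    shutil.copyfile(src, dst)
    # Reading dst back forces the check against what actually hit the disk
    # (modulo OS caching; for real paranoia, drop caches or remount first).
    if sha256_of(src) != sha256_of(dst):
        raise IOError(f"verification failed after write: {dst}")
```

This is also roughly what `rsync --checksum` on a *second* pass gives you: the first run copies, the second run re-reads and compares.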


Propagating errors is indeed an issue. That's why, for example, my backup program never copies different files back from the backup. The backups form a tree, always writing down, never up.


Do you actually do this? Or is it theory?


It doesn’t really seem that hard. Buying a new hard drive and cloning data from the old one can be done pretty easily once a year.


Nothing seems hard until you do it.


I agree. The GP comment brings to mind the infamous reply to Dropbox's Show HN, which compared it to something users could trivially implement using basic Unix commands.


I wrote my own backup software. Does that count? :-)


That's a lame backup frequency though. One year of data is significant loss.


I add files to the new drive throughout the year. I keep the drive unplugged most of the time but periodically plug it in to add files. I still have a >1 year recovery point in case of fire, but less than that for more mundane data loss scenarios (accidental deletion, etc).


I have more than one backup drive, and rotate them. I just protect myself against aging drive failure by rotating in a new one once in a while.


Yes. Newegg spams me all the time with offers for deep discounts on disk drives, and I take advantage.


Okay, so tell me: how do you do it? What are the error modes you've experienced?


1. buy the drive

2. plug it into my system via a USB drive adapter

3. format the drive

4. run the program I wrote to copy the files I want to back up to it

5. unplug the drive

6. plug the drive in next time I want to do a backup

7. run my program again, which will only copy over changed files. You could use rsync if you prefer

8. unplug the drive

Error modes? I've had disk drives fail now and then. I have a 10 lb sledge hammer I use to render it unreadable by anyone but the NSA, and throw it in the trash. Buy another one.

I suppose I'm not really understanding your question. But it really is this low tech and simple.
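Step 7's "only copy over changed files" could look something like this sketch, which uses size and modification time the way rsync's default quick check does (an illustration, not the commenter's actual program):

```python
# Sketch of step 7: copy only files that are new or changed since the last
# backup, judged by size and modification time (rsync's default quick check).
import os
import shutil

def incremental_copy(src_dir, dst_dir):
    copied = []
    for root, _dirs, files in os.walk(src_dir):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, src_dir)
            dst = os.path.join(dst_dir, rel)
            s = os.stat(src)
            changed = (not os.path.exists(dst)
                       or os.stat(dst).st_size != s.st_size
                       or os.stat(dst).st_mtime < s.st_mtime)
            if changed:
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)  # copy2 preserves the mtime
                copied.append(rel)
    return copied
```

Run it again immediately and it copies nothing, which is what makes plugging the drive in periodically cheap.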


> 2. will sell your data to anyone at any time

I've never heard that, is there any evidence that this has happened?


Remember that the cloud is just "some other guy's computer".


How long have you been following this procedure?


I don't follow this exact procedure, but I do something similar.

I have a bunch of external drives. They go back to ~100GB parallel-ATA drives in external USB enclosures. I think I have ~10 generations of disks in play right now. I've spent less than $2,000 on drives in ~20 years. I probably should have purchased a few more disks over the years since I went >18 months between generations and sometimes 24-30 months.

I wrote some simple scripts to copy the contents to a new drive, compare hashes of the copy to the source, and store the hashes for any new files on the new copy. It's mostly automated. The only thing I've thought about adding is perhaps using PAR2 to add some bitrot protection.

I keep the most recent disk at home and copy files to it throughout the year. (I keep it unplugged most of the time.) When I buy a new disk I run my copy scripts, then take the old drive to a relative's house and store it in their gun safe. (Lately I've been retrieving the oldest drive from the safe and marveling at the old technology. >smile<)
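The hash-manifest part of such scripts might be sketched like this (the `manifest.json` name is an assumption, not the commenter's actual script):

```python
# Sketch: build a SHA-256 manifest of a drive's contents so a later copy can
# be verified against it. The manifest filename is an assumption.
import hashlib
import json
import os

def build_manifest(root):
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            manifest[os.path.relpath(path, root)] = h.hexdigest()
    return manifest

def save_manifest(root, out_path="manifest.json"):
    with open(out_path, "w") as f:
        json.dump(build_manifest(root), f, indent=2, sort_keys=True)
```

Comparing the manifest of the new drive against the stored one catches both copy errors and bit rot on either side.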


I wrote my own rsync clone to do backups. I do the rsync once a day to my primary backup.


30 years now in a more organized fashion, 40 years if you count disorganized.

Some of my older drives are no longer recognized by the drivers anymore.

I've had several drives completely fail, and one I dropped that never worked again.

I've used CDROMS, DVD-Rs, zip drives, and even cartridge tape for a while, but the hard drives work the best. I still have the cartridge tapes, but no way to read them anymore. Same with zip disks, can no longer read them.


Why not use AWS or Google deep-archive storage? It seems cheap enough. Keep the data at home, then use cloud deep-archive storage as backup. No?


I did use AWS for a while as a backup, but lost interest in it.


For posterity and "the test of time," I really think you need to create physical photo books. My wife and I have used Blurb for that. It's not cheap, and it's a lot of work, but I'm skeptical that our digital photos will outlive us for very long, for the reasons already mentioned.

I've found that the greatest benefit of cloud storage is not as a backup, but as our primary place to keep and look at photos. We started uploading and organizing everything in Flickr a few years ago, and now we spend more time enjoying our photos with Flickr's web and mobile apps than we ever did when they were scattered across our laptops and backed up to an external hard drive.

I still have the photos stored locally on my laptop and an external hard drive, and I fully expect that I'll need to upload them again to another service (maybe Google Photos, if that still exists) when Flickr finally fails as a business.


I am heavily involved in genealogy, practitioners of which have a huge interest in maintaining data, documents, photos, and other research for the next generations.

I have a copy of a handwritten family tree and other notes written by great-grandmothers, cousins, and great aunts, some of which have survived close to 100 years. If they had computers back then, I doubt any of the data would be available now. The photo books I have shared with relatives contain copies of these letters and charts and family photos, and even if half of them are destroyed or lost in the next 50 years, the remainder will survive, hopefully for much longer.

It amazes me how much faith other genealogists have in cloud storage or paid services like Ancestry, owned for the past 10+ years by a series of profit-focused private equity firms. Policies change (for PE's benefit), subscriptions lapse, people delete accounts, or online services get shut down, as some Ancestry users discovered the hard way:

https://slate.com/technology/2015/04/myfamily-shuttered-ance...

Another problem with these services: even though I can share a link to an online tree or a shared cloud folder, more than half of the recipients hate to deal with them because they are forced to learn a new UI or get prompted to death to sign up for a paid service they don't need. The other half eventually forget the links or the information that was in them. Paper and books are the only proven long-term storage medium for genealogy data.


What do you do to preserve your accounts? I share your concern about the services available today and don't subscribe to them. That said, my question for you, as you are heavily involved: what do you do, and is it effective / costly / built for what time period?

Thanks!


I use a desktop software program to organize and save data (Reunion for Mac) and export a backup GEDCOM from time to time, which can be imported into other applications if need be:

https://www.familysearch.org/wiki/en/GEDCOM

GEDCOM has many drawbacks, but it's the de facto standard, it's been around for nearly 40 years, and it's plaintext.

Core data, pictures, and family stories get put into photo books and widely shared with relatives, including some cousins I barely know. I've done three so far for different branches, and will do updated ones as new research is completed.


> I've found that the greatest benefit of cloud storage is not as a backup, but as our primary place to keep and look at photos. We started uploading and organizing everything in Flickr a few years ago, and now we spend more time enjoying our photos with Flickr's web and mobile apps than we ever did when they were scattered across our laptops and backed up to an external hard drive.

This is a good point. I was 100% anti-cloud for my photos, but have to admit the barrier of bringing out the external HDD and hooking it up to a PC means we hardly ever look at them.


Not really a need for big external drives any longer. Terabyte thumb drives are under $40 on Amazon and can be put on a hub or otherwise tucked away. SD card is a more expensive option but gets out of the way more effectively.


So I am going through this process with my mother, who is organizing photos. I asked her the intended audience, and a lot of it is for posterity's sake.

My next questions were then: how do you arrange the project for the intended audience (context and information are important, as random photos of people won't have much value 100 years from now)? And how do you create it so the paper book lasts 100+ years?

Your idea on Blurb was interesting to me. Do you know whether the quality of the paper will survive a long time, or did you look into that? Also, what reasons made you settle on Blurb (if you don't mind me asking)?

Thanks for pointing me in the right direction - I was about to suggest printing out individual copies and putting them in classic photograph book with plastic covers and adding her own blurbs.


You make a good point about context and information. Part of the work in making a book is writing all the captions to tell the story. I think we've all looked through a box of old photos from our childhoods and wondered, "who is that?" "where was this taken and when?"

The books we've made through Blurb look high quality to me. They offer several choices for the type of paper at different price points, but I didn't do any research on that. We went with Blurb because it has some relationship with Flickr for pricing and an interface to import our photos from those already organized albums (see "Prints" from the drop-down menu on Flickr). It used to have a decent web app for creating a book. Now you have to download and install their Windows-only desktop app, but it's good.


I've looked into this. Unfortunately, most commercial book printing services don't offer sufficient quality for the binding, paper quality, or image duplication. Or, some things will be good, and other things will be bad. Apple's old photo book service (discontinued 5+ years ago) had fantastic photo reproduction but the bindings were terrible.

If you are making a book for your mom, pay for the higher quality options if offered, such as hardback with a dust cover over paperback, and the heavier paper if offered.

The U.S. Library of Congress recommends acid-free paper for archival storage (https://www.loc.gov/preservation/resources/rt/perm/pp_x1.htm...) but AFAIK this option is not available for commercial photo books.


I've recently started building a few Blurb books. I haven't had one printed yet, but I'm working on curating a "best of" collection from my photos (e.g. Best of 2020, 2019, trip to XYZ, etc...) and carefully arranging them all. It does take time, but I'm looking forward to the result. But as you said, not cheap.

I also use Adobe Lightroom, not just for editing, but for storing photos. It's a great way for me to look at, organize, and even publish-to-web, from either my desktop, laptop, or mobile. As for it being a cloud solution, I'm more comfortable handing my photos to Adobe's cloud than to Google's (or Facebook's!). Although it contains far more photos than I plan to put in my Blurb books, it too is a "best of" sort of thing.

My "source of truth" is a local hard drive dedicated to photo storage. When I shoot, my camera is set to RAW+JPEG, and I save both copies. The thinking is: RAW has the most data, but do I know for sure that someone digging through these photos in 100 years will have the proper Sony RAW codec to view them? Probably, but JPEG is a nice fallback just in case. I organize photos using the following file structure:

  /{celestial_object}/{continent}/{country}/{state_or_province}/{city_or_locale}/YYYY.MM.DD__album_title 
  .. e.g. /earth/north_america/usa/california/los_angeles/2021.01.18__visiting_steve
  .. e.g. /moon/mare_tranquillitatis/2021.01.18__first_shoot_with_new_celestron_lens
  .. e.g. /jupiter/2021.01.18__galilean_moons
Once a year I plug in a second hard drive and use Macrium Reflect to mirror the image. Then I unplug that second hard drive and put it somewhere "safe", which at the moment is a storage unit down the street.

I'm looking to get a NAS and set up RAID, but even then I'll still mirror the disk periodically and keep a copy in a safe place.

But I do encourage everyone to consider printing photo books.


My family has used Blurb for many years. Once you've created your book, just wait a few weeks and a 50% coupon usually comes around.


I love Blurb. We use it to make photo books every year. It forces us to pick the 200-300 photos from the year that really mean something to us. The rest of the 50,000 photos go into our Google Cloud drive. Not that they don't mean something, but there's no way we can enjoy 50,000 photos/year.


How do you handle organization of the massive amount of day to day photos taken on phones? It seems really tough to keep up with the increased volume the reduced friction has created.


> For posterity and "the test of time," I really think you need to create physical photo books.

Didn't work out for my dad. A leak in the roof poured water on the albums, and all the emulsion ran away.

A friend of mine had her family home burn down, all photos/films gone.


(Very) old school: you get a multi-TB disk of some sort, doesn't have to be fast or high quality.

Back up the media to that disk, and then take it down to your safe deposit box. No absurd monthly fees, no Terms of Service, it's off-site so if your house burns down, you're covered.

"Your what?" Yeah, the banks still have those. Mine costs $15 a year. If you die, your next-of-kin will have to bring in a death certificate to get access to it. They'll have lots of other hassles, believe me.

Forget being trendy and having it in the cloud. How often are you going to access it? "Never", right?

You do have to go in every few years and replace it with the latest & greatest, and you'll probably have more media to back up anyway.


jfyi, HDDs do break occasionally, even when rarely used. I had an external multi-TB drive that one day just stopped working. It just wouldn't spin up; the heads got stuck to the platter. Stiction, it is called. Luckily, I got it fixed for long enough to make a full copy to a new HDD.

Lesson: never rely on a single backup for what's important to you.


First of all, safe deposit boxes, despite the impressive security measures shown in movies, aren't all that secure. Google "safe deposit box news" for extra details. Even ignoring the LA incident where the FBI seized a bunch of safe deposit boxes, plain loss seems pretty common.

Disks are not designed to be usable after 5-10 years of sitting on a shelf. Portable drives aren't particularly robust either, I've worked with many people with lost data on a USB attached drive. I'd use at LEAST two drives, ideally in two places if that's your only store for anything important.

I'd recommend using a cloud service (amazon.com/Amazon-Photos/, https://www.flickr.com/, https://photos.google.com/) or similar AND keep a local copy on a disk. Plan on replacing the disk every 5 years and paying whatever monthly/annual fees for enough photo storage. Include in your will full details on the online and offline access to the data. Sure it's a pain, sure there are costs involved, but it's not some insurmountable task. Much like your taxes, will, deeds for property, bank account info, etc. In all cases you want the original documents and pointers to the company/account.

Bitwarden (among others) handles designating a successor for accounts in the case of death, you can include username, password, and any encrypted volume info there.


> First of all safe deposit box, despite impressive security measures shown in movies, aren't all that secure. Google "safe deposit box news" for extra details

The fact that it's news shows that it's rare, not common. The whole point of the news is to show things that don't happen every day.

I've never understood how people got to a state where they think "I saw it on the news, so it must happen all the time."


> Yeah, the banks still have those.

Not as many as there used to be.

I was at the branch inside the regional headquarters for one of those too-big-to-fail banks a couple of weeks ago adding my wife to an account, and I asked the guy if there was a way to transfer the contents of my safe deposit box at another one of the bank's regional headquarters to this one because it would be more convenient. He said that his bank (one of the five largest in America) doesn't open new safe deposit boxes anymore.

Still, if you can find one, they're invaluable. For reasons beyond the scope of this discussion, I'm glad I still have the one I have, even though it's a thousand miles away.

I wonder if the smaller banks are still doing them. Did credit unions ever have them?


The lease agreement on those boxes usually states that the customer assumes all risk. Sometimes boxes randomly go missing, and natural disasters can make them inaccessible.

"She sued the bank in Los Angeles Superior Court, seeking $7.3 million. Bank of America sought to have the case dismissed, citing language in its lease agreement stating that the renter “assumes all risks” of leaving property in the box."[0]

[0] https://www.nytimes.com/2019/07/19/business/safe-deposit-box...


You bring up the lease agreement, but you should look also to the terms of service of the various cloud storage services.

Chances are the deposit box's terms are way better.


Let it never be said I can't change my mind. HN is supposed to be open, curious discussion, right?

No, I'm not closing my safe deposit box, but that article made me realize the danger of banks just deciding someday "You know what? Who needs those stupid things any more?"

Of course, the same thing applies to cloud backup services. Google is constantly telling me I should buy more storage.

Eventually, you don't want to rely on a service that's unprofitable for the company providing it. Unless, like an insurer paying out your life insurance policy, it's something they can't get out of without destroying their reputation, or bringing the government down on them.


> Sometimes boxes randomly go missing

citation?

and nothing ever goes wrong with cloud backup schemes, over a very long time period?



I don't mind looking for an alternative to Google Photos because, while their first 2 tiers are anything but absurdly priced (I currently pay C$40 a year for 200 GB), it abruptly jumps to 2 TB after that, at a price that's far less justifiable (C$140).

But I see little if any point to offline-only backups of something like family photos. Usable and backed-up is what I'm after, and preferably not just usable but very convenient.


I also hate their pricing model. So far we are below 200GB and after that we can open another account for my wife. I hope they increase the limit before we reach 400 GB.

Maybe MS has some solution; they have a 1 TB level.


Is this still true now that Google One is a thing?


Those are Google One prices.


Idle hard drives can still fail due to what's called "Stiction" (static friction), due to the breakdown of lubricants inside.


Yeah, as I said, "backup & forget it" is never a good strategy. Go in every year (or so) and replace it.


Or just take a disk to a relative, like a parent or sibling, once a year. The chance that you both get break-ins at the same time is very low, and you can always have a cloud copy too.


Synology NAS, using Synology Photos (installed on my phone + wife's phone for photo backups). The NAS is backed up nightly using Synology C2. I also have a local standalone disk. We also both use Google Photos - although not all photos on here and ideally I wouldn't use it, the utility is so strong it's hard to say no to it.

I used to use Mylio [1], which is a really lovely solution that takes a sort of internal P2P approach - your images don't go near a cloud host if you don't want them to. It just keeps your "vaults" in sync with each other - a vault can be on a NAS / desktop / mobile. The issue I had was that the mobile scenario is a bit shonky - it required your phone to be on at the same time as one of your other vaults in order to sync mobile photos, and that proved too fiddly to maintain.

[1] https://mylio.com/


I also use Synology, but with Apple Photos. There's a Mac mini on my home network that syncs the photo library automatically, and then Synology Drive syncs the local folder to the Synology. That is in turn backed up to Backblaze weekly.


How do you sync your Apple photo library to your Syno, given the Apple library is a package file containing a maze of machine-generated nested folders and many nearly duplicate image files (at least for those images you’ve edited)?


Here's how I do it...

- subscribe to iCloud 2TB plan

- all photos on my wife and my devices are auto uploaded to the cloud

- I run a Synology NAS that downloads and organizes all the photos into a YYYY/MM/DD directory structure.

- a hyperbackup task backs up the (encrypted) photos to Backblaze B2.

That way I have my photos in Apple's cloud, on my NAS, and in another cloud.

What do I use to download/sync the photos from iCloud to the NAS, you ask? It's an awesome project called "iCloud Photos Downloader" - https://github.com/icloud-photos-downloader/icloud_photos_do...

I have it running in a docker container on the nas and it periodically syncs and keeps itself up to date, no hassle. https://hub.docker.com/r/boredazfcuk/icloudpd
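For reference, a bare invocation of icloud_photos_downloader outside Docker looks something like this (the paths and account are placeholders, and the flags shown are the commonly documented ones - check the project's README for the current set):

```shell
# Hypothetical one-shot sync: pull photos from iCloud into a
# YYYY/MM/DD tree. The directory and email are made-up examples.
icloudpd \
    --directory /volume1/photos \
    --username you@example.com \
    --folder-structure "{:%Y/%m/%d}"
```

In the Docker setup described above, the container wraps this same tool and handles scheduling for you.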


For me the most important part is to keep it simple and not dependent on the one tech person of the family paying for some cloud storage.

Unlike other suggestions in the comments, what you don't want is custom scripts, git, backup tools, strong encryption, or cronjobs running on some server that will go away eventually. A labeled, unencrypted HDD in a closet in two different locations will most likely do the job just fine. If you want to throw in a cloud backup that you take care of, that's a nice bonus, but I wouldn't rely only on a "tech" solution.


I want to agree, but find even this is too complicated sometimes.

Where the HDD approach usually falls apart is merge conflicts. And by that I don’t mean only in the git world. This is an inherently difficult problem.

Someone extracts a subset of the HDD collection onto their laptop to do some editing, then forgets to store the result back on the HDD for the next 6 months. Next time when doing a backup, they mix up the hard drives, only saving the photos on the second one. And so on. Even basic stuff like deduplicating photos or filtering out only the good photos can leave a lot of confusion. Quick resolutions like "just keep everything" completely revert such a cleanup (1). A strict master-slave scheme helps but requires access to the master HDD for any writes.

Cloud doesn’t solve this entirely but shortens the window of conflicts by convenient and always in sync access to both read and write.

Someone really needs to invent a git for photos, for humans - one that can deal with the long partitions (in the distributed-systems sense) that loose hard drives usually lead to.

(1) On the topic of cleaning up photos: doing this before backing up is strongly recommended. The smaller your data, the easier it is to manage. You'll also enjoy your old photos a lot more. Nobody wants to click through the 100 angles you shot of some food at a local cafe 5 years ago. Keep only what is truly relevant.


Maybe not exactly what you have in mind for spanning physical drives, but somewhat, perhaps...see Photo Mechanic Plus's cataloging feature: https://home.camerabits.com/tour-photo-mechanic-plus/


You need to periodically check those HDDs for any corruption though, which requires some tech knowledge.


probably an optical disk is a more long-lasting solution


We've been building ente[1] as an e2ee alternative to Google Photos.

Posterity is something we've been thinking deeply about. Our infrastructure is designed to support successors, similar to GitHub[2]. But since cloud storage is expensive, successors will have to choose between exporting a local copy[3] or paying for the newly accumulated data.

We are still designing this "feature". If you think we can do better, please let me know. We would be grateful for any feedback. :)

[1]: https://ente.io

[2]: https://docs.github.com/en/account-and-profile/setting-up-an...

[3]: https://ente.io/faq/migration/out-of-ente/


First, I'd recommend thinning out: multiple terabytes sounds very extensive and can be reduced by removing duplicates, using better compression like WebP or x265, removing unnecessary raw files, etc.
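If you do go the transcoding route (mind the caveat elsewhere in the thread about re-encoding already-compressed video), an ffmpeg sketch for x265 might look like this - filenames are placeholders, and you should spot-check quality on a few files before deleting any originals:

```shell
# Hedged sketch: re-encode to HEVC/x265. CRF 23 is a reasonable
# starting point for home video (lower CRF = higher quality, larger
# file); the audio track is copied untouched.
ffmpeg -i input.mp4 -c:v libx265 -crf 23 -preset slow -c:a copy output.mp4
```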

My personal backup is the usual 3-2-1: 3 copies, 2 different media, 1 off-site. I have one copy on my local hard drive (that I work with), one automatically synced copy via Seafile on one of my dedicated servers (which also maintains a few months of history in case I accidentally delete something), and one external, offline hard drive at a relative's house that I sync to every half a year or so. Since I'm paranoid, my dedicated server is backed up to external storage every night as well via borgbackup. If you don't want to spend a few bucks a month on Backblaze or another service, just use a local NAS - as long as you also have one hard drive offline and external (in case of a ransomware attack that encrypts all your files).

Important: my files and backups are fully encrypted, and it's imperative(!) that you back up all documentation, config files, settings, cronjobs, and executables that have anything to do with the backup and restoration process, unencrypted, with every backup - in the disaster case, nothing sucks more than trying to find the right settings again.

Case in point: I originally used a custom shell script and encrypted the files with openssl. However, the default hash scheme changed between OpenSSL 1.0 and 1.1 (or something like that), and when it came to restoring after a hard drive failure, this took me about a weekend to sort out.
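One way to guard against that particular failure mode is to pin every negotiable parameter explicitly instead of relying on defaults. A sketch (the filename and passphrase are placeholders):

```shell
#!/bin/sh
# Pin the cipher, key-derivation digest, and KDF explicitly so a future
# OpenSSL can still decrypt the archive. (The enc default digest changed
# from MD5 to SHA-256 in OpenSSL 1.1.0 - exactly the kind of silent
# change described above.)
PASS="correct horse battery staple"             # example passphrase

printf 'family photos archive' > archive.tar    # stand-in for the real tarball

openssl enc -aes-256-cbc -md sha256 -pbkdf2 -iter 100000 \
    -pass pass:"$PASS" -in archive.tar -out archive.tar.enc

# Restore with the same explicit flags. Keep this exact command line
# (unencrypted!) next to every copy of the backup.
openssl enc -d -aes-256-cbc -md sha256 -pbkdf2 -iter 100000 \
    -pass pass:"$PASS" -in archive.tar.enc -out restored.tar

cmp archive.tar restored.tar && echo "roundtrip OK"
```

The point is not these particular flags, but that every parameter a future OpenSSL might default differently is written down in the command itself.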

As for posterity: it's up to you whether you encrypt the external drive at a relative's - if you're fine with a burglar having the images and you cannot be ransomed with them (e.g. due to nudes), just label clearly what is on the hard drive and you're fine.


Having to archive environments and toolchains along with backups is unpleasant.

What is the plan: when decryption fails (and before you've identified that it's a versioning issue with openssl, in your case), you'd reinstall an old Linux on a random computer and work from there? How many config files and settings are even involved in your backup process, and how can you be sure you haven't missed anything?

I hope there are dependency-free solutions for this - a WinZip-encrypted .zip file that asks for a password should work everywhere, even in the future?


  > if you're fine with a burglar having the images and you cannot be ransomed with them (e.g. due to nudes), just write what is on the harddrive clearly and you're fine.
For the most part, just using an unusual filesystem e.g. ZFS will foil the vast, vast majority of attempts to read the data from a drive stolen from a home burglary (e.g. where the data was not the target).


That's security by obscurity - either do it right (if your data demands it) or don't put any effort in it at all, IMHO.


  > That's security by obscurity
Exactly. And in instances of securing physical objects, depending on the threat model, obscurity can be an effective barrier.


Yeah, obscurity seems to help in the real world. The Presidential motorcade has a bunch of identical limos so attackers don't know which one the President is in. They, of course, have armor too. But the decoys add to the security, even if it's "by obscurity".


Sure, and don't lock your door since a smashed window is always an option.


The content gets stolen either way. You would save the money on window repair at the cost of losing a small deterrent (maybe a thief would refrain from making noise)


I have a similar setup, and have set up my backup infrastructure configuration as a git project, mirrored on both GitHub & Gitlab.

I can thus check out the project on a new machine and just initialize it (giving it the right API keys etc.) without issue.


> better compression like webp or x265, removing unnecessary raw-files, etc.

OP is talking about digitized videos, so asking to re-compress the videos is a https://xkcd.com/1683/ in the making.


The source for most movie rips today is Blu-ray, which is already an encoded medium. Yet Blu-ray remuxes are not the common distribution format.

Yeah, you are technically correct, but if the distance from the original is just a handful of encodes, good luck noticing any quality loss that isn't simply due to poor encoding settings. And when a proper encode can be a 10th the size with hardly any drop in quality, in a video file you might view fewer than a dozen more times in your life, does it really matter?


Are you encrypting every file, or creating some virtual encrypted volume and copying all the files over?


Spent roughly 1k on a NAS system at home, which gives me 6 TB of redundant storage.

It's raid 0 (2x6tb) with 2 separate backups done nightly/weekly, one in the house and one done in the garage.

The speed of access is unmatched by any cloud system, and in the long run it'll be cheaper than online storage (assuming £10 per month for 6 TB).

I took the plunge after trying a few cheaper options plus cloud. I'm happy with the investment and haven't looked back; it's been rock solid. Synology.

The same system could probably be had for less now. The NAS box also acts as my Docker server, which has been running without issues for 2 years. Most importantly, it's not a full-time job maintaining it - I spend maybe an hour a month, if that, checking up on it.

Comes with duplication detection, and everything you'd need can be done through the nice GUI without learning shell. It's just nice having a system anyone can use - no shell scripts in place at all. It just works.


RAID0? Assuming that’s not a typo, RAID0 could not be a worse choice for backups. You’re at least doubling your chances of losing all the data, when you typically want to have something more resilient than a single standard disk for backups.


My mistake, RAID 1.


Easy way to remember: It's called RAID 0 because that's how much data you get back if something goes wrong.


I use a NAS for backing up my photos I take on my camera (i.e. my Lightroom catalog.) I also back these photos up to Backblaze B2. I use Duplicati for this.

On my phone however, I simply send my photos to Google Photos. Actually, Lightroom exports end up there too. This makes it easy to see and share photos with family and friends.

I would love to hear how you view and share your photos using your NAS. Are you using something like OwnCloud or NextCloud? Or does Synology offer this type of software built-in?


Synology has a built-in app called Photos. It's relatively new (DSM 7), and it has facial recognition which, to be honest, works half well at best. It might be the sheer number of family photos we have, but it does struggle to 'learn' as we tag people.

Provided the app is installed, you can easily share a photo through WhatsApp or others on your phone. As for public access via link, I have disabled this, however it's possible to expose it through a cloud service (Synology cloud).

The Photos app works well for exporting, though I wish it had a backup feature. The flip side is my family wouldn't want all their personal photos sitting on our family NAS.


I did the same thing with a WD My Cloud (6 TB redundant RAID 1) - man, was that a terrible decision. One drive failed and then the other eventually did as well (within 3 years) - not to mention it was painfully slow and cost maybe 600 USD.

I've since been able to recover them through another external drive casing but I haven't found a replacement idea on what to do.


Wasn’t My Cloud also hacked rather severely?


Quite possibly - it really was a bad product, and the interface connecting your local storage to a cloud front end was bad and super slow (which is why I didn't use it).


> Most importantly it's not a full time job maintaining it, I spend maybe an hr a month if that checking up on it.

Exactly. I want my family to be able to get the photos if I suddenly get hit by a bus. No one is going to be able to deal with your custom server/cronjob/etc. crap if anything happens to you.

Over the past decade I have migrated back and forth between cloud storage and a NAS. Both of them need upkeep, either to keep footing the bill or to check the drives and replace them if they die.

I think anyone who is serious about this needs something you can put in a closet/safe/bank vault that will still work a decade later, and the best bet is optical disks - they were supposed to last a hundred years or more, but that industry is kind of on its last legs and we were never sure how accurate the longevity claims are.


I have a similar setup (also Synology), but only backup to Backblaze nightly. What's housing your garage backup? Another NAS or something else? If it's another synology, did you get the same model as your primary or something less powerful?


It's an old laptop that turns on with a plug timer and shuts down after it's finished copying the Hyper Backup data off the second server over SMB.

The timer runs every Sunday at 5am. It's never failed.

I should probably add it as a secondary destination in Hyper Backup, but I'd need a nice way to auto-run an rsync server on Windows. Then I could combine the alerts in Synology should something fail.


Ah, interesting - and what's the second server?


I'm surprised how technical and homegrown all these solutions are (well, it's an HN crowd so maybe I shouldn't be). I'm a techy too and pretty into photography, but I've gone the other way.

I store all my photos on a desktop computer and use BackBlaze to automatically keep everything backed up.

It violates the 3-2-1 rule but in the 10ish years I've been using this approach I've had tons of harddrive failures, had to restore some backups due to my own mistakes, but I haven't lost a single image.

I used to do complicated sync stuff but I found that the more moving pieces there were the more I would lose photos for various reasons.


Question about your approach: are you able to partition your Backblaze back-up in order to maintain just a minimal set of photos on your local machine?

Would love to do this:

- Back up local machine via Backblaze

- Periodically delete all local photos, saved somewhere in BB

- Turn regular backup on again - keeping both the local stuff (now much smaller) and the previous bulk save in BB


My answer to maintaining a minimal set of photos on a local machine isn't great unfortunately. The desktop holding all the photos acts like a file server and I connect my laptop to that when browsing images.

There's a way to do it that I find somewhat risky: pay a bit extra for permanent storage. Normally Backblaze will remove deleted files in 30 days (I pay for a year). If you do permanent then you can delete your originals from your local machine and Backblaze will keep them presumably forever. That approach makes me nervous personally :)


Interesting. Wasn’t aware of that option. And I agree that it feels strange to sign that data over to someone/something in that manner.


I use Backblaze as well but I'm disappointed the company doesn't support backing up Application folders or system folders. They also stopped supporting Mac OS X older than 10.13.


I’m much the same but with an additional local backup: iCloud Photos (with one computer set to save all originals), Time Machine backups to a local NAS, plus BackBlaze for offsite.


I wanted the photos outside the Time Machine bundle. It was (and is) a horrible process, but I quit Photos.app, rsync them to a NAS, then reopen Photos and post the result (success or fail) to Slack.

Getting to the photos was horrible, so I made an alias of the folder that contains the files (which is in the library bundle).

It’s all on a VM so it sort of out the way.


Use Google Photos on my android phone, then use gphotos-sync[0] to sync the files to a hard drive on my DIY NAS. Contents of hard drive are periodically backed up with restic[1] to B2[2].

My reasoning is that I don't trust Google not to lock me out of my account at some point, so having both a local and a remote backup gives me peace of mind. I periodically check the offsite backup to make sure it's still all working. Total cost for about a terabyte of files (it's not only photos and videos) is about $6/month, which is pretty reasonable.

[0] https://github.com/gilesknap/gphotos-sync

[1] https://restic.net/

[2] https://www.backblaze.com/b2/cloud-storage.html


Doesn't Google reduce the quality of them all when they're uploaded? Not that it's really material in most use cases but I find myself having to do manual backups from my phone.


Nope - you can switch between "Original quality" and "Storage saver".


I'd love to use the Google photos API but it's really frustrating that Google won't include geolocation data.


Awesome, I came in here hoping for a google photos downloader!


It's a little fiddly to set up, but once it's working it works flawlessly. This blog post was helpful in setting it up - https://ubuntu.com/blog/safely-backup-google-photos.


While it won't yet solve all your backup needs (still early days), a few friends and I have been working on a privacy-focused competitor of sorts to Google Photos, but a bit easier for sharing your albums with friends and family. It is free for the time being, but it will be a paid solution, since you're not paying with your data. It's also E2E encrypted. While all photos are currently hosted by us, we're also exploring letting you connect the app to your own cloud or storage solution, for those who want full control. If you're interested, it's called StoryArk: https://storyark.eu/


Some of my fantasy thoughts about a photo storage service: can I ask Siri, Google, Alexa (or any smart assistant) when the last 3 times I saw ____ were?

Can it show me emails, photos, tweets, etc of the related meetings/events?

Can I take a photo or upload a photo of someone and provide as many details as possible? In relation to me personally and them alone as a person?

Could it be a celebrity or a person from the past (no longer living), and could it try to find a connection, related things, etc.?

Would this also be helpful for parties, weddings, funerals, and more?

Could I ask it to try to find a relation between two people?

What if you connected it to something like Lexis Nexis? Be able to display connections in different chart types? bubble, connected graph, or graph commons: https://neo4j.com/blog/creative-network-science-graph-common...

Can a timeline of photos help to tell a story? Act as a guide or reference? Add more clarity to situations/events? Could we offer this as an add-on to weddings and other events?

Can we create a linear timeline of photos, dates, times and locations? Could we also show related influences, like songs, albums, or a family history/tree? Topics to disable: lyrics, alcohol, ??

How to access/merge other Apple Photos accounts? AI photo collage, web based, download as video or PPT.


Thank you for sharing - these are interesting ideas.

At the moment we're focused on how we can tie together shared events and memories of your personal story (hence the name) into one place. Currently this is primarily from photos, but as you've suggested there is a lot one can do with how those photos are connected to time, location, each other, and other possible connections. This is what I see missing in most photo backup tools, since they're primarily archival and aren't designed to tie ephemera together into a permanent catalog of your experiences.


I have 3 NAS systems based on FreeBSD + ZFS, all over 10 years old, in 3 different locations: my home, my parents' home, and my co-location.

It is just ZFS snapshots with replication. I wrote my own shell scripts for this that take daily/hourly snapshots. This scales well to multiple TBs; bandwidth is no issue, as only changes since the last snapshot(s) are replicated.

After 10 years, I've replaced only 2 hard drives of about 30. I do upgrade FreeBSD once every second or third year. Nothing else, it just keeps going.

This project has probably cost me a week at most in configuration, setup and maintenance over the 10 years.
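A minimal sketch of such a snapshot-and-replicate script (dataset and host names are invented, and the real scripts presumably handle errors, first-run bootstrap, and snapshot pruning):

```shell
#!/bin/sh
# Take today's snapshot and send only the delta since the newest
# existing one to the offsite box. "tank/photos" and "backup1" are
# made-up names. Run daily from cron.
TODAY=$(date +%Y-%m-%d)
PREV=$(zfs list -H -t snapshot -o name -s creation -d 1 tank/photos \
       | tail -1 | cut -d@ -f2)

zfs snapshot "tank/photos@$TODAY"
zfs send -i "@$PREV" "tank/photos@$TODAY" | ssh backup1 zfs receive tank/photos
```

Because `zfs send -i` ships only the blocks changed since the previous snapshot, this stays cheap even for multi-TB datasets, as described above.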


With the electricity, hardware, space, and time cost it might be cheaper to have just one and sync to a hosted ZFS service. Of course, those didn't exist 10 years ago, I don't think.


Not really - space has been free for me, and electricity is usually very cheap here in Norway. The hardware is used gear from eBay - though these are actually desktop AMD X2 machines with 8 GB DDR2 ECC memory on Asus motherboards. Still working fine, but I am planning to replace them soon.


My problem with this approach, and many of the other ones (including my own, at the moment), is that 99% of my relatives wouldn't know how to access such a system in case of death or emergency. This is why the trustee for post-life access to my info is actually a friend who works in IT like me; but I've not sorted out pics yet.


This is a problem I'm trying to solve as well. My main repository is ZFS, which IMO is a no-go for postmortem access. I don't trust that online accounts, especially paid ones, will not be locked out due to inactivity or lack of payment. It may be several years between departure and access.

I have 3 separate use cases: casual access to shared memories for which I'm the family steward (photos, videos), motivated access to personal important documents, and motivated access to other personal files.

I'm not about to leave an unencrypted drive full of personal info in a family member's house, as the risk of theft is strictly greater than just having one copy. My current plan is to use ext4 + LUKS to satisfy the last 2 scenarios, which I think stands a reasonable chance of working for anyone slightly techy (most modern Linux distributions will simply prompt for the container password when you try to access the encrypted drive) and is likely to enjoy long-lived support for at least a decade or 2. I have a techy person in my family; not sure I'd do this if I didn't... For casual access to shared memories, I plan on leaving an unencrypted partition. While it lasts, these media are also available on a family-only photo album I put up on AWS.

I'm considering using a laptop as the vessel for the encrypted drive, with a suitable Linux distro pre-loaded and instructions on the desktop/printed out and kept in an adhesive document pocket stuck to the machine.


Currently I run a FreeBSD ZFS NAS as well. I had planned to do something similar, but I wanted to give the places I store backups (in your example, your parents) some access to the storage - I figured it would be a nice way of saying thanks. I was thinking Syncthing or something. How do you handle NAT?


All servers are behind a NAT, but the servers are only configured with SSH open. With the exception of the NAS I am using which also has NFS, iSCSI running. Also, only the backup servers are allowed to make connections to NAS I am using. The NAS I am using is not allowed to connect to the backup servers.


In addition to what everyone else has mentioned, my plan (yet to be implemented) is to slowly organize the photos into albums, e.g. "Family vacation to Hawaii, 1995".

For pics that don't fit neatly into an album theme, I'll group them by date ("Family pics 1995-1996"), depending on their quantity.

Then I'll print photobooks, adding info where possible about location, date, and people. Basically, recreating physical photo albums. I'll print 2-5 copies of each album, depending if I expect to send copies to other family members or not.

Having physically printed photos works both as a last-ditch backup and as easy access for anyone to browse. I nostalgically remember spending time as a kid flipping through my parents' and grandparents' photo albums - sometimes with them, but many times on my own. I want to ensure my own kids can have the same experience.


I ran all my old photos through a script that renamed them all based on the exif date. This made it way easier to organize.
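A sketch of the rename-by-EXIF-date idea - the date-to-filename mapping is factored out, and both the helper and the exiftool loop are illustrative, not the parent poster's actual script:

```shell
#!/bin/sh
# name_from_date turns an EXIF DateTimeOriginal string
# ("YYYY:MM:DD HH:MM:SS") into a sortable filename stem
# ("YYYY-MM-DD_HHMMSS").
name_from_date() {
  echo "$1" | sed -e 's/^\([0-9]*\):\([0-9]*\):\([0-9]*\) /\1-\2-\3_/' -e 's/://g'
}

# With exiftool installed, the rename loop could look like:
#   for f in *.jpg; do
#     d=$(exiftool -s3 -DateTimeOriginal "$f")
#     [ -n "$d" ] && mv -n "$f" "$(name_from_date "$d").jpg"
#   done
name_from_date "2014:07:19 16:04:24"   # -> 2014-07-19_160424
```

The `mv -n` keeps an existing file from being clobbered when two shots share a timestamp; a real script would add a counter suffix instead of skipping.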

I have a server at home with a 4 TB drive (XFS) shared out with Samba and, of course, SFTP. There is a second internal drive formatted with Btrfs. Once a day the Btrfs drive is mounted, drive 1 rsyncs to drive 2, drive 2 creates a snapshot, and then it unmounts. Later that night, I run a borg backup and rsync (depending on sensitivity of data) to a time4vps (https://www.time4vps.com/?affid=1881&_ga=2.199835630.1135648...) 1 TB storage server. Currently, that's enough for the stuff I want to back up.


We do something very similar. And we always print at least 3 copies and gift extras to family or friends that took the trip with us, or might like the book. It’s a nice gesture, but it also serves as insurance that if anything happens to our house we have the photo books spread out all over the place.


My photo organization is folders or albums named Year-Event or Year-Place, e.g. 2014-GrandmasFuneral, 2015-Berlin, or 2018-Kenya.


I have 3 layers:

Online: A big BTRFS "raid 1" (block-based mirroring) array. This is what Plex runs off of. If I lose a drive, I just replace it and rebalance. I run btrfs scrubs weekly.

Offline: I have a WD external drive with a plain old ext4 filesystem that I rsync all files to weekly. If my btrfs array shits itself, I just copy back over from this drive.

Emergency: If the above process doesn't work (say: the external hdd is dead when I go to restore from it, or I can't physically find it, or my house floods and everything in the basement is lost, etc), then my fallback is:

I sync all truly critical stuff that can't be replaced (so, basically just family pics and videos) to S3 as deep archive storage class. Costs pennies. I don't do any fancy tar/archiving, encrypting, or compressing (since video and image files are generally already compressed). I figure the risk of people snooping at our family pics is smaller than the risk of me needing to restore from a backup and not being able to because of stupid reasons (lost gpg key, some kind of problem with tar, etc).
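The Deep Archive sync can be a one-liner, assuming the AWS CLI; the bucket name and local path here are invented:

```shell
# aws s3 sync only uploads new/changed files. DEEP_ARCHIVE is the
# cheapest S3 storage class (roughly $1/TB-month) with the trade-off
# that restores take ~12+ hours - fine for a last-resort copy.
aws s3 sync ~/family-media s3://example-family-archive/media \
    --storage-class DEEP_ARCHIVE
```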

This is small, since most of my disk use is from things like rips I've made of BluRay/DVD's, FLACs from bandcamp or rips of music CD's, etc. Things I can recover with appropriate effort.


It took me several years of pursuing a self-hosted solution that doesn't need much time to maintain. I think the current setup is working very well for me.

I have a Raspberry Pi with a USB drive attached. The OS is Ubuntu, so I can use ZFS. The drive is backed up to Backblaze B2 using restic in a cron job. The backup and ZFS health are monitored with healthchecks.io.

The data sync across all the devices including the raspberry pi is done using resilio sync.

I view my photos when I am home using an app written by myself: photograf (https://github.com/ptek/photograf) It runs on the same raspberry.


I have a very similar setup. An old SFF desktop PC running Debian with a USB3 disk attached, though mine is just running ext4. SMB sharing allows easy access to everyone in the house, plus I have Syncthing configured on our phones and the Linux box which we use to automatically sync our phone pictures and videos. Then this box is backed up using Restic to Backblaze B2.

I know there's some holes in my plan (ext4 isn't foolproof, etc.) but it's been an ongoing process.


Regarding posterity... I think when we die no one will care much about our data. I recently wrote a will, and this included documenting my bank accounts. A death notice is sent to the banks when you die, and whoever is specified in the will can access the funds. This ticked a big box in my mind and resulted in me dropping some unnecessary tech.

You have got me thinking though about my backup of family photos/videos. My backups are all too clever, encrypted and inaccessible at the moment. I feel inspired now to make sure photos/videos are easy for my family and future generations to access.


Google Photos with family sharing and Original quality on a paid Google One plan. I used to have a NAS-based solution which was backed up to S3 etc. I even wrote a photo browsing server for my NAS, but it became too complex as family devices multiplied among people with different levels of skill.

The ability to search photos easily over time and the weekly "this week X years ago" reels are great value.


To save the environment in terms of redundant storage, I just upload them directly to the NSA cloud service. The free tier plan is pretty good.


But their free tier doesn't allow restores.


lol! I think we're all uploading to the NSA's cloud service, whether we want backups or not!


Related question, how does everyone get the family photos grouped together in the first place?

My family are all taking photos on their own devices during a vacation, holiday, gathering, etc., and then someone creates a shared album in Photos (iOS) where everyone just dumps all the photos taken. It's a complete mess in my opinion. We have shared albums "Christmas 2018", "Holidays 2019", "Xmas'20", etc., depending on who named them.

Everyone else seems to like the ease of upload and ability to find past pictures. Getting the photos out of the apple app seems like a pain to me. I feel like there has to be a better way, but it has to work for the lowest common denominator (Grandparents, etc).

We have a couple of people on Android who are essentially cut off from the family photo stream. When that happens, everyone's mentality is, "get an iPhone".


I use Google photos with Google One for sharing with family. My mom can figure it out and that says a lot about how easy it is.

Works across all grandparents, and I used their auto-upload-a-face feature (I'm sure that's the technical term for it) to automatically send photos of the grandkids to the grandparents. This makes the whole system worthwhile - I get comments from them constantly about how much they love the auto photo albums.


Google Photos has been a lifesaver. Photos my wife takes are automatically shared between us.

If we have pics from someone else, I usually create an album and ask them to upload their pictures there. Not everyone does it - some people just share over SMS and then it's likely those pictures will get lost.


Sync everything to Google photos -> Twice a year, do Google takeout and save it on a hard drive, throw hard drive in a fire resistant safe

It's cheap, easy and ransomware proof.


Since I am also an avid DSLR photographer, the first decision I made was to use Adobe Lightroom (Classic) as the "single source of truth" to manage all our photos.

This means obviously importing all photos taken via my DSLR into Lightroom, but also syncing all photos taken on our iPhones via the Lightroom Mobile App.

Lightroom Classic keeps all the (compressed) photos in the Adobe cloud for easy sharing and browsing, but also writes them out (unaltered) to a directory on my local NAS.

This NAS gets automatically backed up via Arq Backup (https://www.arqbackup.com) to an encrypted Amazon S3 bucket. Additionally, once or twice a year I create a versioned copy of the NAS via Carbon Copy Cloner (https://bombich.com/) to an external hard drive. This hard drive is stored offsite somewhere safe.

In a nutshell, for around $12 a month + a NAS + a hard drive, we have all the convenience of the Adobe Lightroom cloud combined with a local copy on the NAS, a cloud copy on S3 (in case Lightroom cloud gets corrupted) and an offsite copy (in case our place and the whole internet burns down :-)).


You could also make the Arq backups in your S3 bucket immutable, for more ransomware protection.


That's why I love Stefan and Arq Backup :-)


I posted about my homebrew solution for photos/documents/scan archiving with a local NAS + encrypted offsite backup on a VPS last week at https://news.ycombinator.com/item?id=29863822#29888273 . My use case is 300-400GB of photos which is affordable on a 4TB VPS.

In short, I have an old desktop-turned-NAS with ZFS that will tolerate 2 physical drive failures, and I access my photos over a network share. It runs a weekly cron script to copy an incremental, encrypted borg backup of the photos offsite over sshfs to my VPS. I can mount a specific date's snapshot over sshfs back to the NAS or another machine if I ever need to recover anything, without having to enter the password on the VPS.

It's a good question on what would happen if I died. Thanks for getting me to consider this. My spouse wouldn't have a clue with the VPS or how to fix the NAS but could pay someone to easily copy the data off. I should leave a note taped inside the case. For now I'll plug an external ntfs drive into the NAS and if needed she could plug that into any other device.


External HDD that's backed up to Google Drive. If the drive fails, it's safe at Google. If google cancels my account, I had it on my local HDD. External drive is pretty cheap and so is Google Drive storage.


I trust survival of the fittest on my digital data, scattered around a small pile of old laptops, a box of unordered hard drives that used to belong to a RAID, and the current working computers.

Life is already complicated as is.


Stored on a Synology NAS, and indexed by PhotoPrism. I backup the entire NAS to a Google cloud encrypted bucket every night, and once a quarter or so update a backup on a solid state drive I keep in a safe.


Do family members know how to access said system if something were to happen to you?


My husband has access to the PhotoPrism web UI, but he'd have no idea how to access a backup or anything like that.


I wrote Timeliner [0] and am writing its successor, Timelinize [1], to download all my family's stuff to our own computer.

[0]: https://github.com/mholt/Timeliner

[1]: https://twitter.com/timelinize


Basically iCloud plus AWS S3. Every so often I dump old ones into S3, zip them and put them in the glacier instant retrieval tier. Super easy to use (drag and drop) and cheap as hell. I pay less than a dollar per month right now.

You’ll obviously need to leave keys, instructions etc for your next of kin. Or put a hard drive in a safe deposit box.


All computers in the house back up regularly with borgbackup to a small server (a 2010 MacBook with an attached USB drive). This is done by hand, but there is a script that periodically checks that no backup is falling behind.

Then I have a custom script that encrypts and replicates all the borgbackup repositories on Google Drive (where I have unlimited space because my previous university still hasn't deleted my account).

Ideally I would also replicate everything on a drive at some relative's house, but I haven't had time to set up this yet.

Overall I am rather pleased with this setup. This is for general backup of everything. Specifically for photos, I have a script on both my and my wife's Android phone that copies all photos and videos to a shared directory on the MacBook, and she periodically organizes all the photos there. Then they fall in the main backup.


Well, I don't store photos or movies, but there is some audio data and total system backups. I'm currently using Cloudberry + Wasabi and pay about $10/month for 1.7TB data on a European server. There is no good way around paying if you want some cloud storage to protect against fire damage, etc. You could ask friends to host servers for you in exchange for you offering disk space for them, but this is very fragile and incurs too much administrative burden in the long run.

If I were you, I'd delete 90% or more of those photos and videos to save running costs. I don't know how enthusiastic you are about watching photos of yourself when you were shitting your pants as a toddler, but, speaking for myself, I'd consider being urged to watch even a small fraction of what you have at family reunions some form of torture....


> What is your system for backing up family photos and videos to stand the test of time?

In addition to the other good answers, a box full of printed photos will stand the test of time.


professionally printed ones, yes. However, finding a good print service has become a real hassle and most photos that I printed start to fade after 10ish years.


The anecdata of my random Walgreen photos seem to be holding up pretty well in the living room and hallway environment that they live in. We have some done from their "photo copy" machines from 20 years ago. Mind, I don't have master originals to do a side by side comparison, but using the Mark 1 Eyeball, they look fine. They're doing much better than my 50 year old polaroid.

While I can print photos, it's typically far easier and cheaper to have Walgreens do it.


Sure, until your house burns down.


1. Encrypted external USB drive as a live copy

2. Encrypted external offline USB drive onsite

3. Encrypted external offline USB drive offsite

4. Upload of an encrypted archive to the cloud storage

It's all manual so sometimes some backups are months old. Haven't lost a single digital photo for almost 15 years now, and have gigabytes of obsolete photos...


Mine is similar, but 1 is an internal drive, and I have almost no cloud storage.

> It's all manual so sometimes some backups are months old.

Do you at least have a script for backing up the live copy to onsite? Like plug in the drive, enter encryption key, then run script?


Yes, a script grabbing the paths specified in a backup configuration file and calculating the checksums of the created archives. No additional encryption at this point, as the external drives are LUKS-encrypted. The setup works seamlessly only with Linux machines though.
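A minimal sketch of that kind of script, assuming a plain-text config file with one path per line and an already-mounted LUKS drive (all names and the demo data below are invented for illustration, not the commenter's actual setup):

```shell
#!/bin/sh
set -eu
# Stand-ins so the sketch runs standalone; in real use, DEST would be the
# mounted LUKS drive and CONF the hand-maintained backup configuration file.
DEST=$(mktemp -d)
SRC=$(mktemp -d)
echo "demo photo" > "$SRC/photo.jpg"
CONF=$(mktemp)
echo "$SRC" > "$CONF"            # one backup path per line

while IFS= read -r path; do
    [ -d "$path" ] || continue
    name=$(basename "$path")
    archive="$DEST/$name-$(date +%Y%m%d).tar.gz"
    tar -czf "$archive" -C "$(dirname "$path")" "$name"
    # record checksums relative to the drive so they verify after remounting
    ( cd "$DEST" && sha256sum "$(basename "$archive")" >> checksums.sha256 )
done < "$CONF"

# verify the archives before relying on the drive
( cd "$DEST" && sha256sum -c checksums.sha256 )
```

Re-running `sha256sum -c checksums.sha256` on the drive later is what catches the slow bit rot the thread is worried about.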


We pay for Apple iCloud 2TB plan. Sync all photos & videos to local external hard drive. And then periodically copy local Photos library to NAS.

Then I share my Amazon Prime backup with my wife. All our photos are backed up there but not videos.

For videos, I upload edited videos on YouTube as unlisted or private. And some of better videos as public on Youtube, Facebook, and Instagram.

And finally I try to share as many photos and videos in emails, Whatsapp, with friends and family. Sort of distributed back up in the worst case scenario.

(Also never delete any photo or video unless you really don't want anyone to see it ever. I used to extensively prune my library but wish I had not. Storage will likely keep getting cheaper. And AI will get better and it will be able to create better slideshows and edited videos.)


I use iCloud, and would pay for a solution that allows me to automatically archive on a monthly basis to Amazon Glacier. Anyone know of such a solution?


I'm the author of HashBackup. I'd advise against using Amazon Glacier for pretty much anything, because retrieval requires a lot of mental, financial, and procedural gymnastics (4-hour delays, spacing out retrievals to keep costs down, etc). I implemented it for HashBackup when it was first announced, it was a nightmare, and I deprecated it a year or two later because alternatives like Backblaze B2 are nearly as cheap and have none of the retrieval headaches.

When talking percentages, sure, Glacier is $4/TB and B2 is $5/TB, so Glacier is 20% cheaper. But if you have to actually retrieve your data, it will likely cost more than any savings you've accrued. If you take a look at the S3 pricing page and your head isn't swimming afterwards, you probably don't understand the complexity of it. For example, they have data "retrieval" fees that are sometimes per request, sometimes per GB, and sometimes free (they promote that of course). But there's still that pesky 9 cents/GB data transfer fee that's on top of the per-GB retrieval fee, putting you at 10 cents/GB to retrieve data. Glacier also needs special tools: S3 tools can't access data stored in Glacier.

If you really want to stick with Amazon, I'd suggest S3 One-Zone Infrequent Access. It's a little more expensive, but you can use standard S3 tools to get your data back. You'll still have that 9 cents/GB retrieval cost, vs 1 cent/GB for B2.


I researched this a little yesterday. I am using a Synology NAS, and their Photos app lets you easily sync new pictures. It also allows syncing to cloud storage like B2 or Glacier.

I don’t want to download the old photos that are just on iCloud to my device, but there seems to be this python script which I will try to use tonight: https://github.com/icloud-photos-downloader/icloud_photos_do.... I assume that you could combine this script easily in a cron job with RClone which is rsync for the cloud: https://rclone.org/
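If the downloader works out, gluing it to rclone in cron might look roughly like this (the schedule, paths, and the `b2:` remote name are placeholders I made up, and the icloudpd flags should be checked against the README of whichever version you install):

```shell
# crontab fragment (illustrative only)
# 02:00: pull new iCloud photos down to a local folder
0 2 * * * icloudpd --directory /volume1/photos/icloud --username you@example.com
# 04:00: mirror that folder to cloud storage with rclone
0 4 * * * rclone sync /volume1/photos/icloud b2:family-photos-backup
```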


I wanted to use icloud. But I always got confused as there are different versions of the same media contents (on cloud, on the laptop, exported or non-exported? duplication between devices). And the cost.

So I gave up. Unfortunately I still have to manually export all media from the i$$$ devices and move it to a Linux/Android-based system. "Move" really means that, because I'll never pay for iCloud again.


What different versions were you seeing? Everything just gets uploaded to iCloud automatically.


iCloud is so weird for someone who is used to simple files and folders.


An rsync script can pull the files out of the library bundle and then put them anywhere. Run via a cron job, it's fairly good.

I have no prior experience of such stuff and managed it (off to a Synology and from there to Backblaze).

Getting stuff out of iCloud on an automated and regular basis is surprisingly hard.


Assuming you've considered using something like Arq on your photos library folder, mind sharing why that's not a great solution?


I’m looking to make this simpler and more efficient, but here is my present setup for our family (no extended family yet).

We use Apple Photos as the primary, of which I export a backup once in a while (at least yearly for sure). Because Apple hasn't solved what everyone asks for (a way for two or more users to merge Photos libraries), I use Dropbox[1] on the other devices (wife, daughters) to collect everything into "Camera Uploads".

I split the content into "Screenshots" (PNG), Videos, and Photos with Hazel[2]. Photos are imported into the primary Photos library; screenshots and videos are either deleted or sorted weekly (a few minutes), monthly (30-min-ish), quarterly (30-min-ish), and yearly (about 3-5 hours on a day around the end of the year, along with my other digital clean-up tasks) during the digital chores that I do. My daughters produce a lot of them that I need to throw out; one of them can position her iPad and video-shoot while she talks her way through cleaning her desk for an hour.

The photos and videos are now in my primary device iMac (for now), with a second backup on a 2012 MacMini running many other digital errands. The 2TB iCloud+[3] is good enough for now.

I would definitely love to clean up these steps and move to non-proprietary formats (read: non-Apple) and open-source tools/processes.

I tried Google Photos because it was easier to combine things, and I even had one of their large plans, but it turned out to be one of the messiest things I tried. Stopped using them[4].

I will be reading and learning from the other interesting comments in this thread. Thanks to all.

1. https://www.dropbox.com

2. https://www.noodlesoft.com

3. https://www.icloud.com

4. https://brajeshwar.com/2021/how-to-delete-all-photos-and-get...


Can you tell me a bit more as to why Google was so messy? The blog post doesn't go into details. Thank you.


Every year I do a photo book. I used to use Apple, but they discontinued the service, so now it's Snapfish. It's a lot of work to create but worth it.


I do the same. Whitewall are very good quality. They also make excellent gifts.


I really like the idea of an annual photo book. Might steal this.


Hard drives with multiple copies in different places. And you need software to manage those photos and videos for you; Google Picasa was good in the old days, until Google killed it and replaced it with Google Photos. All the major players provide photo cloud services, and for some people that solves the problem, while others believe digital assets should be managed by their owners: stored locally and backed up locally as the primary copies, with cloud backup as a tertiary, complementary copy.

The price of existing cloud storage is too high, and some of the companies in this business (Shoebox, Canon Irista) have gradually shut down their services. This is a money-losing business; it's not efficient to manage huge amounts of assets centrally (the Flickr CEO's open letter last year confirmed this), so they have to either make it more expensive or make you the product. Cloud services are convenient: people don't need to buy expensive hardware, don't need to be professionals to maintain it, and don't need to worry about the energy cost of keeping it running 24x7. But things are changing: single-board computers are getting cheaper, more powerful, and more energy efficient; storage is getting cheaper with larger capacities; software is getting more intelligent; and people have more and more concerns about privacy. It's now viable to host a photo service, your own private cloud, at your own place.

There are tons of open-source alternatives out there, but none of them provide features competitive with Google Photos. There are some backend services like PhotoPrism that do a very good job of indexing photos with AI, but the mobile app is missing, and Google also has the ecosystem to show photos/videos on Google Home and Chromecast.

We've been building Lomorage (https://lomorage.com) to try to fill the gap. It's easy to set up a self-hosted service (it still needs a lot of work to be easy for non-tech users), it's cross-platform, has iOS and Android mobile apps, supports multiple accounts, and has some basic AI search (not polished, needs more work). There are other features missing, but it's stable enough for daily use. Would appreciate it if you tried it out and gave some feedback.


The solution I've used for the last 2-3 years:

1. a cronjob on my android phone (via termux) does an rsync to a VPS

2. the VPS has a cronjob to sync everything with S3

3. the frontoffice tool for people to access those photos is my open source Dropbox like frontend that is bring your own backend: https://github.com/mickael-kerjean/filestash My wife and I got an account and family members can access it through shared links.

The S3 bill goes to a shared account so that If I die, the VPS will probably be quickly removed but S3 should stay in there with my wife paying for it.
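Steps 1 and 2 above could be sketched as crontab entries like these (hostnames, paths, and the bucket name are invented for illustration; Termux needs a cron implementation such as cronie set up first):

```shell
# On the phone (Termux): push new photos to the VPS nightly over ssh
0 1 * * * rsync -az ~/storage/dcim/ backup@vps.example.com:/srv/photos/
# On the VPS: mirror to S3 afterwards (assumes awscli is configured)
0 3 * * * aws s3 sync /srv/photos/ s3://family-photos-bucket/
```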


But what if S3 or Amazon dies before that?


S3 is just another copy of what is already on the VPS


nice


Parallel question: formats

I keep all my photos in the "native" format, which is basically JPEG and whatever my iPhone is spitting out these days. They are currently editable (which for me mainly means crops, with occasional other fix-ups). I scanned all my old negatives, APS, and slides and had them all saved as lowest-compression (highest-quality) JPEGs (I didn't have a huge choice of formats).

But I have a bunch of videos from film, tape, and cards. What's the best equivalent (supporting simple crops and edits) that might reasonably be editable in the future?


I burn 25 GB M-DISCs every few months.

If I were being properly rigorous I'd be burning three copies and putting two of them in different physical places.

Printing out the occasional photo book of our favorites is also a part of the strategy - if our grandkids want to see photos of our lives after we're dead, they won't need to worry about lapsed hosting bills or ancient storage technology. They can just look at the book.

Obligatory reference to my rant about data preservation:

http://howicode.nateeag.com/data-preservation.html


I find 100 GB mdiscs somewhat cheaper per GB than the 25 ones.


Huh, thanks for the heads-up.

IIRC, when I settled on the 25s, they were a slightly better price/GB than the 100s. Maybe that's no longer true (or possibly never was).


Plus one to this. A Blu-ray is durable, maintenance-free, and, most importantly, extremely common. Anyone can use one, and this will likely be true for a long time. There will be people around who can get data off Blu-rays for an exceedingly long time.

But printed images are the ultimate. They require no extraction whatsoever.


For video files - a little off-topic - I created an app for dealing with a huge number of video files: Video Hub App, which lets you see previews from your videos (even if the files are scattered across many offline hard drives). Might come in handy for someone.

https://videohubapp.com/en/ - MIT open source: https://github.com/whyboris/Video-Hub-App


Tech solution: iCloud, which mirrors them across all devices, then automatic backups of a Mac and a PC where those files are stored into a RAID nas.

Non tech solution: I just print everything. I refill canon cartridges, making the cost of ink negligible and buy cheapish 4x6 photo paper, and I just print like crazy. After some basic calibration, the prints are quite good enough. I guess longevity of the aftermarket inks is still to be seen. Wife enjoys just picking up some prints to send friends and family. She snaps a pic of whatever she sent, and I reprint those.


The main storage ("media" partition of ~450GB hosted on a DL380G7 with a DS4243 array (mdadm/lvm) "under the stairs") is kept in sync (using rsnapshot) with a secondary storage (SS4200 with an external eSATA enclosure) in another building on the farm - but hosted on the same electrical branch and connected to the same physical network so it is vulnerable to any spikes which make it past the quite elaborate power filtering I installed when I got fed up with repairing or replacing hardware after every thunderstorm. There is a separate backup to a set of spare drives on the DS4243 meant for quick access, e.g. when some experiment has gone wrong. Separate from this I make encrypted regular backups to online file hosting services (currently using free accounts on Mega) as well as to storage hosted by family abroad. I've given up on making backups on DVD a long time ago since this was simply too much work for the relatively small benefits.

The chance of all of these versions becoming unavailable is low enough for me not to worry. If I end up under a tree some day nothing changes in the short term, the data will be available to those who have not yet met their tree. In the longer term I'd expect someone to either pick up where I left off or move the data to some storage which befits their level of expertise and needs. I don't want to force them into any contract with any provider, nor can I decide what they deem to be worth keeping.


I am working on a fully decentralized application to solve this problem. The application is a cross-OS Nodejs app that provides an OS like GUI in the browser to display the file system of the local machine and other trusted machines. Network file copy is currently broken pending completion of a major refactor.

https://github.com/prettydiff/share-file-systems/tree/master...


Tangential discussion: my daughter recently asked to look through our family videos and we've been getting through them the last few nights. It's wonderful. Both kids are enjoying seeing themselves being ... themselves, embarrassingly unbound by the learned shame and anxieties that come from moving into teenage-hood.

I'm glad my backup and storage setup has stood the test of (thus far a relatively short) time.

Most videos are from mobiles, and I use syncthing on each device to automatically back them up to NAS. The NAS has an external USB drive to which back ups are directed, plus another NAS in the shed as a secondary on-site backup - which also has an external USB HDD that gets backed up to. All of the backups other than mobile device to NAS via Syncthing are done with crontab'd rsync scripts.

I then irregularly rotate portable USB-caddied HDDs from my parents and in-laws houses and backup to them as well. This rotation is well overdue, thanks for the reminder.

I don't do cloud. It feels too ephemeral in the context of my history with online services - I change my mind and switch around a bit, and lose interest here and there. I'm the kind of person who would host their own email server if I wasn't already gigabytes deep in a grandfathered free custom domain plan from Google. (I do actually host my own email server using Citadel, but it's a domain that's very rarely used and entirely non-critical).


Love this question - been through a few iterations. For context I'm an amateur photographer (portrait) so I have a lot of RAW files that I tend to keep (almost 3TB now since 2010) from my various Nikon cameras (D90 and D7100 - not amazing but they do the job!)

I have a primary SSD for my OS and applications, a secondary SSD where I store my latest/active Lightroom catalogues/files and a large 4TB (spinning disc) hard drive for local backup.

After a shoot I'll import the files onto my secondary SSD and then manually copy them over to the local HDD once I've done my picks and culled things a bit. That HDD is automatically backed up to Backblaze so there's always an original copy of the files either local or in the cloud. Any images that I edit/pick/export will be manually uploaded to Google Photos (usually in their own album) and I try to annually pull down files from Google to go into another 'jpg/exported' folder on my HDD that's backed up to Backblaze. My wife and I also have our iPhones backing up to Google Photos automatically plus we have Apple storage/backups - so we tend to have double coverage on most things.

Some of it is a bit overkill probably but I'm particular about having access to all my photos and I've definitely gone back to look at and edit old photos/shoots so it's worked out ok for me so far.


I uh, kinda just don't. I save some pictures to Google drive and otherwise just embrace ephemerality.


Yeah, it may sound strange but my most important memories are smells, and you can't store them yet :(


At least future folks won't have to throw away heavy boxes of our photos, while feeling guilty about it. They just won't bother to recover our cloud accounts, when we die.


Another side to this problem is that so many online services are structured as a contract between one person and the company, when what I really want for family photos and documents is something much more family-focused. In particular, if I die, it should make zero difference to the relationship.

I sometimes wonder how much of the way things are is because that's just how the US legal system is, and how much of it is because that's the worldview of the childless twentysomething singles who built these systems. (No judgment: I know when I was a twentysomething childless single building things, I had a person-centric world view, not a family-centric one.)

(An extreme example of this problem is the oculus quest, a device that is begging to be shared between family members, but it requires a Facebook account to use. Further, it is a breach of the terms to have a family facebook account. Nobody in our family with a facebook account - i.e. the older people - wanted to sacrifice theirs to the quest.)

It's weird/sad that a family doesn't have any sort of legal personhood, but a corporation does. A joke solution here might be to incorporate my family and have all these digital assets belong to the family corp. I'm not even sure if I'm joking.


I store them on my NAS, and the NAS synchronises with Amazon Glacier, a very cheap storage service designed for cold backups: if you request a restore, it can take 24 hours for Amazon to provide it, unlike Amazon S3, where you can access the files instantly but which is about 10 times more expensive. But you should only need to request a restore if your house burns down or your NAS gets stolen, which hopefully happens rarely in a lifetime.


What I use is an old PC running a NextCloud instance. It has an SSD for the OS and the data sits on a ZFS pool made of two HDDs in mirror mode. I chose ZFS because I was worried about silent data corruption (it happened before in my large photo archive even if it isn't nearly as large as yours) and because when I set this up, there was still a lot of discussion on whether BTRFS was reliable or not. If you wish to replicate this now, I think you should re-evaluate BTRFS (assuming you will be using a Linux distro).
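For reference, the mirrored-pool part of a setup like this is only a few commands (device and pool names below are examples; point them at your own disks):

```shell
# Create a pool named "tank" from a two-disk mirror
zpool create tank mirror /dev/sdb /dev/sdc
# A dataset for the photo archive
zfs create tank/photos
# Scrub periodically so silent corruption is detected and repaired from the mirror
zpool scrub tank
zpool status tank
```

The periodic scrub is what actually turns the mirror into protection against silent corruption rather than just drive failure.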

For the cloud backup, I think that you will be saving money by using backblaze [1].

As for the solution for posterity, if you're worried about the possibility of next generations not being able to pay, the only solution I see from the top of my head is to print the photos. Given the amount that you have this would be a daunting task, so only print the ones that you find the most valuable and look for a "safe and stable" (whatever this means... :D) place where to store them.

[1] https://www.backblaze.com/b2/cloud-storage.html


We are a no-google/apple house. My photos come from various phones and two cameras. The phones all have the same Mega account on them and the photos get synced to "Phone 1", "Phone 2", "Phone 3" folders, and the whole Mega account syncs to a desktop.

The cameras have Wifi cards and sync to the desktop that syncs to Backblaze.

On new years day each year, the desktop gets archived to a external HDD and put in a safe.


My PC has a specific drive/partition (a single hard drive with a single large partition) allocated to long-term archival use - in other words, a 'write once, keep forever' policy.

That partition is backed up daily to two separate external USB hard drives using rsync.

That gives three separate copies. One of those copies is also rsynced to a separate laptop, so a total of 4 copies.

I don't store archival stuff like videos, photos, music in my personal directories and scatter them all over the place. An archive is an archive and needs to be centralised and then distributed outwards from there.

> Is there a good solution for posterity? For example, once I die

Unfortunately, you can't control posterity from beyond the grave. What you consider priceless is something that your descendants maybe can't wait to get rid of. I learned this lesson when my brother died. I started off trying to keep as much as possible of his stuff, but as I sorted through it I found that it was worthless to me so I ended up discarding about 95% of it. I saw that when I died, somebody else would do exactly the same thing with all of my carefully-hoarded information.


I have been using Google Photos for the past 9 years with no problems. Last year I decided to try out OneDrive. If any of you are using OneDrive, please keep a backup somewhere else, or at least don't use the selective sync feature, which only syncs some files and folders locally. I used selective sync, then toggled it off by accident; OneDrive created all the folders and was downloading the files when I decided to interrupt it and turn selective sync back on. A few months later, I found out that this little incident made OneDrive assume that my newly created empty folders were the right thing to back up, and I lost a decade of photos. Yes, I know it's 100% my fault; I should've known better than to trust Microsoft with anything. And to be clear, when I say trust, I don't mean that I blindly trust Google either; I just trust them to be competent.


Seeing the OP's question, I might add another feature that I feel is tricky. Backup of the data itself is pretty simple (I guess most HN users could easily make a simple upload/backup app for photos).

- But what about searching... That moment when you're looking for a specific memory, or the latest photos of your dog, or searching "pool" to find a specific pool.

- Dupe detection.

Those are the features that make, for example, Dropbox an inferior tool for the job.


Shit yeah, curation is much more time consuming than backup. As true for family photos and videos as it is for music, movies, and TV shows.

I put photos and videos in separate folders, then a subfolder for year and a subfolder for month. Within the per-month folders I rename those photos/videos from their defaults to something that describes their content, eg:

"Daughter cracking the shits in egg and spoon race"

Or

"Daughter getting chased by duck".

The ones we want to re-live.

This takes a long time reviewing each and every photo and video though. I'm only up to 2014...
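The year/month folder layout can at least be automated, leaving only the descriptive renaming by hand. A hedged sketch using file modification times (real photos are better served by EXIF dates, which needs an extra tool like exiftool); the paths below are demo stand-ins:

```shell
#!/bin/sh
set -eu
SRC=$(mktemp -d)      # stand-in for the camera dump folder
DEST=$(mktemp -d)     # stand-in for the sorted archive
echo "demo" > "$SRC/IMG_0001.jpg"

for f in "$SRC"/*; do
    [ -f "$f" ] || continue
    ym=$(date -r "$f" +%Y/%m)     # file's modification time as YYYY/MM
    mkdir -p "$DEST/$ym"
    cp -p "$f" "$DEST/$ym/"      # copy, don't move, until you've verified
done
ls -R "$DEST"
```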


Haven't used it yet, but want to set it up when I get some free time:

https://photoprism.app/


I use a combination of Google Photos and Dropbox. GPhotos is great for searching photos, sharing, and viewing on Chromecast. But I don't like that they can transcode your data. Supposedly they don't if you have the storage space, but an accidental app setting can still mess things up.

Then there's Dropbox. It automatically uploads all my raw data, and syncs it across multiple devices. So at least I have local and cloud backups of all my raw photos/videos. A picture taken on my phone camera will immediately appear on my PC desktop hard drive, as long as it's on. The downside is searching through photos can be a slog compared to GPhotos, but that's why I use both.

I used to use Picasa many years ago, but haven't reinstalled it on a newer computer. I've been meaning to run it through my Dropbox collection again to make it easier to sort through locally. IMO, it's still the best photo management software. As far as I know, it's still the only one that does facial recognition locally and without a subscription.


Picasa is no more, I thought?

https://picasa.google.com/


I use a NAS and cloud storage. I recently got a new Synology NAS and, since it supports Docker, it’s straightforward to run nearly whatever software you want.

They have a decent enough Photos app for browsing / sharing photos.

For offsite backup, I send to B2, which is ridiculously cheap but which I pay for, so I'm not the product. I haven't put Cloudflare in front of it, but that's something extra folks do.


Same plan for ALL data.

Recent Local + Recent in Cloud(DropBox) --> Copy on Home NAS (Synology) + Local NAS weekly Backup (External USB) --> Continuous Cloud Backup (BackBlaze) --> Offsite Copy Annually

Copy all important and historical data to a cloud storage solution. Currently using Dropbox.

Export photos to home NAS (Dual Drive) and NAS Backup to external drive weekly. Daily backup all local drives to BackBlaze

Once a year, copy all photos and important data to a larger encrypted USB HDD for one year of off-site storage at my parents' home. I swap the drives about once a year. This is my "I need it quick" recovery copy should the house burn down.

For encrypted storage, I use only local OS encryption. Got burned by TrueCrypt/VeraCrypt not supporting a new OS.


I'm paranoid about data loss, so I have a (IMO) pretty darn good backup regime. It's robust and easy. The main thing I worry I'm not protected against is bit flips.

My wife is the one who takes most of the photos in our family. For her computer (MacOS):

1) Back up her computer to USB hard drive via Time Machine every day or two.

2) Back up her computer to a USB hard drive in her office (via Time Machine) when she takes it in, about once per week.

3) When her drive starts getting full, move photos to a larger USB media drive attached to my computer. (Every year or so.)

For my computer (MacOS), including USB media drive:

1) Back up to a "main" and "redundant" USB backup drives continuously (every hour or so) using Time Machine.

2) Back up Mac hard drive, but not media drive, to "main" USB backup drive every night using SuperDuper.

3) Back up to the cloud continuously using BackBlaze.

4) Swap "main" USB backup drive with an identical drive stored in a safe deposit box (every 3 months).

This provides a nice combination of convenience, redundancy, and recent off-site backup.


This is what I've been doing for the past couple years:

1. Pictures are either manually copied to a staging folder on my storage server (for DSLRs) or automatically synced there via Syncthing (for smartphone cameras). For the latter, Syncthing is set up to preserve deleted files in case of accidental deletion.

2. The storage server runs zfs and takes 10-minutely snapshots of all datasets, which are replicated to 2 servers using zrepl: one sitting upstairs and another I rent from Hetzner halfway across the world. Replicated snapshots are kept for 2 years. Everything uses zfs' native encryption, but the replication targets do not have the key.

3. For the really important pictures and other documents, I create 2 additional backups: an encrypted backup to Backblaze B2 via rclone and burning them to M-DISC blu-rays. This is sort of a last ditch thing in case a zfs bug renders the primary backups unreadable.


I have rsync cron jobs that back up to an Olimex NAS [1]. Their mainline support is great and the box has been running fine for a few years already. The only problem is that it only supports 2.5" HDDs, so you are limited to 4 TB. Power consumption is less than that of my dishwasher on standby.

From there, I back up to Google Cloud Storage's Archive class via restic [2]

[1] https://olimex.wordpress.com/2020/03/13/bay-hdd-sdd-is-easy-...

[2] https://cloud.google.com/storage/docs/storage-classes


2x4TB WD reds[1] in RAID1 on an old (ca. 2008) desktop I turned into a low-spec Linux box. These days I just keep it powered off most of the time, so I don't even really pay for electricity. Unfortunately even though I'm using reds, which are designed for NAS use cases, I've had two drives fail since I set up this rig in 2015, which I suppose puts me at ~$30/year, though I use the 4TB for lots else as well. It's durable enough for me since it's easy enough to notice when a drive fails and very unlikely that the other will fail before you can replace it.

[1]: https://www.microcenter.com/product/634744/wd-4tb-red-plus-5...


We sync a shared folder with dropbox. Backup with backblaze.

Our local "copy" is basically our personal laptops. This has required manually upgrading our SSDs over the years though because we have around 1TB of photos and videos.

Our off-site copy is backblaze. We don't keep a non-cloud off-site copy.


Historically, I first used a Synology DiskStation NAS (DS620) with 6 bays and 6 x 1 TB data-center-grade HDDs to keep secondary copies of photos and self-recorded videos (I don't feel an urge to archive commercial movies, as I could just buy them again if they got lost). The Synology GUI sits on top of a Linux variant, is easy to use, and is maintained reasonably well. I don't use most of its functionality - you could run your whole intranet on it, but if that's what you want, I'd recommend buying two of them and separating concerns.

After that I decided to migrate as much of my hardware as possible to 19" racks. So next, I upgraded to a Synology NAS RS819 (4 bays, 4 x 4 TB = 16 TB raw capacity, 12 TB usable in a RAID 5 setup). This will last quite some time, as I'm not much into movies. Importantly, the NAS is normally physically _not_ connected to any network except when I transfer a new batch of photos for archiving.

Some historic material is on a DAT tape in a fireproof safe but that is close to its expected end of life, which is fine as there are HDD copies.

To view the photos, I access a subset of commonly viewed media on another 19" server, so the NAS is truly for archival use only.

What would be nice, but is not yet implemented, is a second site to add redundancy and protection from disasters (of course a backup should be off-site by definition). The two sites could synchronize over rsync. But I am still too concerned about the attack surface of putting archival machines on a network. A solution could be the following: some friends recommend Google's "cold storage" as a tertiary cloud storage, which is affordable, and they use scripts to encrypt their data; I think that is a good idea.

The question regarding mechanisms to manage things in case of one's death is an excellent point raised by the poster - I bet a lot of valuable information assets have already been lost due to people passing away when their relatives are not IT savvy to rescue them (or the deceased didn't take precautions to leave passwords and instructions behind).


I actually print out my favourite photos and put them in albums (mounted on little sticky tapes). Safe against all digital corruption, and provides a nice little hobby for evenings in front of the TV. Of course I have the digital versions on a couple of hard drives.


I make a photo book once a year with the best photos from the year. I use Amazon photos, google photos, etc but the hard copy book is the only one I’d expect to still be around in 100 years.

For an offsite backup, I should start printing 2 and storing one copy at a relative’s house…


I would send the second copy to a relative in another country. Wars and natural disasters happen, and if the relative is not far from you, most likely both copies will be gone.


Synology DS720+ w/ 2-disk RAID 1 mirror.

Nightly update to Backblaze B2 using Synology Sync.

Considering a third local copy stored at a family / friends and updated yearly.

This requires almost no work on my part... just the occasional copy over of pics from phone / camera to my network drive.


I have a raspberry pi with a 1TB USB stick, which is enough space for me atm.

I use SFTP to upload new pictures from phones etc. to the pi. There are many apps that can do SFTP, like e.g. Foldersync on Android.

I have a shell script on the pi that runs once/day and uploads new files to S3 storage, after encrypting them with gpg.

I occasionally get a new USB stick and retire the old one, keeping it as a backup copy.

The whole thing actually works quite well and means:

- Local, unencrypted files I can access easily

- Encrypted files in the cloud, that no external parties can read

- Physical backups that I can put in a safe, or whatever

It’s here: https://github.com/ragnarlonn/savethepictures
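A minimal sketch of the encrypt-then-upload step (not the actual script; the bucket name is hypothetical and the aws command is illustrative, not run here):

```shell
# Demo in a temp dir; the passphrase is inline only for demonstration.
WORK=$(mktemp -d)
echo "holiday 2021" > "$WORK/img0001.jpg"

# Symmetric encryption, so a restore needs only the passphrase, not a keyring.
gpg --batch --yes --pinentry-mode loopback --passphrase "demo-passphrase" \
    --symmetric --cipher-algo AES256 -o "$WORK/img0001.jpg.gpg" "$WORK/img0001.jpg"

# The upload would then be something like (illustrative, not run here):
#   aws s3 cp "$WORK/img0001.jpg.gpg" s3://my-photo-backup/2021/

# Round-trip check: decrypt and compare with the original.
gpg --batch --yes --pinentry-mode loopback --passphrase "demo-passphrase" \
    -o "$WORK/img0001.out" --decrypt "$WORK/img0001.jpg.gpg"
cmp -s "$WORK/img0001.jpg" "$WORK/img0001.out" && echo "round-trip ok"
```

The round-trip check is worth keeping in any real script: an encrypted backup you have never test-decrypted is not really a backup.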


9 disks in raidz3 hold my files, scrubbed twice per month; ECC RAM and CPU cache.

The backup server is galvanically disconnected and manually plugged in to synchronize, then scrubbed and shut down. That's a small window for catastrophe; the chance of both servers getting killed at the same time is so small I'd probably be dead too.

main and backup servers are in two different locations (different power substations).

I just use rsync for doing the backup.

Family is connected to main fileserver. It snapshots once per week, so if they destroy something, I can get it back, it keeps two years of snapshots.

Once every 5 years I buy a new fileserver, and the old one becomes the backup (assuming it is still healthy and hasn't experienced any problems during its service).


My friend and I built https://pixaver.com. It doesn’t address all of the points you note here, but it does offer some peace of mind if we get locked out of Google Photos.


Congrats on building such a nice tool! I'd gladly pay for something self-hosted, or some tool that allows me to download it to my own hardware.

I don't want to solve the problem by

A) paying even more money every month, much less even more than for Google Drive & Photos storage itself!

B) uploading my personal photos to a completely untrusted provider (at least I can trust Google to some degree not to leak my photos to any stranger on the internet).


Those are fair points and thoughts I’d probably have myself. I’m not trying to sell you on this, but here are my responses to your two notes.

The first is that we know it’s more expensive than Google Photos storage. That’s in part so that we can actually fund development. As a small operation we can’t run it as a loss-leader or make up revenue from other channels like Google could.

The second is that we’ve been somewhat vetted through Google’s application process. We’ve also been building things for the web for 20+ years. Admittedly, trust is hard to earn from folks you don’t know, but we are decent people.

To be honest, Pixaver probably isn’t right for most folks on HN as most here are pretty technical. (I’m just a daily reader here, and spotted this thread, so I thought I’d share it.)

We’ve geared this more for people like my mom who don’t want to set up a local hard drive or do anything even remotely technical. For someone like her, it’s a couple of clicks to having a second backup without needing to do anything additional. :-)


Do you have any plans to support backups of other Google content, e.g. gmail?


We’ve thought about it, but I think it’ll be a while before we’re able to get to that. We also run two other products, and we’re just two people. So, we’re already spread a bit thin.


Let me know if that happens! I'd love an easy way to make sure my data is safe in the event Google randomly bans me :) I'll check out Pixaver soon to see if it'd work as a backup for my Google Photos!


Nice! If you do try it, please let us know how the experience goes for you. I think the onboarding process is a bit intimidating given the (necessary, yet lengthy) permissions we request.

I’ll mention that you’d be interested in more options around backups to @shelkie. Like I said, I don’t think we can make this happen now, but it’s good to have a sense for what others might find useful.


Hi! I just tried it out and signed up for the 25gb plan. It was incredibly easy. I really appreciate that your prices are reasonable.

Just curious -- where are photos stored for Pixaver? Are you using cloud storage?


Awesome—thanks for signing up! If you run into any issues, do let us know. :-)

We’re using Wasabi for cloud storage. So far it has worked pretty well for this project.


A PC as file server and an Olimex A20 SBC Backupserver with 2 rotating ( one offsite) external encrypted sata harddiscs, rsnapshot and anacron that automatically backups all our servers, laptops and phones in the lan or known public ip.


I have about 6 TB of photos and videos that are stored on an on-site TrueNAS server. This includes my raw files, Lightroom library and edits.

TrueNAS is configured to automatically back up to Backblaze B2 (which is off-site).

After editing, all the exported JPGs are stored on Google Photos and shared with family. I like Google photos because of the content search and face detection features.

So that's 1 on-site medium and 2 off-site mediums. I used to burn the photos to archival disks and place them in the bank, but it got tedious.

Edit: I use Backblaze because it's the cheapest. Most Photo storage providers don't have support for Raw files. (If anyone has a recommendation let me know).


1. I have an Amazon Prime subscription, and we get Amazon Photos with it. Any photos taken on our phones are automatically uploaded there (I don't recall if these are original resolution or not)

2. Periodically I take photos and videos off our phones and store them on my desktop PC, imported into Lightroom. They are stored on 5400rpm HDDs in mirrored RAID config

3. I get terabytes of OneDrive space with my Microsoft package (I can't remember the name of it right now!), and it automatically uploads all the media from my desktop PC

4. I also have Seafile installed on the desktop, and it automatically uploads everything to my Seafile server, which uses Azure Blob Storage


I have a TrueNAS machine that hosts Nextcloud. All datasets are backed up to B2 with E2EE. Nextcloud has wonderful iOS integration. No other photo backup app I've seen backs up live photos and can restore to the camera roll. Monthly cost is low and no one else has the private keys.

I'm not close enough to the end to worry about posterity. If I die tomorrow then my keys are lost and all of my data is entropy. I expect to be motivated to have my data pass on in legacy but that will only happen if I have a kid who is willing to accept maintenance of familial digital archives. We're the first generation.


I print them... For Christmas I printed 600 photos for about 60 EUR; it was well worth it.


I have a 4 TB disk in my main PC with all media on it. It's mirrored to a 1-disk local NAS. I rotate the disk in the NAS about once per week, putting the other disk in a big old safe that was already in the house when we moved in.

So at any one time there are almost 3 complete copies of everything.

There is no off-site backup. The safe is probably not super theft-resistant nowadays but it should provide some protection in case of fire.

I'm nearing capacity on 4 TB and don't exactly know where to go after that. I could upgrade all 3 disks to 6 or 8 TB, but I resent discarding the three 4 TB drives that are in perfect condition. So IDK.


I am turning paranoid: my first thought about using someone else's storage is how much I should encrypt the archive so that my photo album can't end up as someone's ML training data.


My family's "memorable" photos and videos are all stored on Telegram. We lost a bit when we converted the group to a supergroup, though. My mom is in her 80s and really happy to navigate through the old ones when she likes. She can upload new media files to the group. I believe Telegram is the best tool to help her: it's super easy and fast.

For the rest (e.g., snapshots, nonsense) I have them copied between devices thanks to sync tools. I also do cold copies when I feel I need to.

I hope when I am 80 or 90, I will still be able to find my images on Telegram...


Photos sync automatically from camera to the Dropbox Camera Uploads folder. Every now and then I move them into a Dropbox/Photos folder (not synced to my local machine, as it's too big) after organising them into month folders. These same organised folders are copied onto an external drive from the local Dropbox Camera Uploads folder. That external drive is then rsync'd to a backup drive that I keep in a safe.

I still have Dropbox as a backup this way so it's a cloud backup of sorts.

Not foolproof but it's super simple and good enough for me.


At this time, I have a local copy split across a Windows and a Mac. Both are backed up to Backblaze Personal plan. When someone from my immediate family wants to view an album, they have to download it. Not ideal at all, but that's our process today.

I don't use Google Photos for privacy concerns.

Now that I use Tailscale, I could consider setting up a NAS at home and making files easily viewable using Nextcloud, but I worry I won't back up properly to Backblaze B2. Need to figure out how to schedule backups before I make the switch.


I use a NAS, raspberry pi, and s3 with the following workflow:

Photos are imported from SD card using a script. This script creates a new directory, captures some metadata, copies the photos in and creates thumbnails. This directory is rsynced to the NAS then encrypted, compressed and sent to s3. Nightly, from another location a raspberry pi with a large encrypted disk rsyncs the entire NAS.

All viewing and editing of photos is done against the NAS, and any changes are picked up nightly by the pis. The s3 copy acts as an immutable original.
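A rough sketch of what such an import script might do (paths and the S3 step are hypothetical; temp dirs stand in for the card and NAS):

```shell
# Stand-in directories so the demo is self-contained:
CARD=$(mktemp -d)   # the mounted SD card
NAS=$(mktemp -d)    # the NAS staging area
echo "raw sensor data" > "$CARD/DSC0001.ARW"

# 1. New directory per import, named by date.
BATCH="$NAS/$(date +%Y-%m-%d)-import"
mkdir -p "$BATCH"

# 2. Copy photos in, preserving timestamps.
cp -p "$CARD"/* "$BATCH/"

# 3. Checksum manifest, so the NAS/S3 copies can be verified later.
( cd "$BATCH" && sha256sum ./* > MANIFEST.sha256 && sha256sum -c MANIFEST.sha256 )

# 4. Thumbnails and the encrypt/compress/S3 steps would follow, e.g. (not run):
#   tar czf - "$BATCH" | gpg -e -r me@example.com | \
#     aws s3 cp - "s3://bucket/$(basename "$BATCH").tar.gz.gpg"
```

The manifest is the piece that turns "I copied it" into "I can prove the copy is intact," which matters if the S3 object is meant to act as an immutable original.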


NAS + USB HDD docking station plus couple HDDs in rotation.

While the NAS has a mirror in it, I do not consider that enough of a backup. I assume it will not save me from ransomware, from a family member stupidly deleting a lot of data, from fire, or from theft. It is mostly there to give my family convenient access.

The HDDs are in rotation and one is always off-site with family that I visit regularly. Whenever I travel there I make a backup on one of the HDDs and bring the "old" drive back home to be reused on a future trip.


Upload them to Google, Flickr and iCloud. Also have two local copies.


I have like 10 photos that are special to me physically printed out.

People are such hoarders when it comes to family photos and videos. You are never going to look at 99% of those photos and videos.


It is true that when my wife and I first had kids, we took a lot of photos and video. Too much. I sometimes think we missed out on a lot of events in all the event-saving.

But when my folks passed away, my sibs and I very much enjoyed the old photos and movies, with people that had not been around for decades sometimes.


Wait until you have kids...


How do you store 750 GB of photo and video?

We do look at them every so often, and it’s great.

It’s also disturbing to see corruption in some of the first videos we took. I don’t know the source but some videos have weird artefacts now.


NAS appliance with RAID 5 over 4x large disks.

From what I've read, a ZFS filesystem is very good at detecting and correcting file corruption. Next upgrade I'll be looking at a new NAS appliance that has native ZFS support.

I'm currently storing around 400GB.


Currently I have an old NAS with two 1 TB drives in RAID 1. At some point my drives will be full and I will migrate to a new system that will be slightly cheaper but requires a partner.

At some point I will buy a simple Raspberry Pi (or cheaper system) with one multi-TB drive over USB. A friend of mine will do the same. A cron job will sync the drives at night. 50% of the drive will be mine, 50% will be his. So the system has redundant storage and it is off-site.


We keep all of our pictures on an external USB hard drive. At least once a month or after a large addition of pictures I back it up to my NAS ~30TB in the basement using rsync. That backup gets sent a few hundred miles away to a server (really, just an ancient, headless desktop running Debian tucked under a family member's desk!) with a large HD (12TB) via Syncthing. Syncthing is set to be one-way, pushing changes from my backup to the remote server only.


I use PhotoStructure for local management on external hard drives, BackBlaze for offsite backup, and a combination of Flickr and Google Photos for sharing the highlights.


Keep doing what you are doing. Also, from now on, at the end of each year, look through all your photos for that year, choose about 100, print them, and delete the rest.


I also don't understand the necessity of having all these images. "Remember that? I'd tell you about that day, but as it happens we actually have 1214 photos and 2 hours of footage from that event!"

Or start shooting film, also gives the photos some character.

relevant xkcd https://xkcd.com/1832/


This.

We hoard so much and trap ourselves in nostalgia.


Photos and videos are thrown onto a multi-drive RAID 6 NAS with terabytes of capacity as the main central storage. RAID 6 offers redundancy against disk failure and easy plug-and-play replacement. An external USB backup drive attached to the NAS does incremental backups daily. Periodically the data is snapshotted to other USB backup drives and kept off-site.

The NAS server has apps that offer backing up to various cloud services, but I choose not to use them.


Every time our phones reach capacity, I download all pics/video to my laptop. Then I plug in a portable SSD and copy them there into a folder named "media until <current date>". When I next get to work, I copy them to an identical portable SSD stored there, at which point I free the space on my laptop. This whole process happens about 3-4 times per year.

I'm currently using returned/refurbished Samsung T5's at about $100/TB.
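The workflow above is simple enough to sketch in a few lines (temp dirs stand in for the laptop and SSD; the folder name convention is the point):

```shell
# Stand-in directories so the demo is self-contained:
LAPTOP=$(mktemp -d)   # the downloaded phone media
SSD=$(mktemp -d)      # the portable SSD
echo "video clip" > "$LAPTOP/VID_0001.mp4"

DEST="$SSD/media until $(date +%Y-%m-%d)"
mkdir -p "$DEST"
cp -p "$LAPTOP"/* "$DEST/"

# Verify before freeing the laptop copy (and only delete once the second
# SSD at work has its copy too).
cmp -s "$LAPTOP/VID_0001.mp4" "$DEST/VID_0001.mp4" && echo "copy verified"
```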


I do this, alongside a tool for finding duplicate files: https://github.com/adrianlopezroche/fdupes.


I use Czkawka recently for dupes (it has GUI and well-maintained): https://github.com/qarmin/czkawka
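For anyone without these tools handy, the core idea - grouping files by content hash - can be approximated in a few lines of shell (fdupes/Czkawka do it far more carefully, with size prefilters and byte-wise comparison):

```shell
# Demo in a temp dir with two identical files and one unique one.
D=$(mktemp -d)
echo "same bytes" > "$D/beach.jpg"
echo "same bytes" > "$D/beach (copy).jpg"
echo "different"  > "$D/sunset.jpg"

# Hash every file, sort so identical hashes are adjacent, then print only
# the repeated groups (GNU uniq: compare the first 32 chars, i.e. the md5).
md5sum "$D"/*.jpg | sort | uniq -w32 --all-repeated=separate
```

Note this finds byte-identical duplicates only; scaled or cropped copies (as asked about below) need perceptual hashing, which is a different problem.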


Thank you. I'm also looking for a tool which will find when one file contains a scaled image in another file, or is a crop of another.


I have my photos on my personal computer and some of them on an external HDD. I manage and edit them with Lightroom. I back everything up from time to time onto two external HDDs with rustic. I export albums to Google Photos, and personal stuff (mine and my gf's) to OneDrive. But they're only exports - lower-resolution, after-edit JPGs.

[edit] The only time I lost an image was because I waited too long to copy it from my camera to the computer :)


One normal 16TB drive + two external 14TB backup drives which are being used in alternating order.

I would consider it the minimal defense against certain technical malfunctions. I should add another offline copy at some relatives, to protect against the 'house burns down' scenario.

Using an online service could help against the time between the backup cycles, but that is the least of my concerns and online services have their own set of problems.


I've had a Synology NAS for almost 9 years now. It has two 4 TB disks in RAID (SHR). I also have an external 4 TB HDD with a copy of everything on the NAS. The system is extremely reliable. I don't know if I'm lucky, but the first HDD took almost 7 years to break. I simply replaced it and the system is still working 24x7x365.

As I have the MS Office subscription, the cellphone photos are also backed up to OneDrive.


If you have terabytes of photos, be honest: no one is ever going to look at them. Pare them down to the good ones, get high-quality prints, and make a photo album. There's really no way to guarantee electronic devices will last without labor and cost. Although the idea of passing digital assets down as heirlooms has never existed before, so maybe there's a PaaS opportunity here that goes beyond Google Photos.


My wife and I are using OneDrive to send everything to the cloud (basic plan).

Once we're close to filling up available storage, I turn on my local server which downloads it into its RAID disks (then I turn it off afterwards). I wipe my & my wife's onedrives and we repeat the process.

Once a year, usually around Christmas, we're sitting together and choosing photos that we want to print and put into family album.


I have about 1 TB of family photos and videos.

Copy 1: Desktop Computer

Copy 2: Backup 8 TB USB Hard Disk, always online

Copy 3: Backup 1 TB USB Hard Disk #1, always unplugged, rotated with #2

Copy 4: Backup 1 TB USB Hard Disk #2, always unplugged, rotated with #1

Copy 5: Microsoft Onedrive (Pay for 1.2 TB Storage)

Copy 6: Amazon Photos (unlimited photo storage for free with Prime)

Main purpose of the unplugged offline storage is in case of ransomware attack which could conceivably wipe out my online and cloud backups.


My wife and I have the Dropbox app installed on our phones, which automatically uploads all photos to that service (Remote Copy 1).

I have the Dropbox app installed on a desktop computer which is always running and syncing (Local Copy 1).

Once a week I hook an external drive up to the desktop and take a backup via windows-built in backup utility (Local Copy 2).

I have Backblaze installed on the desktop continually backing it up (Remote Copy 2).


Nextcloud on ZFS + rsync.net with rclone and crypto.


We create physical photo books, sometimes two copies. These are a (small, curated) subset of all the digital photos we have.

We keep the digital photos backed up on an external SSD and on iCloud (auto-synced from several devices), too.

Bottom line: we stopped trying to obsessively 'preserve' digital hoards. It's such a waste of our time, and is a questionable goal anyway.


I have more than 4 TB lifetime pCloud storage. Best option for storing photos and videos. Better than everything else out there.


Same. I bought their lifetime 4TB + encryption addon during Black Friday. It's enough for all of my personal data, music and some movies / TV series too. And I never have to pay again, which is nice.


We all have iPhones, so we use an app to automatically upload our pictures to the central NAS. On the central NAS a 24/7 sync runs to an external USB drive, plus other backup schemes to two other NAS systems. Once in a while I copy (and encrypt) everything to another external USB drive. That USB drive then goes to another location.


Which app, if you don’t mind sharing?


Something like PhotoSync [1] does it, and I used it back in the Flickr days when it was much easier. Unfortunately, in my recent tries, it keeps crashing on large libraries.

1. https://www.photosync-app.com


I agree. Qfile also has its limits. My wife has a gazillion pictures, so it regularly locks up.


QFile for Qnap NAS.

Synology has DS file or the newer Synology Photos for that.

For regular iDevice-to-NAS transfers I recommend File Explorer Pro. It isn't free, but gets nice updates. I haven't found automated uploads/sync in it, though.


I (well, the whole family) use iCloud, with Google One & Photos as a backup. I stopped paying for Amazon Photos, given that it syncs the photos just fine anyway and I have the videos in the other two.

Every 3 or 4 years I get paranoid and make a backup onto a random hard drive that then takes me days to find, which for a backup is not ideal.


- PhotoSync on all family devices uploads pictures to local ubuntu server

- Local server processes items and provides UI for family members

- Pics and videos get backed up to Wasabi (S3-compatible) storage using restic (incremental backups, no need to upload hundreds of GBs every night)

- Once in a while I back everything up to external HDD and store it in a drawer at work


I have ~50 GB, so this works well for me:

- Local copy on desktop

- Google Photos (with upgraded storage) for "everyday" use (sharing albums with family, viewing on phone, etc).

- The "proper" backup is done with Kopia to B2 and Wasabi. I don't have a large number of photos, so B2 is really cheap and it's just a redundancy for Google Photos and Wasabi.


Independently of any technical solution, learning to not hold onto past personal memories can be beneficial.


Buy a cheap NAS box for each family. Back up pictures to your NAS, then use remote sync to copy pictures to the other NAS. Now everyone has the pictures locally and they're backed up remotely. Sync the pics you like to Google for convenient access. A QNAP TS-130 is $140. 4 TB is about $100.


> is there any way to guarantee these heirlooms remain intact and available

Are they actually heirlooms? If the goal is to keep these safe so generations can continue to not look at them, what's the point? If you haven't looked at something in years, odds are nobody wants to look at them.


The actual object is less important than the exercise and the discussion. Substitute "medical records", "civil certificates", or "passwords + 2FA codes for all DeFi wallets" for "photo albums" if you want.


The things you list need to be available for (at the longest) a month or two following my untimely demise.

The OP is talking about generational foolproof storage for photos which is both totally different and a slightly more convenient version of saddling your heirs with a closet full of photo albums and slide carousels.


As someone who was in Brazil and had to dig into records from my Portuguese great-grandparents when I was collecting documents to claim my citizenship, I assure your documents need to be available for a lot longer than you expect.


Being an Apple household, we have two Time Machine drives, one in a different building in case of fire etc.


I store photos and video on a Synology NAS in my home and vault to Glacier once a week.

However I recently realized that if I wanted to get the images back out of Glacier it would be ridiculously expensive. I’m looking for another option, hopefully something that is cheap and reliable.


Is it expensive because of the huge number of API calls? What if you archive files and store them in 100 GB or 1 GB bundles - is that going to be cheaper?
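For a rough sense of the tradeoff, here is a back-of-envelope calculation with purely illustrative prices (these are assumptions, not current AWS pricing - check the pricing page): $0.01/GB retrieval, $0.09/GB internet egress, $0.05 per 1,000 retrieval requests, restoring 1 TB stored either as 1M individual photo objects or as ~11 archives of ~100 GB each.

```shell
# Back-of-envelope Glacier restore cost; all prices are ILLUSTRATIVE.
awk 'BEGIN {
  gb      = 1024
  per_gb  = 0.01 + 0.09                       # retrieval + egress, per GB either way
  small   = gb*per_gb + (1000000/1000)*0.05   # 1M small objects -> 1M requests
  large   = gb*per_gb + (11/1000)*0.05        # ~11 archives of ~100 GB
  printf "1M small objects: $%.2f\n", small
  printf "11 large archives: $%.2f\n", large
}'
```

Under these assumed prices the per-GB retrieval/egress charge dominates, so bundling into large archives mainly trims the per-request overhead; it can't make the per-GB transfer cost go away.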


Lots of good answers here already, but wanted to chime in. I recently bought a Blu-ray burner with M-disc support. Plan to buy or put together a NAS of some sort eventually, but also have been working through making backups of my most critical data onto M-disc Blu-rays.


Self hosted Nextcloud


I simply burn the photos in 100GB M-disc blurays. Then I bury them next to a tree in a zip lock bag.


I hope you bury a reader unit as well :) I bought two units of different makes and do small tests like once a year after computer/os updates, just in case...


There are plenty of Blu-ray drives still being manufactured, including in modern consoles. I hope I can find a working one in 20-30 years; my PlayStation 1 is still going strong :D


Alongside the usual 3-2-1 backup cycle I also keep a downscaled (around 5 megapixels) full backup on my phone's SD card. The originals total several terabytes. Should everything else fail, at least some of it will survive. It's also nice for quickly browsing the collection.


My current approach is to store all photos and videos from my and my wife's phones in Microsoft OneDrive and Apple iCloud. The photos are then synced from OneDrive to my desktop PC as well.

Everything is automatic, grouped into folders in yyyy-mm format.
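That yyyy-mm grouping is easy to reproduce for any existing pile of files; a minimal sketch using file modification times (GNU date/touch, demonstrated in a temp dir):

```shell
# Group files into yyyy-mm folders by modification time.
D=$(mktemp -d)
touch -d "2021-06-15" "$D/IMG_0001.jpg"
touch -d "2021-07-02" "$D/IMG_0002.jpg"

for f in "$D"/*.jpg; do
  m=$(date -r "$f" +%Y-%m)        # folder name from the file's mtime, e.g. 2021-06
  mkdir -p "$D/$m" && mv "$f" "$D/$m/"
done
ls "$D"   # 2021-06  2021-07
```

Note that mtime can lie after copies between filesystems; the EXIF capture date is more reliable when a tool like exiftool is available.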


Jottacloud + a systemd timer running rclone. It's been going for a couple of years now, and is delightfully low-code and low-noise. Cheap, too, with unlimited storage for €7.99/month. Currently storing ~3 TB of raw footage.
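For anyone curious, a systemd timer for this kind of nightly rclone sync might look roughly like the following (the remote name and paths are hypothetical):

```ini
# /etc/systemd/system/photo-backup.service
[Unit]
Description=Sync photos to cloud storage with rclone

[Service]
Type=oneshot
ExecStart=/usr/bin/rclone sync /data/photos remote:photos

# /etc/systemd/system/photo-backup.timer
[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now photo-backup.timer`; `Persistent=true` catches up on runs missed while the machine was off.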


I use permanent.org. It also has a gallery feature: https://www.permanent.org/p/archive/03pw-0000


For posterity, there is forever.com; they guarantee 100 years of storage for a one-time payment. It's about $150 for 12 GB, so you must choose which pictures to store.

The sad part is that nobody will ever look at all those terabytes ...


I have about a year's worth synced across my devices via Syncthing, and everything older than that I move to external drive(s) and AWS Glacier. I have a little script that does it, and I only have to run it once or twice a year.
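The age-based offload part of such a script can be sketched like so (temp dirs stand in for the real folders; the aws command is illustrative and not run here):

```shell
# Move anything older than ~1 year to the "cold" drive.
LIVE=$(mktemp -d)   # stand-in for the syncthing-synced folder
COLD=$(mktemp -d)   # stand-in for the external drive
touch -d "2 years ago" "$LIVE/old_trip.jpg"
touch "$LIVE/new_pic.jpg"

find "$LIVE" -maxdepth 1 -type f -mtime +365 -exec mv {} "$COLD/" \;

# The Glacier half might then be (illustrative, not run here):
#   aws s3 cp "$COLD" s3://my-archive/ --recursive --storage-class GLACIER
ls "$COLD"   # old_trip.jpg
```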


Anyone got any good fireproof safe recommendations that will save a backup external HD from melting? Ideally one that also keeps water/humidity out and blocks electrical signals from frying it. Any other threat factors am I missing?


Backblaze used to have a pricing model where they limit how much you can upload per month but not directly how much you can store in total.

Could be worth it if you have more than 1 TB. Or even more than 200 GB at Google.


I have tons of 2.5" HDDs from laptops; is there a way to make them work in some decentish RAID 1 setup? I looked around, and Synology has a dedicated box, but it has 6 bays and is way out of my budget.


My wedding and engagement photos are backed up in a basic S3 bucket. I plan to do the same with any important family photos.

It costs next to nothing. The redundancy levels of S3 are way better than anything I can make.


Take photos (on phone) ... automatically sync (using Resilio Sync) to a NAS I have at home.

Periodically file/classify photos on NAS. Periodically make physical backup NAS to a remote site.

This approach is low cost and convenient.


Does the NAS fix bitflips?


Also interested in this, but hijacking to ask what folks' scanning solutions are. My dad has tons of slides and prints. He's hesitant to send them off, and we have analysis paralysis on scanners etc.


I would suggest you do a test by scanning a few of the newer prints and some of the older prints on any available scanner. See how much dust is on the existing pictures.

When looking at a print physically, a little dust is not distracting. That may change when you digitize it. So you might want to test what condition they are in by testing some of the older and some of the newer ones - maybe some in transparent plastic sleeves, some scattered about and more dusty.

Don't have much advice on what to do if they are dusty; just be aware that even a little dust can look out of place depending on what is desired.


I did the analysis as well, calculated everything, and ended up paying 500 bucks for an external service. Everything else would have cost roughly the same (even buying a good used slide scanner and selling it afterwards), plus the countless hours for loading and unloading the scanner, retouching, etc. Just use an external service. If he doesn't trust "some online service", I found a lot of small, family-owned businesses in drivable range to go to. They're usually more expensive, but you know where to knock if something goes wrong.


My dad was going to do this himself but it is a massive job. He ultimately sent them to a company and received them back on DVD.

My family even has reel-to-reel from the '50s or '60s. The problem, though, is that everyone in them has passed on. I am never going to watch video of my great uncle fishing in 1960; he is basically a stranger to the next generation.

All this is quite a bit of work for things that will most likely never get watched.


Try DSLR scanning.


Howdy! I'm an early digital photography adopter (1998), so I started feeling this pain a while back.

After the fourth photo service I had migrated to went out of business (and Picasa was then cancelled), I realized I needed something that wouldn't go away on me, and started working on PhotoStructure, a self-hosted photo and video "digital asset manager" that does automatic organization and deduplication to sweep everything into a single, neat, timestamped pile, and, critically, _keeps my original files intact_.

The point isn't to use PhotoStructure, specifically, but to use an application that you could walk away from and not be heartbroken, because that app used a standard filesystem hierarchy, and used standards to store any metadata changes.

My beta users asked me "how do I keep my stuff safe?" so many times, I ended up doing a bunch of research and wrote up this article, which discusses file integrity, why "3-2-1 backups" isn't really what you want, and how to go from there:

https://photostructure.com/faq/how-do-i-safely-store-files

Know that Google Photos is a great _secondary_ backup, but even in "original mode", what you upload isn't always the same bytes as what you download: their API strips off GPS metadata, and I've seen captured-at times and exposure information changed after the round-trip through GP. (iPhone uploads in "original" mode may survive the round-trip, but the point is that it's not something they guarantee.)

I'd recommend using one of several mobile apps that will backup your original bytes directly to your computer at home. There are three I've tested and listed here: https://photostructure.com/faq/how-do-i-safely-store-files/#...

I'd also recommend using a filesystem on your home server that can detect media errors. I discussed that, and what cloud storage backups I recommend in the above article, as well.
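If you can't run a checksumming filesystem, a rough stand-in is a checksum manifest you rebuild once and re-verify periodically to catch silent corruption. A sketch of the general idea (this is not how PhotoStructure does it, just a generic pattern):

```python
import hashlib
import os

def sha256_of(path, bufsize=1 << 20):
    """Hash a file in chunks so large videos don't load into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(bufsize):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(library):
    """Record a checksum for every file under the library root."""
    return {
        os.path.relpath(os.path.join(root, n), library):
            sha256_of(os.path.join(root, n))
        for root, _dirs, names in os.walk(library)
        for n in names
    }

def verify(library, manifest):
    """Return the files whose bytes no longer match the manifest."""
    return [rel for rel, digest in manifest.items()
            if sha256_of(os.path.join(library, rel)) != digest]
```

Store the manifest with the backups themselves; a file that fails `verify` can then be restored from another copy before the rot spreads to all of them.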

-=-=-=-=

OK, so, now that you've got your stuff safe, let me tell you a story.

My Mom and Dad passed away almost a decade ago, and left several boxes filled with albums, shoeboxes and loose photos.

Almost none of the photos had dates on them. The albums had no writing in them, and most of the photos were of locations or people that I didn't recognize.

It took a while, browsing through these images, to realize that these images had relevance to only my parents: and maybe only the parent taking the photo.

Without additional context, these boxes of memories were almost entirely irrelevant to the next generation.

It was a punishing realization for me. I know these boxes were important and relevant to my parents, but all but a handful of photos had relevance to me.

Digital photography at least has a modicum of metadata automatically.

But you should still consider your "heirloom" of hundreds of thousands of photos and videos to be _irrelevant by default_ to the next generation.

So, how can you add relevance to your corpus of imagery?

I think that can be helped to some extent with

1. software (PhotoStructure shows random "samples" of years to deal with browsing through gigantic libraries, and displays "streams" of related photos using common metadata attributes to browse across hierarchical trees)

2. rating and pruning (so people can browse only the "best"), but

3. I think the real answer to avoiding irrelevance is for you to tell the story behind the image, to give it context and relevance. It doesn't have to be a novel; even a couple of words can inoculate the album against irrelevance.

Good luck!


We print the most important photos and put them in photo albums. Properly printed photos on good photo paper and stored in a photo album should last a long time, hopefully measured in centuries.


I use both Google Photos and Amazon photos. Both, because I'm afraid of being one of those cases where Google's automation decides to suspend my entire account without recourse.


First, delete all the low-quality shots and scale down low-quality vids with ffmpeg; that reduces storage requirements. Then I copy to 100GB Blu-ray discs. Once a year I rotate a set to a relative's house.
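The ffmpeg step can be as simple as building one command per file. A sketch of what such a command might look like (the 720p height and CRF 28 are arbitrary example choices, not the commenter's settings):

```python
def downscale_cmd(src, dst, height=720, crf=28):
    """Build an ffmpeg command that scales a video to `height` pixels
    tall (width auto, kept even) with stronger x264 compression.
    Execute it with subprocess.run(cmd, check=True)."""
    return [
        "ffmpeg", "-i", src,
        "-vf", f"scale=-2:{height}",   # -2 = auto width, divisible by 2
        "-c:v", "libx264", "-crf", str(crf),
        "-c:a", "copy",                # leave the audio stream alone
        dst,
    ]
```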


We make physical photo albums and keep a few copies of our favorites, and send some to family.

Digitally I keep a few different backups but they are raw and would take a while to sort through.


OneDrive + 2 offline copies in two separate disks. Organization is in folders. Intend to keep one disk at a different location. But right now everything is at one place.


I gave up during two different periods when I tried to do anything involving a lot of data with OneDrive.

Inevitably the app would fail to sync or would forever get stuck syncing and become useless.

What number of files and total volume stored do you have, if I may ask?

In my case it was 50k files and 500ish GB.


I am definitely beyond 50k files because it also has code. Volume would be around 300 GB. Haven't faced any issues so far.


I have a local disk for fast backups. A NAS for backups with change tracking once a month. And once a year a disk goes off with a family member to a remote location.


It doesn't completely fit your use-case but smugmug is pretty cheap for photos and lots of storage. You also get smartphone sync for free.


Old HP Microserver running Ubuntu with a Mirrored ZFS pool and periodic backups to S3 via Duplicacy. Pretty much bomb proof.


This may sound sarcastic, but I'm serious:

- I hardly ever take pictures or shoot videos. Not owning a smartphone helps.

- I don't care at all about old family pictures, so if family members want to do something "for posterity", I hope they don't give me a giant pile of media and expect me to keep it.

Also, I feel like everyone's always creating content, not deleting it. Won't the result be an unmanageable pile of cat videos where you can't see the forest for the trees?


I use Amazon cloud and photos are free with Prime. If I get close to 1TB in cloud storage with videos, I back up to local FIFO.


It may be a good product idea to build a box PC with redundant hard drives and ZFS set up in paranoia mode, scrubbing every so often to refresh the data it guards, and configured with safe defaults such that it works as a vault (no provisions to run Docker, etc.). Want to access photos? Hook it up with an ethernet cable and access it like that. Once done, leave it in the closet with the power supply connected. I wonder how many people would buy such a box.
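The health-check side of such a box could be a tiny script around `zpool status -x`. A sketch, where the pool name `tank` and the exact wording of zpool's healthy-pool output are assumptions:

```python
import subprocess

def parse_zpool_status(output):
    """True when `zpool status -x` reports the pool(s) healthy."""
    return "is healthy" in output or "all pools are healthy" in output

def vault_ok(pool="tank"):
    """Run the actual check on the box (requires ZFS installed)."""
    out = subprocess.run(["zpool", "status", "-x", pool],
                         capture_output=True, text=True).stdout
    return parse_zpool_status(out)

# The "paranoia mode" part: schedule a scrub so cold data gets
# re-read and repaired from the mirror, e.g. monthly via cron:
# 0 2 1 * * /sbin/zpool scrub tank
```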


Don’t overlook the probability of a fire or burglary or flood causing the loss of everything in the closet. RAID and file system redundancy won’t help in that case. You need multi-site backup anyway, and then file system failures are also covered.


True. Perhaps it should be able to back itself up to something like AWS Glacier or Backblaze B2 such that the data can be retrieved by the owners if they have the encryption key somewhere.


I use google drive + Sync to an s3 bucket in cold storage.

The most important ones (e.g wedding pictures) are also stored on a local disk.


I try to keep it as simple as possible. I back files up to a hard drive. No exotic filesystems either, usually just FAT.


I use Google Photos too, as simple as that.


For now using google one with family sharing. Thinking of buying a synology NAS to store everything locally.


Syncthing to sync from my phone to my NAS, and then a nightly cron job to back it up with Borg to borgbase.com.
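That nightly job can be as small as a date-stamped `borg create`. A sketch of what it might look like (the repo URL and compression choice are placeholders, not the commenter's setup):

```python
import datetime
import subprocess

def borg_create_cmd(repo, paths, prefix="photos"):
    """Build a `borg create` invocation for a date-stamped archive,
    e.g. photos-2022-01-18. Run with subprocess.run(cmd, check=True)."""
    stamp = datetime.date.today().isoformat()
    return ["borg", "create", "--stats", "--compression", "zstd",
            f"{repo}::{prefix}-{stamp}", *paths]

# corresponding crontab entry (repo URL is hypothetical):
# 0 3 * * * borg create ssh://user@repo.borgbase.com/./repo::photos-$(date +\%F) /srv/photos
```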


Mac with a 2TB internal drive backed up to an external on Time Machine, whole thing synced to iCloud.


I wanted mine backed up outside of Apple and not in weird format (Time Machine bundle). It’s much harder to do than I’d expected.


We copy all our pics locally to a NAS and clone them to a drive, which we drop off at my parents' twice a year; when we do the dropoff, we give them the latest backup drive and take back their (less recent) drive.

So it requires a home NAS, two backup drives, and an offsite person.
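The clone step before a drive handoff can be a dumb mirror-plus-verify. A sketch in Python (rsync would do the same job; the shallow top-level check is a simplification):

```python
import filecmp
import shutil

def clone_and_verify(nas_photos, backup_drive):
    """Mirror the NAS photo folder onto the rotation drive, then
    check nothing differs before handing the drive off."""
    shutil.copytree(nas_photos, backup_drive, dirs_exist_ok=True)
    # shallow comparison (size + mtime); recurse via cmp.subdirs
    # for a deeper check on big trees
    cmp = filecmp.dircmp(nas_photos, backup_drive)
    problems = cmp.left_only + cmp.diff_files + cmp.funny_files
    if problems:
        raise RuntimeError(f"backup incomplete: {problems}")
```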


All photos make their way to a home server -> duplicati -> S3 and backblaze offsite backups


iCloud + Google Photos. I've just resigned myself to spending to keep these memories safe.


Same here. Got 2TB on iCloud, and 2TB on Google. Main is iCloud, automatically synchronised. Backup is Google, kept synchronised as well. Got wife and children as authorised users on the first, and have managed credentials shared with them on the second.

Nothing lasts forever.


Synology NAS, 2x6TB HDDs in RAID1. Plenty of space; can customize to work with any use case.


NAS (including RAID config) with external USB drives, at least one drive kept offsite.


As a long-term user of Synology NAS, I already have one central solution to store all photos and videos, a Diskstation 918+ with 10TB storage. The NAS has a "photos" folder, all important family members have accounts and can store all photos in that folder. It's organized by year and one folder for each event.

When I first set up that photo solution I feared that no one would like to use it, but everyone liked the idea to have one central storage for all photos where everyone can also see and download the photos of others.

To make sure this is not the only place where everything is stored, I have another, older NAS sitting at my mom's house that is used as a remote backup solution. If ever comes the situation that both my and my mom's NAS are destroyed, photos probably don't matter anymore, so that's totally fine for me.

Of course, two NASes with that much storage are quite expensive, but I guess it's cheaper in the long term than paying for multi-TB cloud storage every year.

TLDR: two NAS at different locations, one has a shared folder where all family photos are stored, one is backup only.


Use Synology with its built-in AWS Glacier backup.

Use the Synology Photos app on iOS to automatically back up media from my phone.


Amazon Photos for stuff from real cameras. iCloud for stuff on my phone.


Managed Nextcloud at Hetzner for 6 EUR/month with 500GB.


Redundancy: Google Photos for integrations/ease of use, Amazon Photos since it's free with Prime (unlimited photo storage), HD backup, occasional Blu-ray backup, and photo books of the best photos. All of it.


Backblaze on a media server with several hdd @ $6/mo


One generation later, your heirlooms are meaningless.


If I care, I print.


I use SpiderOak to back up all my important files.


Regular distributions of printed photos by mail.


OneDrive and sync software on Android

Same on Windows

Manual copy sync to OneDrive on Linux

Boom done


Photos on Mac iCloud and borg to rsync.net


print the ones you care about the most


I use both Google photos and iCloud


Thanks for this thread.


Google Photos.


As someone who has created a service (www.gatherthefamily.com) to do this, I can tell you there are no great solutions.

As a number of people have pointed out there are several problems with cloud based solutions:

1. Will the solution continue to be available?

2. Will the company properly take care of your data and protect your privacy over its own self-interest?

3. There is an ongoing recurring cost.

With home based solutions you also have a large number of issues:

1. A lot of work that requires technical knowledge (making backups, regular testing, etc...)

2. You have to keep copies in multiple locations and maintain your own disaster recovery.

3. Obsolescence of your hardware, backup solution is a real problem.

4. This takes real time that you would probably prefer to spend doing something else, which means it will probably get neglected at some point, leaving you highly susceptible to disaster from hardware failure or much worse.

A number of people have suggested that the old way is better (keep paper copies of the photographs). This is also a bad solution:

1. When converting modern digital photos to print you will lose data that is available in the digital photo.

2. Printed photos will degrade over time (Whether in a book or printed individually)

3. These are highly susceptible to destruction via disaster

4. There is no way to search your data

As far as the ongoing cost of storing in a cloud-based solution: by the time you figure the cost of any of the other options done right, the cloud-based solution is probably cheaper, depending on how you value your time in each of the other options.

As a number of people have pointed out in the comments as well, the photographs are only valuable if you have useful metadata for them. Modern photos are much better off than before, as you get a timestamp and often a GPS location with the photo. Ideally you want at least the date, location, event (if applicable), people in the photo, and a description/story. That will take a photo from being useless to anyone besides the person who took it to being highly valuable. The more information the better. Collecting that metadata is much harder with old analog photos that have been digitized, as they often have none of it handy.

Storing, searching and using the metadata is solvable with good software, and I tried to solve many of those issues with mine. The issue that is not easily solvable is where/how to store the data. My software allows storing on our service, or you can import from Google Photos or Dropbox, but as noted those all have issues. If anyone has ideas on how to solve those issues and actually come up with a solution that is good, I would love to hear it.
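One low-tech way to keep that kind of metadata tied to the photos, independent of any service, is a sidecar file per image. A sketch (the field names are made up for illustration, not any standard or this service's format):

```python
import json

def write_sidecar(photo_path, date=None, location=None, event=None,
                  people=None, story=None):
    """Write photo.jpg.json next to the image, holding the human
    context (who, where, why) that makes it meaningful later."""
    meta = {
        "date": date,          # e.g. "1987-07-04"
        "location": location,  # free text or lat/lon
        "event": event,
        "people": people or [],
        "story": story,
    }
    sidecar = photo_path + ".json"
    with open(sidecar, "w") as f:
        json.dump(meta, f, indent=2)
    return sidecar
```

Because the sidecars are plain text living beside the originals, they survive any migration that copies the folder tree, which is exactly the failure mode cloud services have.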


iCloud


iCloud Drive

NAS that backs up to Backblaze


I recently spent a lot of time considering how to do my own storage and backups for a collection of mkv files I have. Your solution depends on how frequently you need to access the data, how frequently you need to modify the data and how frequently you need to add data to the archive.

For family photos, you might view them once every couple years, just to browse randomly. Every once in a while you might access the data with the intention of finding a specific photo. The data is of course never modified and only expanded once every five years or so.

So, low bandwidth, infrequent access and expansion and no modification. The winner by a mile is Blu-ray.

An HDD will demagnetize over a fairly short period of time even if it's stored in ideal conditions: less than a decade. You need to plug it in and let it refresh those tracks. If you keep it plugged in all the time, then something else will die. Any way you slice it, there will be maintenance.

The data on a Blu-ray just sits there. It will outlast you.

Some Blu-ray discs can have problems, because of improper materials and poor manufacturing. If the layers of plastic that sandwich the recording medium become delaminated or damaged in some way, that leads to oxidation of the medium and loss of data. A properly manufactured disc, like a Sony archival disc, will never do that. They also make discs with recording layers that don't oxidize, so those can hypothetically last for thousands of years. Despite what people might say, a proper Blu-ray disc is a great way to store data.

So here's the deal: get rid of duplicate pictures and pictures you don't really care for. Compress as tightly as you can. Burn a copy of those files to a set of discs. Then burn another set from another brand, and another, however many times makes sense for your budget and the value of the data. Send each set to a different family member, starting with the most remote. When it's time to update the archive, send out the latest discs.
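Splitting the cleaned-up collection into disc-sized sets is easy to script. A sketch using greedy packing (the 100GB capacity matches BD-XL media; leave headroom in practice for filesystem overhead):

```python
import os

BD_CAPACITY = 100_000_000_000  # ~100 GB BD-XL, in bytes

def plan_disc_sets(paths, capacity=BD_CAPACITY):
    """Greedily pack files into disc-sized sets, largest first.
    Returns a list of path lists, one per disc to burn."""
    discs = []  # each entry: [free_bytes, [paths]]
    for path in sorted(paths, key=os.path.getsize, reverse=True):
        size = os.path.getsize(path)
        for disc in discs:
            if disc[0] >= size:        # fits on an existing disc
                disc[0] -= size
                disc[1].append(path)
                break
        else:                          # start a new disc
            discs.append([capacity - size, [path]])
    return [contents for _free, contents in discs]
```

Each returned set can then be handed to your burning tool of choice, and the same plan reused for every duplicate set of discs so the brands stay interchangeable.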



