Hacker News

SeaweedFS is great as well.

https://github.com/seaweedfs/seaweedfs



Tried this for my own homelab; either I misconfigured it or it consumes 2x the size of the stored data in working memory, linearly. So, for example, if I put in 1 GB of data, SeaweedFS would immediately and constantly consume 2 GB of memory!

Edit: memory = RAM


That is odd. It likely has something to do with index caching and how many replica volumes you configured. By default it indexes all file metadata in RAM (I think), but that wouldn't justify that kind of memory usage. I've always used mostly default configurations in Docker Swarm, similar to this:

https://github.com/cycneuramus/seaweedfs-docker-swarm/blob/m...
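For comparison, a mostly-default single-node setup can be sketched roughly like this. This is a hedged sketch, not the contents of the linked stack; the `chrislusf/seaweedfs` image and the `weed server` flags and ports are taken from the SeaweedFS docs, and the paths are placeholders:

```yaml
# Hypothetical minimal docker-compose sketch (not the linked Swarm stack).
# "weed server" runs master + volume + filer in one process.
services:
  seaweedfs:
    image: chrislusf/seaweedfs
    command: "server -dir=/data"
    ports:
      - "9333:9333"   # master
      - "8080:8080"   # volume server
      - "8888:8888"   # filer
    volumes:
      - ./data:/data  # placeholder host path
```

With defaults like these, the only obvious RAM knobs are the in-memory needle indexes per volume, which shouldn't scale at 2x the stored data size.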


Are you claiming that SeaweedFS requires twice as much RAM as the total size of the stored objects?


Correct. I experimented by varying the data volume; memory usage was linearly correlated at 2x the data volume.


File a reproducible issue on their GitHub. The developer is very responsive.


Looks awesome. I've been looking for a flexible self-hosted WebDAV solution, and SeaweedFS would be an interesting choice.


Depending on what you need it for, Nextcloud has WebDAV (clients can interact with it, and Windows can mount your home folder directly; I just tried it out a couple of days ago). I've never used WebDAV before, so I'm unsure what other use cases there are, but the Nextcloud implementation (whatever it may be) was friction-free: everything just worked.
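For anyone wanting to try the Windows mount mentioned above, it is roughly this. A sketch only: `cloud.example.com` and `USERNAME` are placeholders, and `remote.php/dav/files/...` is the standard Nextcloud WebDAV endpoint per their docs:

```shell
# Windows: map the Nextcloud WebDAV endpoint to a drive letter
net use Z: \\cloud.example.com@SSL\remote.php\dav\files\USERNAME /user:USERNAME

# Linux equivalent, using davfs2
sudo mount -t davfs https://cloud.example.com/remote.php/dav/files/USERNAME/ /mnt/nextcloud
```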



