Hacker News

Wait, how would you split 6 TiB across 6 phones, and how would you handle the queries? How long will the data live, do you need to handle schema changes, and how? And what is the cost of a machine with 15 or 20 TiB of RAM (you said it fits in memory multiple times, right?) - isn’t the drive cost irrelevant here? How many requests per second did you specify? Isn’t that possibly way more important than data size?

Awk on 6 TiB, even in memory, isn’t very fast. You might need some indexing, which suddenly pushes your memory requirement above 6 TiB, no? Do you need migrations or backups or redundancy? Those could increase your data size by multiples.

I’d expect a question that specified a small data size to be asking me to estimate the real data size, which could easily be 100 TiB or more.
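The "data size by multiples" point is just multiplication, but it's easy to undersell in an interview. A quick back-of-envelope sketch, with the index, replication, and backup factors all assumed purely for illustration:

```python
# Back-of-envelope capacity estimate. All overhead factors below are
# assumptions for illustration, not numbers from any real system.

base_data_tib = 6.0      # the working set stated in the question
index_overhead = 0.5     # assumed: indexes add ~50% on top of raw data
replication = 3          # assumed: 3x copies for redundancy/durability
backup_generations = 2   # assumed: 2 retained full-backup generations

# RAM needed to hold the data plus its indexes in memory.
ram_tib = base_data_tib * (1 + index_overhead)

# Disk footprint: replicated indexed copies, plus full backups.
disk_tib = (base_data_tib * replication * (1 + index_overhead)
            + base_data_tib * backup_generations)

print(f"RAM needed:  {ram_tib:.0f} TiB")   # 9 TiB, already above 6
print(f"Disk needed: {disk_tib:.0f} TiB")  # 39 TiB
```

Even with these modest assumed factors, the "6 TiB" problem needs 9 TiB of RAM and roughly 39 TiB of disk, which is the kind of multiple the comment is pointing at.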




