
> I have a hard time believing this thing is going to choke on most datasets.

It won't, but 512GB is nowhere near "big data". In fact, most datasets we see shouldn't be called that. My personal rule of thumb is that nothing that fits on fewer than a dozen machines can be called "big data". The whole idea began when we started manipulating datasets so huge that moving the data from where it was stored to where it would be processed was no longer practical.
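
To put a rough number on "no longer practical": as a back-of-envelope sketch (the 1 PB size and the 10 Gbps link speed are illustrative assumptions, not figures from this thread), just shipping the bytes takes on the order of days, which is why you move the compute to the data instead:

    # Back-of-envelope: time to move 1 PB over a 10 Gbps link at full line rate.
    # Both numbers are illustrative assumptions, not figures from the thread.
    DATA_BYTES = 1e15            # 1 PB
    LINK_BITS_PER_SEC = 10e9     # 10 Gbps

    seconds = DATA_BYTES * 8 / LINK_BITS_PER_SEC
    print(f"{seconds / 86400:.1f} days")  # ~9.3 days

And that's a single uncontended link with no protocol overhead; real transfers are slower.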



My comment said "petabytes" of data across many machines. The 512GB of RAM was one node they tested it on. That is, 512GB is roughly how much can sit in a single node at once, not how much the database can process. If petabytes of OLAP data don't count, I'll concede that this database isn't for "big data."



