
Per client.



They don't have to allow multiple clients on the same license to all have fast upload speeds at the same time.

If you mean per license, that scales with the amount of money paid, and a reasonable limit on 3 licenses doesn't have to allow all that many terabytes all that fast.
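
A minimal sketch of what per-license (rather than per-client) throttling could look like - a single token bucket shared by every client on the license. The class name and rate numbers here are made up for illustration:

    import time

    class LicenseBucket:
        """One shared upload budget per license, no matter how many clients."""
        def __init__(self, bytes_per_sec, burst_bytes):
            self.rate = bytes_per_sec
            self.capacity = burst_bytes
            self.tokens = burst_bytes
            self.last = time.monotonic()

        def try_consume(self, nbytes):
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if nbytes <= self.tokens:
                self.tokens -= nbytes
                return True   # accept this upload chunk
            return False      # back off; sibling clients share the same budget

Since all clients on a license draw from the same bucket, adding a second or tenth client buys no extra aggregate upload speed.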


Yes, but then you are implementing somewhat arbitrary restrictions on everybody in order to work around an abusive subset of users. All else equal, you want to be able to advertise uploads that are as fast as possible (e.g. to entice valuable customers to migrate from your competitor to you).

But I do think that targeted throttling is a good way to deal with this problem. As I said elsewhere, detecting abuse is heuristic and false positives are horrible when the enforcement is to shut down accounts, but a false positive resulting in throttling is not so bad.
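
As a toy illustration of why throttling tolerates false positives better than bans: scale bandwidth down with some heuristic abuse score instead of flipping a kill switch. The score, thresholds, and multipliers here are all invented for the example:

    def allowed_rate_mbps(base_rate_mbps, abuse_score):
        """Graduated response: a wrong guess costs a user some speed,
        not their account. abuse_score in [0, 1] comes from whatever
        heuristic detector you run (not shown)."""
        if abuse_score < 0.3:
            return base_rate_mbps          # looks normal: full speed
        if abuse_score < 0.8:
            return base_rate_mbps * 0.25   # suspicious: slow down, keep serving
        return base_rate_mbps * 0.05       # near-certain abuse: crawl, flag for review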


Google gives you 750GB of upload per day and I've seen very little fuss about that, none from normal customers.


I'm not up on the rules, but that's per user, right? Not for the whole account?


Yes, but does the difference matter? Either 750GB per day per user or 2.25TB per day for a 3-license account would provide a good amount of time to find rule breakers before the costs approach $72. And the limit wouldn't have to be exactly the same number anyway.
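
Rough arithmetic behind that, assuming the $72 is one month's revenue from 3 licenses at the $24/license figure implied later in this thread, and a raw storage cost of about $23/TB-month (both assumptions):

    licenses = 3
    revenue = licenses * 24.0           # $72/month, assumed $24 per license
    rate_tb_per_day = licenses * 0.75   # 750GB/day each = 2.25TB/day total
    cost_per_tb_month = 23.0            # assumed raw storage cost

    def storage_cost(days):
        # Storage grows linearly from zero, so cumulative TB-months
        # after d days is the area under the ramp: rate * d^2 / 2, in months.
        tb_months = rate_tb_per_day * days ** 2 / 2 / 30
        return tb_months * cost_per_tb_month

    for d in (3, 7, 10):
        print(d, round(storage_cost(d), 2))
    # -> about $8 after 3 days, $42 after 7, $86 after 10: even at the
    #    maximum rate it takes ~9 days to burn through one month's revenue.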


To repeat something I said to start this thread (edit: just realized I actually said this in a different thread nearby[0] - my responses here probably only make sense in the context of my responses in that thread, sorry about that), which I think seems to keep getting lost: It is not impossible to enforce this, it is (like any arms race) a costly ongoing burden that is much more difficult to implement than the "you can easily enforce this" responses in this thread suggest.

This is no different than any other "I could implement this in a weekend" thread that you see here. I'm not saying "Dropbox is incapable of implementing enforcement for this ToS violation", I'm saying that I'm confident they've already spent many millions of dollars on it, and have decided (wisely, in my view) that changing the product to more fundamentally preclude this kind of usage is the better trade-off to take.

So, having said that, to answer your question: at 750GB per day, uploading 1PB in a week only requires parallelism of 200. That is not many users for an "enterprise" account. (And I suspect this becomes costly well below 1PB per week.)
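
A quick check of that figure, using only numbers already in the thread:

    per_user_tb_week = 0.75 * 7      # 750GB/day for a week = 5.25TB
    print(1000 / per_user_tb_week)   # 1PB / 5.25TB ~= 190.5, call it 200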

You'll be able to think of "well you can just ..." for that as well, and I promise you that there are "the abusers can work around that by doing ..." for those things. Because, like I keep saying, it's just a normal arms race pattern. It's not that there is nothing you can do about any particular thing that people do, it's that you have to keep doing it ad nauseam.

By the way, this change to the product is also just one more parry in this arms race. It is unlikely to fully solve the problem (and I'm confident they know that), just another useful tool.

0: https://news.ycombinator.com/item?id=37265144


> which I think seems to keep getting lost: It is not impossible to enforce this, it is (like any arms race) a costly ongoing burden that is much more difficult to implement than the "you can easily enforce this" responses in this thread suggest.

Right, but my point is that the enforcement cost is less than the payment they receive.

> So, having said that, to answer your question: at 750GB per day, uploading 1PB in a week only requires parallelism of 200. That is not many users for an "enterprise" account. (And I suspect this becomes costly well below 1PB per week.)

I think you missed part of my argument, which is that if you want parallelism of 200 then you need 200 licenses, which means Dropbox gets $4800. That's much more than enough to pay for the 120 terabyte-months such a user would consume in that week.
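
The back-of-the-envelope version of that, with the $24/license implied by $4800 / 200 and the same assumed ~$23/TB-month raw storage cost:

    licenses = 200
    revenue = licenses * 24.0            # $4,800 at $24 per license
    # 1PB uploaded linearly over a week: ~500TB average, held ~7/30 of a month
    tb_months = (1000 / 2) * (7 / 30)    # ~117, the "120 terabyte-months" above
    cost = tb_months * 23.0              # assumed raw storage cost
    print(round(tb_months), round(cost)) # 117 TB-months, ~$2,683 < $4,800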

There is the worry about how large a spike that is versus their buffer of free space, but someone signing up for 200 licenses at once, uploading at the maximum rate, and thinking they'll avoid scrutiny is... pretty unlikely. Also, if we assume Dropbox would run this similarly to how they used to, they'd have to be manually approving storage increases on that giant pile of data, which brings even more scrutiny.

Also I think their limit for quite a while was 100TB per week for the entire organization. No need to worry about petabyte spikes then.

> Because, like I keep saying, it's just a normal arms race pattern. It's not that there is nothing you can do about any particular thing that people do, it's that you have to keep doing it ad nauseam.

Which is not a problem if you're getting enough money for the trouble.



