Small/medium SaaS. Had ~8 hours of 100k reqs/sec last year when we usually see 100-150 reqs/sec. Moved everything behind a Cloudflare Enterprise setup and ditched AWS Client VPN (OpenVPN) for Cloudflare WARP.
I've only been here 1.5 years, but it sounds like we usually see one decent-sized DDoS a year, plus a handful of other "DoS" events, usually AI crawler extensions or 3rd parties calling too aggressively.
There are some extensions/products that create a "personal AI knowledge base"; they'll use the customer's login credentials and scrape every link once an hour. Some links are very resource-intensive data or report requests that are rare in real usage.
Not the same poster, but the first "D" in "DDoS" is why rate-limiting doesn't work: attackers these days usually have a _huge_ pool (tens of thousands) of residential IPv4 addresses to work with.
I work on a "pretty large" site (it was in the Alexa top 10k, back when that was a thing), and we see about 1,500 requests per second. That's well over 10k concurrent users.
Adding 10k requests per second would almost certainly require a human to respond in some fashion.
Each IP making one request per second is low enough that if we banned IPs exceeding it, we'd be blocking home users who opened a couple of tabs at once. And since universities, hospitals, and big corporations typically use a single egress IP for an entire facility, the threshold actually needs to be more like 100 requests per second to avoid blocking real users.
10k IP addresses making 100 requests per second (1 million req/s) would overwhelm all but the highest-scale systems.
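To make the mechanics concrete, here's a minimal per-IP token bucket in Python; the 100 req/s threshold is the one from the comment above, and everything else (names, burst size) is made up for the sketch:

    import time
    from collections import defaultdict

    RATE = 100    # tokens refilled per second, per IP (the threshold above)
    BURST = 200   # bucket capacity, so a burst of browser tabs isn't blocked

    buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

    def allow(ip):
        b = buckets[ip]
        now = time.monotonic()
        # refill proportionally to elapsed time, capped at the burst size
        b["tokens"] = min(BURST, b["tokens"] + (now - b["last"]) * RATE)
        b["last"] = now
        if b["tokens"] >= 1:
            b["tokens"] -= 1
            return True
        return False

The problem isn't the bucket, it's the threshold: 10k attacking IPs each staying just under 100 req/s all get True back from allow(), and that's 1M req/s of "allowed" traffic.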
We had rate limiting with Istio/Envoy, but Envoy was using 4-8x its normal memory to process that much traffic and kept crashing.
The attacker was using residential proxies and making about 8 requests before cycling to a new IP.
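Back-of-envelope for why that cycling defeats per-IP counters (the 8-requests figure is from above; the rest is assumed):

    per_ip_requests = 8      # requests before the attacker rotates IPs
    per_ip_threshold = 100   # req/s limit tuned for shared egress IPs

    # Even if all 8 requests land in the same second, each IP uses
    # only 8% of its budget before it's retired, so the limiter
    # never fires once.
    print(per_ip_requests / per_ip_threshold)   # 0.08

    # Sustaining 100k req/s this way takes 100_000 / 8 = 12_500 fresh
    # IPs per second, well within what the larger residential proxy
    # services advertise.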
Challenges work much better: they use cookies or other metadata to establish that a client is trusted, then let its requests pass. This stops bad clients at the first request, but it needs something more sophisticated than a webserver with basic rate limiting.
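A minimal sketch of the cookie half of that, assuming an HMAC-signed token with an expiry; the actual challenge (JS proof, CAPTCHA, etc.) happens before issue_token and isn't shown, and the names and the 10-minute TTL are made up:

    import hmac, hashlib, time

    SECRET = b"rotate-me-in-production"   # assumed shared secret
    TTL = 600                             # seconds a cleared client stays trusted

    def issue_token(now=None):
        # Called once the client has passed the challenge; the token
        # goes into a cookie.
        exp = str(int((now or time.time()) + TTL))
        sig = hmac.new(SECRET, exp.encode(), hashlib.sha256).hexdigest()
        return exp + "." + sig

    def verify_token(token, now=None):
        # Cheap enough to run on every request, so untrusted clients
        # are rejected before they reach anything expensive.
        try:
            exp, sig = token.split(".")
        except ValueError:
            return False
        expected = hmac.new(SECRET, exp.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(sig, expected) and int(exp) > (now or time.time())

Since the token is what's trusted rather than the IP, it doesn't matter how many residential addresses the attacker rotates through: a client with no valid cookie gets the challenge, not the expensive endpoint.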