This is an intuition based on some talks I had with an Alexa engineer. When voice prompts became big, companies like Amazon and Google took a "money is no object" approach to adding more features. In practice this meant hiring lots of UX and engineering teams for each targeted feature. There was a jokes team, a recipes team, a geography team, a radio team, etc. And a lot of the answers were more hard-coded than the man behind the curtain would have liked you to think.
But now we know that voice prompts did not take over the world and that Alexa is about as useful as a toaster. So fire the teams, cut features people didn't spend money on, and replace the giant, hand-rolled, QA-approved NLP processing trees with all the automated tech that makes the front page of HN.
There's been a nice stream of improvements to futex2 since its initial merge.
NUMA support (finally landing!), https://www.phoronix.com/news/FUTEX2-NUMA-Small-Futex https://www.phoronix.com/news/FUTEX2-Improvements-Linux-6.16 (see also this fantastic recent submission on NUMA in general, absolutely critical performance stuff, https://news.ycombinator.com/item?id=44936575)
io_uring support in 6.7 (2024), with a nice write-up on it speeding up PostgreSQL AIO, https://www.phoronix.com/news/IO_uring-FUTEX-Linux-6.7
Small requeue and single-wait additions in 6.7, https://www.phoronix.com/news/Linux-6.7-Locking-FUTEX2