That's primarily because many SPAs aggressively prevent caching of all XHR responses using headers (which are, of course, set on the backend).
And the reason for that is mainly to avoid stale-data edge cases and make sure users of CRUD apps always see up-to-date content.
The experience on a server-rendered site shipped with Cache-Control: no-cache, max-age=0 would be very similar.
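To make that concrete, here's a minimal Express sketch (Express itself and the route names are my assumption, just for illustration) where an API endpoint and a server-rendered page ship the exact same cache-busting header, so the browser treats both as equally uncacheable:

```typescript
import express from "express";

const app = express();

// SPA-style API endpoint: the backend opts every XHR/fetch response
// out of caching, so the browser re-requests it on every navigation.
app.get("/api/todos", (_req, res) => {
  res.set("Cache-Control", "no-cache, max-age=0");
  res.json([{ id: 1, title: "Buy milk", done: false }]);
});

// Server-rendered page with the same header: the browsing experience
// is just as "uncached" as the SPA's API calls.
app.get("/todos", (_req, res) => {
  res.set("Cache-Control", "no-cache, max-age=0");
  res.send("<html><body><h1>Todos</h1><ul><li>Buy milk</li></ul></body></html>");
});

app.listen(3000);
```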
All of the headaches around custom routing code and state restoration are pretty much obsolete with bfcache, introduced more than 5 years ago.
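For anyone who hasn't poked at it: bfcache restoration is observable from the page itself via the standard pageshow event, nothing framework-specific needed. A quick sketch:

```typescript
// Fires on every navigation to the page; event.persisted is true when
// the page was restored from the back/forward cache with its full JS
// state intact, rather than re-fetched and re-initialized.
window.addEventListener("pageshow", (event: PageTransitionEvent) => {
  if (event.persisted) {
    console.log("Restored from bfcache: scroll position and state kept for free");
  } else {
    console.log("Fresh load: state must be rebuilt");
  }
});
```

Note that Cache-Control: no-store can make a page ineligible for bfcache in some browsers, which loops back to the header discussion above.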
If you build a dynamic page and use HTTP headers to prevent caching, don't complain when nothing gets cached. Most people making this argument want to offload dealing with external APIs to the frontend, and then complain about the result when no requirements around API response caching were ever defined, let alone time given to define them.
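For what it's worth, such a requirement is often a one-line decision per endpoint. A hedged sketch (Express again; the endpoint and the numbers are invented) of what a deliberate API caching policy could look like instead of a blanket "no caching":

```typescript
import express from "express";

const app = express();

// An explicit caching requirement: responses may be reused for 60s,
// and served stale for up to 5 more minutes while a fresh copy is
// fetched in the background.
app.get("/api/products", (_req, res) => {
  res.set("Cache-Control", "public, max-age=60, stale-while-revalidate=300");
  res.json([{ id: 42, name: "Widget", price: 9.99 }]);
});

app.listen(3000);
```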
No, this proves my point. Back in my days of backend programming, Varnish was the "magic" that made servers (especially CMSs) feel fast. Memcached is another common go-to cache.
Browser caching isn't a replacement, as the first request to every endpoint is still going to be slow.
Yet somehow, APIs aren't held to the same performance requirements as web pages.
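That's the difference a shared server-side cache makes: the first visitor warms it for everyone, while a browser cache only helps that one browser. A toy read-through cache, with an in-memory Map plus TTL standing in for Memcached/Varnish and fetchFromOrigin as a placeholder for the slow work:

```typescript
type Entry = { value: string; expiresAt: number };

const cache = new Map<string, Entry>();
const TTL_MS = 60_000; // cache entries live for one minute

// Placeholder for the expensive part: DB query, CMS render, upstream API call.
async function fetchFromOrigin(key: string): Promise<string> {
  return `rendered page for ${key}`;
}

// Read-through: only the first request per key (or the first after
// expiry) pays the origin cost; every other client gets the cached copy.
async function get(key: string): Promise<string> {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) return hit.value;

  const value = await fetchFromOrigin(key);
  cache.set(key, { value, expiresAt: Date.now() + TTL_MS });
  return value;
}
```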