This doesn't make sense to me. Surely scaling the window according to the settings that apply to the monitor it's on is a job for the window manager, and not a job for the software running inside the window?
If the window manager were to upscale a window, you would get the typical upscaling artifacts, such as blurriness. The application toolkit is the best place to do scaling, as it knows how to render text and other elements crisply at the target scale.
This is how it works on Windows as well; if the application reports it correctly supports scaling, Windows will let the application handle scaling (otherwise Windows will do it for the application, with the usual caveats).
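For anyone curious what "reports it correctly supports scaling" amounts to, here is a minimal Win32 sketch (assuming a Windows 10+ SDK; class registration and the message loop are elided): the app opts in to per-monitor-v2 DPI awareness and re-lays itself out when WM_DPICHANGED arrives. Without the opt-in, Windows falls back to bitmap-stretching the whole window.

    #include <windows.h>

    // Sketch of a per-monitor-DPI-aware window procedure: the app re-lays
    // itself out when Windows reports a DPI change (e.g. the window was
    // dragged onto a monitor with a different scale factor).
    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp) {
        switch (msg) {
        case WM_DPICHANGED: {
            const UINT newDpi = HIWORD(wp);                 // e.g. 144 for 150%
            const RECT* suggested = reinterpret_cast<const RECT*>(lp);
            // Adopt the window rectangle Windows suggests for the new DPI,
            // then redo layout and text rendering at newDpi / 96.0 scale.
            SetWindowPos(hwnd, nullptr,
                         suggested->left, suggested->top,
                         suggested->right - suggested->left,
                         suggested->bottom - suggested->top,
                         SWP_NOZORDER | SWP_NOACTIVATE);
            return 0;
        }
        case WM_DESTROY:
            PostQuitMessage(0);
            return 0;
        }
        return DefWindowProc(hwnd, msg, wp, lp);
    }

    int WINAPI wWinMain(HINSTANCE, HINSTANCE, PWSTR, int) {
        // This call is the "I handle scaling myself" declaration; without it
        // (or the equivalent manifest entry) Windows bitmap-stretches the window.
        SetProcessDpiAwarenessContext(DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2);
        // ... RegisterClassEx, CreateWindowEx, message loop elided ...
        return 0;
    }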
>if the application reports it correctly supports scaling, Windows will let the application handle scaling
In contrast, unless I am severely mistaken, macOS has the application render the window normally, then applies a scaling algorithm to that output whenever the user has set a non-integer scaling factor, i.e., fractional scaling (on a Mac this is usually set through System Preferences > Display).
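Concretely, my understanding of those "scaled" modes is that everything is drawn at an integer 2x scale into an oversized backing store, and the finished frame is then resampled down to the panel. A quick sketch of the arithmetic (the 13-inch Retina numbers are just an example, and the exact pipeline is my assumption):

    #include <cstdio>

    int main() {
        const int panel_w = 2560, panel_h = 1600;           // physical pixels
        const int looks_like_w = 1440, looks_like_h = 900;  // chosen "scaled" mode

        // Apps render at a clean integer 2x scale...
        const int backing_w = looks_like_w * 2;             // 2880
        const int backing_h = looks_like_h * 2;             // 1800

        // ...and the composited frame is then resampled to fit the panel.
        const double resample = static_cast<double>(panel_w) / backing_w;  // ~0.889

        std::printf("backing store %dx%d -> panel %dx%d (resampled by %.3f)\n",
                    backing_w, backing_h, panel_w, panel_h, resample);
        // That final non-integer resample of an already-rendered bitmap is
        // where the blurriness comes from, rather than apps drawing at ~1.78x.
        return 0;
    }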
I much prefer how Windows and Wayland do it. (When I use Windows, I have the luxury of free choice in the apps I use: I spend 99% of my time in VS Code, Google Chrome, and a few recently written Microsoft-provided apps like Settings. A Windows user who needs old apps, or apps written by less sophisticated developers, might have a much worse experience.)
In fact, I am leaving macOS after 11 years largely because of its relatively bad implementation of fractional scaling. I find it too blurry. (If my lifestyle required the use of a laptop, I might have stuck with Apple.) But I am unusual in ways that probably make a good implementation of fractional scaling much more valuable to me than it would be to the average user.