People are responding to this comment by saying "Screenshots work," and yes, they do, somewhat. However, you're limited to using the screenshot tool built into your Wayland compositor, rather than any third-party tool. For example, my favorite screenshot tool supports automatic timed screenshots, of particular windows or the whole screen or of a specified region, of one desktop or several, and various customizations. Those features aren't supported by my compositor, and I don't have a choice. So Wayland limits my options.
Another area with missing functionality is remote desktop. Yes, most Wayland compositors support VNC or similar remote desktop, but without the functionality of previous solutions. For example, I used x11vnc extensively: with it, I can share just one window or a whole screen, with various security options, and I can do so programmatically from the command line. Because VNC is now a feature built into the compositor, I no longer have a choice of which remote desktop tool to use, and adding features is not a priority for the compositor maintainers.
So while, yes, technically you're right that some analogues of screenshots and remote desktop exist on Wayland, they exist without any of the modularity and flexibility that Linux is (allegedly) proud of.
> ...without any of the modularity and flexibility that Linux is (allegedly) proud of.
Linux seems to have been infiltrated by people who simply don't believe in this anymore, and so we keep getting "modern" "batteries included" integrated monoliths that suck at everything but supposedly make things easier because this new crop of people don't want to have to know about or compare multiple options :(. And like, because for specific use cases the monoliths can have better performance, that gets used as a way to undermine what used to be the unique selling proposition of Linux :(.
The good news is that this is not unfixable; there is a venue and a governance organization to standardize the protocols and compositor interoperability that is needed. And there has been notable progress.
Specifically when it comes to screen share / remote desktop, standardization of the API to invoke sharing is happening by way of the XDG portal API efforts. There is active collaboration there e.g. between the browser vendors and the compositor vendors to agree on an interface.
> there is a venue and a governance organization to standardize the protocols and compositor interoperability that is needed
I'm not sure if that applies specifically to this case, but I'm not a fan of some standardization of protocols that is going on in Linux desktops, because:
- desktop managers can't properly keep state of your windows anymore on shutting off/on monitors. Claimed reason why this is not fixable: "the standard requires it"
- I dislike MM/DD/YYYY date formats. File managers used to have their own setting to choose your date format. Now they use some common standard so you have to set your Linux locale to something non-US. This then breaks other things. It's much nicer to be able to configure things per-application.
- Kolourpaint used to have a good color picker that showed you the numerical and hex values of colors. Now it has a horrible one that is only some palette where you can edit individual entries. You can't easily see the RGB values of a pixel in an image you open in Kolourpaint anymore. All due to "standardization", using a shared color picker component
- Focus stealing prevention no longer properly works, all due to adhering to some desktop standards
- Desktops like KDE and Cinnamon using horrible non-human editable json/xml/base64-encoded UTF-16 formats for their config
- Some Qt or GDK applications not having icons sometimes, due to them expecting some common icons available somewhere. This issue used to not exist, applications came with their own icons that just worked
I much preferred how unix and the desktop used to be, where you could manually edit textual config files to do whatever you want, and had the power and flexibility to change almost anything in intuitive ways
>I'm not a fan of some standardization of protocols that is going on in Linux desktops
Standardization and Linux desktops are sometimes mutually conflicting terms.
There doesn't seem to be much standardization on this front, just tribes of people saying "new things should be this way" and other tribes saying "n'ah mate, that sucks, we'll keep using our own better way".
Linux desktops don't have the Apple/Microsoft dictatorship powers to actually enforce any kind of GUI standards so we get this constant hassle and even more fragmentation.
> Linux desktops don't have the Apple/Microsoft dictatorship powers to actually enforce any kind of GUI standards so we get this constant hassle and even more fragmentation.
Do Apple and Microsoft actually have them? The GUI experience on Windows is relatively disjointed, and Apple famously did it to itself with e.g. randomly chosen brushed metal windows and other app-specific background textures. It appears far more common now for popular applications to ship their own look & feel and even theming. Let's say Discord, or even first-party, VS Code.
I'd say the accomplishments of standardization and interoperability on the Linux desktop are actually more substantial than on either Windows or MacOS. Consider that e.g. KDE and Gnome applications share a lot of important standards and are generally able to run in either vendor's shells. Windows and MacOS each effectively only have a single GUI frontend, so this kind of collaboration between vendors doesn't need to be accomplished, and it's not happening between the app makers, either.
>brushed metal windows and other app-specific background textures.
You're about a decade late with this complaint... and wrong. The brushed metal look was for application windows that resembled "appliances", which would remain open as you loaded/saved different projects. This was to distinguish them from the normal white document windows, which are tied 1-to-1 to a file.
I'm not saying they always followed this rule exactly... but there definitely was a rule. And the Apple of the era would publish detailed Human Interface Guidelines detailing all the thinking and intended practices. Third party mac software of that era was famously extremely consistent too, because of this shared base.
I only have one request from Linux desktops: mac style shortcuts with meta instead of ctrl. I also know it will never happen because everyone took their cues from messy Microsoft instead of the superior Apple practices.
You can usually set a custom date format without changing the locale, and you can usually find a tool to edit the nasty config file it's buried in. It is a pain though.
> For example, my favorite screenshot tool supports automatic timed screenshots, of particular windows or the whole screen or of a specified region, of one desktop or several, and various customizations. Those features aren't supported by my compositor, and I don't have a choice. So Wayland limits my options.
Xorg doesn't support this either. Your screenshotting tool runs the timer and takes the screenshot when the timer expires. Your complaint is like saying "the Linux kernel won't render markdown for me". No, it won't; you're looking at the wrong part of the stack.
For example, you can simply run `sleep 3 && shotman --output` to take a screenshot of an output in three seconds. There might be other GUI tools with a pretty timer setting. I've no idea if such a thing exists or not, but it's perfectly possible to write without any changes to the compositor (e.g.: the compositor provides the necessary APIs for this).
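For illustration, the timer-in-the-client idea as a tiny sketch. The screenshot command is whatever tool your compositor supports; `shotman --output` is just the example from above, and this wrapper is a hypothetical helper, not any real tool's API.

```python
import subprocess
import time

def timed_capture(cmd, delay_s):
    """Sleep in the client, then run any screenshot command.

    The compositor only needs to expose a capture API; the delay is plain
    client-side logic, exactly like `sleep 3 && shotman --output`.
    """
    time.sleep(delay_s)
    return subprocess.run(cmd).returncode

# e.g. timed_capture(["shotman", "--output"], 3)
```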
You're missing my point. The Wayland architecture does not give me a choice in my screenshot, video recording, and remote desktop tool: you have to use the functionality built in to the compositor. Many compositors (notably GNOME and KDE) don't support the Wayland semi-standard interface for these things.
You mentioned shotman, but that isn't even a real screenshotting tool: it's just a screenshot GUI that depends on existing functionality that may or may not be present in your compositor.
If all the compositors actually supported a standard for providing this functionality; and if that standard provided enough flexibility to do what I want, then I wouldn't complain. That's the essence of modularity. But the standards are poor and poorly-supported, and I literally can't do what I want. Your trivial example of using the sleep command to simulate delayed capture doesn't address the actual issue, and fails to recognize that no amount of hacky solutions will allow me to use a VNC screen sharing tool that shares only a specific window, because the Wayland compositors have chosen not to support that feature and failed to provide a standard way for external tools to do so.
>The Wayland architecture does not give me a choice in my screenshot, video recording, and remote desktop tool
Yes it does. Use the screenshot, screencast and remote desktop XDG portals for that. It's understandable you're confused because the X11 way was to try to jam everything into X extensions whether it made sense or not. The overall trend lately is to move APIs into other components (XDG portals, pipewire, DBus, systemd, etc.)
Portals are a really weird thing. They're mostly pushed by the Flatpak/GNOME crowd, and are intended to be cross-platform by using an external message bus, D-Bus (i.e., they don't use the X protocol or Wayland). The same interface can be used on both Xorg and Wayland, and that's the only upside of portals.
However, their implementation is not meant for native applications and are pretty inefficient. For example, the screenshot portal will save the image into disk and return its path -- so if you wanted to render the image on screen, you now need to read and decode a file. The screenshot is encoded, serialised to disk, serialised FROM disk and then decoded again just to get pixels from the compositor back into the compositor.
Portals really don't make sense if you're writing native Wayland clients, since they just end up using the Wayland protocol under the hood, with two or three services in the middle handing messages over just to expose a "standard" API.
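A toy illustration of the round-trip described above, with plain file I/O standing in for the PNG encode/decode a real portal implementation performs. Function names are mine, not any portal API:

```python
import os
import tempfile

def portal_style(pixels: bytes) -> bytes:
    """Simplified sketch of the portal screenshot flow: the capture is
    written to disk and the client gets back a path, so the client must
    read (and, in reality, also PNG-decode) the file again."""
    fd, path = tempfile.mkstemp(suffix=".png")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(pixels)      # compositor side: encode + serialise to disk
        with open(path, "rb") as f:
            return f.read()      # client side: read + decode from disk
    finally:
        os.remove(path)

def native_style(pixels: bytes) -> bytes:
    """A native protocol could hand the client the buffer directly."""
    return pixels
```

Both paths produce the same pixels; the portal path just does it with two extra disk trips and two extra copies.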
The portal only allows an external application to request the service from the compositor. The portal does not allow an external application to provide additional functionality beyond what the compositor already provides.
In my use case, I want to screen share a specific window, not the whole screen; this is functionality previously provided by a third-party X11-based tool, which cannot be provided from an XDG portal (even on those compositors that support portal at all).
There are discussions in progress about adding the ability to capture a single toplevel window to the screencopy protocol. Nobody's decided that it's a bad idea; it merely hasn't been implemented yet.
Regarding the fact that GNOME doesn't implement a lot of functionality: I don't think it's fair to criticise the protocol and architecture because one implementation decides not to implement a given feature. This is like saying "C sucks because some specific compiler doesn't support the feature I need". It's the compiler you've chosen that sucks in that case, not the language.
I'm sure there's some X server implementation out there that's also lacking features you want -- but you wouldn't blame the protocol itself for that implementation's shortcomings.
> Xorg doesn't support this either. Your screenshotting tool runs the timer and takes the screenshot when the timer expires. Your complaint is like saying "the Linux kernel won't render markdown for me". No, it won't; you're looking at the wrong part of the stack.
Xorg also doesn't prevent you from using an application that does have these functions. Wayland does prevent you.
> but it's perfectly possible to write without any changes to the compositor (e.g.: the compositor provides the necessary APIs for this).
The necessary APIs don't exist, so it isn't possible to write the extra functionality without changing the compositor.
Honestly, I don't understand why anybody defends Wayland at this point. It's been, what, ~~six years? Six years~~ FOURTEEN FUCKING YEARS we've been telling you that we want to be able to use our screenshot and recording applications and that removing functionality that we use day-to-day wasn't acceptable. Are the Firefox devs the same people making Wayland? Is that why they're so unresponsive to user needs?
> Is that why they're so unresponsive to user needs?
It's because Wayland is a set of protocols, rather than a single dominant implementation.
In order to do it right, cross-compositor and flexible, you have to get different actors with different interests and bandwidth agree on a standard. This takes time.
In order to get it working fast, developers need to make a compositor-specific implementation first and then put in the extra work of getting it through the standards discussion as well as switching their compositor ecosystem to the new standard.
Until this happens with all parts that you care about, you're going to be annoyed either about interoperability or functionality. Pick your poison, and cue Moxie's "The Ecosystem is Moving" blog post. Also, keep using X11 until Wayland has the features you want. Most DEs/WMs haven't ripped out X11 support yet, and hopefully won't until their support of Wayland protocols is solid enough.
I acknowledge that the Wayland team is not responsible for the implementation decisions of the GNOME team.
However, the Wayland team is responsible for the set of protocols that they've developed and that they've asked others to implement. The Wayland team has failed to define protocols that are flexible enough to provide basic functionality that users expect, even if they were implemented perfectly.
This should not be a process that depends on individual developers building compositor-specific remote-desktop tools first, then praying that someone likes them enough to put it in the standard. Wayland built the standard, they just built an insufficient one.
>If the Linux kernel had worked the same way, we'd never have gotten Linux.
Yes we would. In fact it's exactly how Linux works and has worked for decades. New features get added in drivers and then get rolled up into driver subsystems when enough hardware supports them.
If you feel so strongly about missing features, I suggest you report the issue and propose approaches to fixing it. Complaining on HN that "there's functionality missing" won't change anything.
I believe that the only thing missing is capturing individual windows with screencopy. There's actually an issue for that and ongoing discussion on how to implement it. Nobody's decided it's a bad idea, it's simply a matter of nobody having done it so far.
Xorg never "implemented" this feature. It's just that any client can read what other clients are doing. So Skype can screen-scrape your password manager by default. Obviously this was not a good choice, so Wayland has dedicated protocols for this functionality, with the intent of restricting them only to privileged clients.
Finally, while no _standard_ protocol exists to address this, there are compositor specific ways of capturing a single window, so most users have their needs met.
> Xorg also doesn't prevent you from using an application that does have these functions. Wayland does prevent you.
Surely, as much as seatbelts prevent my movement. And literally every wayland compositor has this functionality, that there is not a universally supported standard here sucks, but that’s just the way of life of the bazaar. There is no one company exerting a cathedral approach over the whole thing, and I believe that is what most of the linux users prefer. It just so happens to have some tradeoffs as well.
> I still use X11, and rely on the following features
To which I would add:
(1) taking screenshots from the command line (xwd)
(2) moving windows around from the command line (xwit)
(3) keyboard-free copy-paste between different programs: select to copy, middle-click to paste
(4) global indicator of where my cursor is (xeyes)
(5) define global key combinations that launch programs (xbindkeys)
(6) globally configure the keyboard in a particular way, key by key, not according to some pre-configured "language" (setxkbmap, xmodmap, xkbset)
(7) set keyboard repeat/delay to absurdly unusable values (xset)
(8) run graphical programs from a headless virtual machine (ssh -X)
I do not mind whether I'm running x11 or wayland, but I care a lot for all these features that are crucial to my workflow. How likely is it that items (1)--(8) will be supported somehow under wayland?
I'm a bit concerned that wayland gives a lot of attention to things that are meaningless to me (tearing, scaling, etc), and does not care at all about these basic usability features.
None of these discussions ever go anywhere. They will tell you that wlroots is standardizing all these things etc etc but as it stands today every compositor is at best reimplementing a subset of the features you describe. If I had enough money I would pay the Wayland people to work on something else and let the project collapse because they are constantly undermining the reasons I use Linux on the desktop.
There is a way to protect against LD_PRELOAD: use sandboxing. X11 is bad for security because it's an intentional escape around the sandbox.
The "tear it down and build something else argument" is also going nowhere, anything else you build is going to run into the same questions of how to go about implementing these features.
Most of these things work fine (though with wayland-specific tools, rather than X-specific tools), except:
> (4) global indicator of where my cursor is (xeyes)
AFAIK, no tool allows you to do this. The issue is: clients generally can't snoop on input when they're not in foreground (e.g.: to make key-loggers less trivial).
A privileged client could do this using something like layer-shell. It's a hack, but it should work. Somebody needs to write it tho.
> (5) define global key combinations that launch programs (xbindkeys)
Compositors implement this themselves for the above reason.
> (8) run graphical programs from a headless virtual machine (ssh -X)
I guess this should work with XWayland? Keep in mind, most X applications work via XWayland. This means normal applications; xbindkeys or xeyes won't work since they rely on reading input when other applications are in foreground.
Pretty much any accessibility or automation software suffers here unless it runs as root. Wayland makes accessibility development nearly impossible with its fragmentation. Want to make tools that work across the whole Wayland ecosystem? Good luck with that.
> Pretty much any accessibility or automation software suffers here without having to be a root
Even with root they'll have a harder time; there's a difference between "I can trivially tell the X server to give me details about events and then interact with arbitrary windows in arbitrary ways" vs "by running as root, I can watch raw input events come in from the mouse/keyboard and then try to figure out what they mean".
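A sketch of what that second option looks like in practice. The struct layout is the Linux evdev `input_event` as defined in `<linux/input.h>` on 64-bit Linux; actually opening `/dev/input/event*` requires privileges, which is the point:

```python
import struct

# struct input_event (64-bit Linux):
#   struct timeval time; __u16 type; __u16 code; __s32 value;
EVENT_FMT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FMT)

EV_KEY = 0x01  # key press / release events

def parse_event(raw: bytes):
    sec, usec, etype, code, value = struct.unpack(EVENT_FMT, raw)
    # Note what's *missing* compared to X: no window, no focus,
    # no text -- just a raw keycode you must interpret yourself.
    return {"sec": sec, "type": etype, "code": code, "value": value}

# A root-privileged reader would loop over EVENT_SIZE-byte chunks from
# /dev/input/event* and try to reconstruct meaning from codes alone.
```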
It’s almost like there are layers to security. You don’t have to be root, but nor should any random npm install script be able to steal keystrokes and send them away. While we’re at it, Linux desktop security just sucks all around.
Thanks for your answers! It's reassuring that most things will still be possible.
Still, xeyes is the favorite program of my kid who is starting to learn linux... I'd have a hard time selling wayland to him.
> to make key-loggers less trivial
Really? Is that a serious reason? Key-loggers are still very much possible because any program can read the memory of all other running programs of the same user. This seems to me like a pointless limitation that creates a lot of unnecessary complexity. But what do I know.
> Really? Is that a serious reason? Key-loggers are still very much possible because any program can read the memory of all other running programs of the same user.
That's not true. Reading other processes memory requires root even for processes of the same user.
I was going to reply in an impulsive manner (saying, derisively "so how does strace work, then? and why can you attach gdb to already running processes?").
But it turns out that you are right! Well, at least on modern Ubuntu/Fedora desktop installs: a few years ago they restricted the ptrace(2) system call. ("The maniacs! They did it!", I'm tempted to cry in anger.) You can still read child processes, but unless you change the Yama defaults in the kernel, ptrace(2) no longer works for processes that are not your children.
I didn't notice because I surely have changed this default in my old laptop because I use ptrace daily. But you are right that modern linuxes are likely thus crippled.
Still, it seems that an attack model based on defending against a rogue user process is quite doomed. Can't this process change the init files of other programs that will run, even if they aren't its children?
> Still, it seems that an attack model based on defending against a rogue user process is quite doomed. Can't this process change the init files of other programs that will run, even if they aren't its children?
No, they often can't. Things like Flatpak make it so apps have no unnecessary access to the filesystem.
Fortunately, things like snap and flatpak are still fringe projects with no realistic perspectives for a widespread adoption. Which would be ridiculous, anyway: why would you want all the programs in /usr/bin to be isolated "apps"?
I don't care about tearing. I'm sure my X11 tears all the time when watching videos but if I notice it I shrug it off. The point is that one of the main drivers of Wayland was to be frame perfect, have no tearing. Commendable goal, why not? but it's turning out that they had to write code to get no tearing and more code to allow it.
Wayland is frame perfect, and it's great.
What's the issue with allowing users or the compositor to disable that in really specific case without sacrificing the frame perfect principle for all other uses?
It's the best of both worlds
With X11, you defaulted to having tearing but could run a compositor if you wanted in order to not have it. Then Wayland showed up and loudly proclaimed that that was unacceptable and we had to be pixel-perfect, every frame, every time. ...Then a few years later, Wayland added the ability to tear, because it turned out to be useful, bringing it back to parity with X. This isn't... necessarily... technically invalid, but the optics aren't great.
You don't understand how this works, I think. The idea of this protocol is for the compositor to change how it does page flips under specific conditions: allowing a game, for example, to prefer visual artifacts over latency, while everything else stays tear-free. I'm not sure you know how building software works, but it's called iterating. The main need was tear-free rendering, which is achieved; the niche need of preferring lower latency over a bug-free image is a later iteration, and this protocol allows it.
There's plenty of tools for this. I wrote a light and simple one that shows a tiny thumbnail, [`shotman`](https://git.sr.ht/~whynothugo/shotman). But there's plenty of others of all sorts.
> not have unnecessary mouse/keyboard to monitor delay
There's none that I can notice. Mouse cursors are rendered with hardware support when available. Not sure what you're referring to here.
> 4K fullscreen, windowed fullscreen and windowed gaming with nvidia drivers, for native and steam proton games
All this works on non-nvidia. I can't speak for nvidia; I'd suggest avoiding nvidia if you want to use Linux (or any other FLOSS OS). Generally, nvidia on Linux isn't very pleasant.
> choosing to have vsync enabled or disabled in games
> There's none that I can notice. Mouse cursors are rendered with hardware support when available. Not sure what you're referring to here.
Some compositors don't use the hw cursor plane. All others that do don't update the cursor position asynchronously on mouse events, just once on the next swap. Those cases add an extra half-frame latency on average. It's worse at lower refresh rates.
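That half-frame average works out as follows. A back-of-the-envelope sketch, assuming input events arrive uniformly distributed within a frame interval:

```python
def avg_cursor_latency_ms(refresh_hz: float) -> float:
    """If the cursor position is only applied at the next buffer swap,
    events land uniformly within a frame, so the mean added delay is
    half a frame interval."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms / 2

# 60 Hz  -> ~8.3 ms of extra latency on average
# 144 Hz -> ~3.5 ms
```

Hence "it's worse at lower refresh rates": halving the refresh rate doubles the average added delay.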
> There's plenty of tools for this. I wrote a light and simple one that shows a tiny thumbnail, [`shotman`](https://git.sr.ht/~whynothugo/shotman). But there's plenty others of all sorts.
Your tool works on wlroots compositors, which is to say it doesn't work on GNOME or KDE.
He is referring to input lag: the delay from input to visible change on the monitor. The double buffering and compositor step added by Wayland adds some delay.
You may not personally notice it, or be used to it, but it's always there. Some people are more bothered by input lag than others, and there are so many compounding factors today that make it worse than ever. It's not only the double buffering in graphics subsystems like Wayland. Monitors are also often adding a frame or two delay to bump up their pixel "response times" and reduce ghosting by looking at future frames.
I swear, using Wayland or MacOS or Windows 10 on the fastest hardware today still feels slower and laggier than 10 year old hardware running plain X11 or Windows 7 (with the compositor turned off).
Given the choice between tearing and input lag, I'll take tearing any day.
Don't these results show it's about 20% slower, based on the median?
Plus, the test uses a display at 120Hz, which is a substantial head start -- by making each frame of delay half as long compared to us mere mortals running at 60Hz.
There are a couple of tools that can programmatically perform input events. They have some caveats because they work at a lower level so you might need to give them root permissions when that wouldn't have been necessary before, but they do work. The question is if that's all that you were using xdotool for; it's not limited to faking input events. For instance, xdotool can monitor events and trigger actions when you change focus on windows, or specific windows, and it can search across all desktops to find a specific window, bring it to the foreground, resize it, and then feed it input events. If all you need to do is simulate keyboard and mouse events, there are alternatives, but that's not all the tool could do.
This, 1,000%, for accessibility software and automation frameworks. We shouldn't need root privileges to emulate mouse/key presses in an application. As a side note, monitoring events might reasonably be privileged, but I wish a service could register to monitor without root privileges, leaving it to user discretion.
Screenshots work. No idea what xdotool is. Did not feel any delay. Not sure about games, Factorio works fine with Intel GPU. Not sure about vsync either.
I also noticed this delay in mouse movement but I couldn't find anything online or even people noticing the same problem.
For me it happens in the second monitor (120Hz with HDMI) there's a lag in the cursor movement. When I switch to X11 that lag is gone and the movement feels more fluid.
The strange thing is that I don't see that effect on the laptop screen (144Hz). I'm using Plasma btw
No it doesn't do anything like that. I am not sure where this idea came from but it's totally incorrect.
I have also been using the wl-clipboard program with no issues like the parent commenter. I use it for my password store, which decrypts a file containing the password with GPG and pipes it into the clipboard!
Just do yourself a favor and stick with using what already works. Don't upgrade to the new shiny thing unless you already know the reasons why you want to.
Among all the other technical issues, Wayland devs have an attitude issue. Users asked for fractional scaling for years and were faced with flat refusals and insulting quips from the wayland developers. The only reason they've added it now is because Valve asked them for it. Wayland devs think users are worms and corporations are gods.
The thing is, the old way is also clearly broken. For me, X11 has always been a slow, stuttery, screen-tearing mess across a range of hardware, while Wayland is much better at shipping complete frames on time during normal desktop use. So it's a choice between the broken old thing and the broken new thing.
I once tried bringing this up in the discussion, but I believe this protocol still works with some sort of "logical pixels" abstraction. And there is concern about rounding.
If rounding / scaling artifacts are avoidable, this is fine. On the other hand if this stuff occurs in practice like with macOS (at least at one point, with the 2x render and downscale), this is totally fucked.
Wayland is much more low level than, say, CSS. So I see zero problem if the scaling factor can instead be a black box which applications are free to treat however they like. This is how "*.dpi: 120" in a .Xresources file works, for example. Not all applications might be able to support fractional scaling, but those that do can do it perfectly, without any risk of scaling artifacts.
This was repeatedly rejected for reasons that I don't understand.
There has been "fractional scaling" support in the way macOS does it forever: if you wanna scale at 1.8x, let the application draw at 2x, then make the compositor downscale everything. So it's not like it hasn't been possible before, it's just more efficient now I suppose?
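The arithmetic of that approach, sketched below. This is my reading of the general scheme (draw at the next integer scale, then downscale), not Apple's exact pipeline:

```python
import math

def macos_style_buffers(logical_px: int, scale: float):
    """Sketch of draw-at-integer-scale-then-downscale.

    The app renders at the next integer scale; the compositor then
    downscales that buffer to the fractional target size.
    """
    render_scale = math.ceil(scale)          # e.g. 1.8x -> draw at 2x
    render_px = logical_px * render_scale    # e.g. 1280 * 2   = 2560
    output_px = round(logical_px * scale)    # e.g. 1280 * 1.8 = 2304
    return render_px, output_px
```

The downscale from 2560 to 2304 is what costs the pixel-perfection (and some GPU time), which is the trade-off being discussed in this subthread.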
As I've pointed out in the MR, this only allows compositors to communicate to windows a scale factor (how many framebuffer pixels to send for each logical pixel of window size), but does not switch window sizes and subsurface positions/sizes from integer logical pixels to fixed-point logical pixels (rounded to integer physical pixels) or integer physical pixels. So Wayland is absolutely not ready for pixel-precise window resizing and positioning in the presence of fractional scaling.
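For concreteness, a small sketch of the rounding issue. The 1/120 unit is how the wp-fractional-scale-v1 protocol expresses scale factors; the specific coordinate in the example is hypothetical:

```python
def to_physical(logical: int, scale_120: int):
    """wp-fractional-scale-v1 communicates scale as an integer multiple
    of 1/120 (e.g. 150 -> 1.25x). A size or position given in integer
    logical pixels does not generally land on a physical pixel boundary."""
    exact = logical * scale_120 / 120
    return exact, round(exact)

# A subsurface at logical x=101 under a 1.25x scale:
#   exact position 126.25 physical px, rounded to 126
# i.e. a quarter-pixel of drift the protocol can't currently express.
```

This is why fixed-point (or physical-pixel) coordinates would be needed for pixel-precise sizing and positioning under fractional scales.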
Is this something which applies at a constant scale factor across the entire session, or can this theoretically be applied on a per-application basis? Is it flexible enough to allow "zooming in" on a running application?
I wonder if this merge will improve the odd fractional scaling issues I've been having with my framework laptop. Blurry text and windows for electron apps really sours the experience of using the wayland with the laptops resolution.
This merge is not going to improve anything by itself, in the same vein that an update to the HTTP protocol is not going to improve any existing HTTP servers and clients.
On the specific topic here, fractional scaling, macOS gets it disastrously wrong, only supporting downscaling, which means that it’s impossible to get pixel-perfect output on fractional scales, which are the norm on most of their devices. I seriously can’t comprehend how they ended up going that way, given that they were the ones in a position to force developers to get things right, far more than Windows, but no, they just didn’t even try. On the contrary, Windows has worked with fractional scaling from the start of its high-DPI story, and although it did take a while to get it right in all circumstances (I think the last pieces came in 2017, with DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2 and also incidentally GDI scaling to automatically improve most older, DPI-unaware apps), it’s been robust since on all but a few badly-made apps that claim DPI-awareness but actually lack it (and you can forcibly change how they behave if necessary).
They used to be, but not anymore. They switched because, even if it's not pixel perfect, it's good enough: at the sizes used by their laptops, most users are not going to notice.
The system they use gets them something: 1) apps do not have to bother with fractional scaling, so they don't have to worry about rounding errors, and 2) they get to use the output encoder for scaling, which is essentially free, instead of the GPU, which would take its toll on power consumption and GPU load.
I don’t have any Apple hardware, but my recollection from seeing those who do a few years back (including when they first introduced the Retina displays) is that all of those laptops defaulted to non-integer scales, and felt at least a little uncomfortable at 2×.
I don't really know how HiDPI is implemented on Windows, but I do know some very popular VST plugins (music production software, which usually does its own GUI and scaling integration) that still don't scale properly in popular plugin hosts, so you can't really claim it "gets it right in all circumstances".
This problem doesn't exist on Mac, where the OS is essentially responsible for all the scaling. So, between macOS's approach, which may not be pixel-perfect but "just works", and Windows's, which usually works perfectly but still has problems in certain categories of software, I'll probably choose the former.
Firstly, in your VST situation either the plugin, the host, or both are not implementing things correctly: one or the other has claimed DPI-awareness it doesn’t actually have. That’s on them.
Secondly, you misunderstand the relative situations of macOS and Windows. The difference is not to do with their approach to scaling, but rather their approach to supporting old software, and the diversity of hardware. Social factors, not technical. Apple could absolutely have done fractional scaling, and it’d have worked just fine for them; the present situation in how well high-DPI stuff works across platforms would be indistinguishable: macOS would still have basically nothing broken, and Windows would still have quite a few widely-used pieces of software getting it wrong. Here’s why:
Apple developers are very probably using a high-DPI screen, so they’ll immediately notice if anything is off. Apple is controlling, and anything published through their store will have any obviously-bad scaling noticed and rejected. Apple doesn’t care about long-term compatibility, so earlier software that didn’t implement scaling properly largely just doesn’t run at all any more.
Windows developers are probably using a low-DPI screen, and are publishing to a largely unmoderated platform where software from decades ago still runs just fine.
Functionally there’s no difference between fractional and integral scaling in how well software will support them. If you try to implement things completely from scratch, fractional will be a little harder because you have more rounding decisions, but in practice you always use a library that takes care of that for you (normally a full widget toolkit, but VST plugins will often use something somewhat lower-level but still above the physical-pixels layer except when you deliberately opt for fine control), so there is literally no difference at all. The difference in how well things work on these different platforms is purely other factors.
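As a toy illustration of those extra rounding decisions (my own example, not any particular toolkit's rule): at 1.5×, rounding each edge coordinate to the pixel grid independently can disagree with rounding the width as a whole, which is exactly the class of off-by-one seam a toolkit resolves for you by picking one rule and applying it everywhere.

```python
# Toy example of the rounding pitfall in fractional scaling: scaling a
# widget's edges independently vs. scaling its width directly can
# disagree by a pixel, producing seams or overlaps between adjacent
# widgets. (Note: Python's round() uses round-half-to-even.)

def scale_edges(left, right, scale):
    # Round each edge to the pixel grid, then take the difference.
    return round(right * scale) - round(left * scale)

def scale_width(left, right, scale):
    # Round the width as a whole, ignoring position.
    return round((right - left) * scale)

# The same 10-unit-wide widget at 1.5x, at two different offsets:
print(scale_edges(0, 10, 1.5), scale_width(0, 10, 1.5))  # 15 15 -- agree
print(scale_edges(1, 11, 1.5), scale_width(1, 11, 1.5))  # 14 15 -- off by one
```

Neither rule is wrong in isolation; the bug appears only when two components apply different rules to the same geometry, which is why leaving it all to one library makes the fractional case no harder than the integer one.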
They did¹, but it turns out the simpler solution just works, and that pixel-perfect purity doesn't matter because your eyes can't resolve HiDPI pixels any more than they can resolve the grid of subpixels that they're made of.
Thank you for saying this. With Apple being the "UX" and "media users" company, this level of negligence and apathy shocks me so much I keep on self-doubting macOS actually works the way I remember. It's crazy, just crazy.
Hmm, maybe I'm missing something, but I have not had any scaling problems in Windows. Maybe an argument could be made that adjusting DPI only became as easy as on macOS as of Win 11 (maybe Win 10 too, but I didn't use Windows much then), but I've never had a problem increasing scaling on any of my 4K screens.
It is easy to find a Windows app that is either blurry (basically almost all installers) or confused by HiDPI (take any Qt app and you have a 1:1 chance it will get it wrong, like this: https://imgur.com/a/Ad0ZkmX).
How on Earth did nobody think about making Wayland pluggable, so that somebody could perfect the art of screenshotting, others could perfect drawing windows or capturing video, and those plugins could then be re-used by everybody out there?
It was a good idea, but the implementation clearly isn't working.
My bet: Wayland will be the Windows Vista or the OS/2 of Linux.
-taking screenshots
-auto keyboard/mouse scripting with xdotool
-not having unnecessary mouse/keyboard-to-monitor delay
-4K fullscreen, windowed fullscreen and windowed gaming with nvidia drivers, for native and steam proton games
-choosing to have vsync enabled or disabled in games
-EDIT: I should add another important one for me: xclip, to read from and write to the clipboard from the command line
How's the current situation of Wayland for these? Thanks!