Is this with newer Apple Silicon Macs? My 2020 M1 Mac Mini looks unremarkably normal on my 1440p display. I'm also going between that and my 14" M1 Pro MacBook Pro, which of course looks beautiful, but that doesn't really make the 1440p on the Mini 'bad'.
Edit: Adding that both of these machines are running macOS 15.1 at the time of writing.
In my experience, you can't do any sort of HiDPI scaling with sub-4K displays. This has been the case since the M1. Intel Macs, even on the latest macOS, can still do fractional scaling (e.g. 1.5x on a 1440p display), although the last time I bothered with an Intel Mac it required a Terminal workaround to re-enable.
But that workaround is “patched” on Apple Silicon and won’t work.
So yes, if you have an Apple Silicon Mac plugged into a 1440p display, it will look bad with any sort of “scaling”, because HiDPI scaling is disabled in macOS for sub-4K displays. What you’re actually doing when you “scale” a 1440p display is running it at 1920x1080 resolution, hence it looks like ass. Back before Apple Silicon, running that 1440p display at “1920x1080” just scaled the UI elements up so they appeared the size they would on a 1920x1080 display; since it was still using the full …x1440 pixels of the panel, “1920x1080” looked nicer than it does now.
So, brass tacks: it comes down to how macOS/OS X obfuscates the true display resolution in the System Preferences -> Displays menu. On Apple Silicon Macs, “1920x1080” means “2x scaling” for 4K monitors and literally “we’ll run this higher-res monitor at 1920x1080” for any display under 4K.
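If you want to sanity-check what your Mac is actually doing rather than trusting the Displays pane, here's a rough Swift sketch of my own (not anything official, and it only looks at the main display) that compares the "looks like" point size of the current mode with its backing pixel size via the public CoreGraphics calls. In a HiDPI/scaled mode the pixel size comes back larger than the point size; a sub-4K panel being driven at a literal lower resolution reports the two as equal.

    import CoreGraphics

    // Compare the "looks like" (point) size of the current display mode
    // with its backing (pixel) size, for the main display only.
    let displayID = CGMainDisplayID()
    if let mode = CGDisplayCopyDisplayMode(displayID) {
        let points = (CGDisplayModeGetWidth(mode), CGDisplayModeGetHeight(mode))
        let pixels = (CGDisplayModeGetPixelWidth(mode), CGDisplayModeGetPixelHeight(mode))
        print("UI looks like \(points.0)x\(points.1)")
        print("Rendered at   \(pixels.0)x\(pixels.1)")
        // Equal sizes: the panel is literally running at that resolution.
        // Larger pixel size: a HiDPI (scaled) mode is in effect.
        print(points == pixels ? "no HiDPI scaling" : "HiDPI scaling in effect")
    }

Save it as, say, check_display.swift (the name is just an example) and run it with `swift check_display.swift`.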
> Back before Apple Silicon, running that 1440p display at “1920x1080” was actually just scaling the UI elements up
I’m almost sure that macOS can’t do that. It’s always done scaling by rendering the whole image at 2x the virtual resolution and then scaling that down to whatever screen you had. For example, for “looks like 1080p” on a 1440p screen it would draw onto a 2160p canvas (and take 2160p screenshots).
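To put rough numbers on that (the 2x canvas and 2160p figures come from the comment above; the rest is just ordinary resolution arithmetic, not anything macOS-specific), here's a tiny sketch of why that downscale looks softer than a native or exact-2x mode:

    // Intel-era behaviour as described above: "looks like 1920x1080" is drawn
    // on a 2x canvas (3840x2160), then scaled down to a 2560x1440 panel.
    let canvasHeight = 2160.0   // 2x the 1080p "looks like" height
    let panelHeight  = 1440.0   // native height of a typical 1440p panel
    let downscale = canvasHeight / panelHeight
    print("Downscale factor: \(downscale)")   // 1.5
    // A non-integer ratio like 1.5 means canvas pixels can't map 1:1 onto
    // panel pixels, which is why scaled modes look softer than native or
    // exact-2x modes -- while still looking better than driving the panel
    // at a literal 1920x1080.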
Yeah I was never able to get that to work on M1/M2 Macs. Intel, sure, but none of the workarounds (including BetterDisplay) worked on the ARM Macs. Do they now? I last tried in 2022.
If your 1440p monitor looks “fine” or “good”, it’s because the scale is 1x. For many people, myself included, UI elements are too small at 1x on 1440p. I had to buy a 4K monitor so I could have UI elements that are both larger AND crisp.
You may just not be seeing the visual artifacts on your screen because you don't know what they look like, or mentally adjust to what that screen looks like.
The same way someone might not notice motion smoothing on a TV, how bad scaling and text rendering look on a 1366x768 panel, or the different colour casts of different display technologies. All three took me a while to learn to spot without seeing them side by side.
> You may just not be seeing the visual artifacts on your screen because you don't know what they look like, or mentally adjust to what that screen looks like.
Does any of that matter, though? Who cares about hypothetical artifacts on a display that they can't even see?
It matters once you get used to something better. Our brains are really good at tuning out constant noise, but once you start consciously recognizing something, it stays noticeable. If your vision slips, you won't constantly be walking around saying everything is fuzzy, but after using glasses you'll notice it every time you take them off. Low-res displays work like that for many people: back in the 90s, people were content with 800x600 27” monitors, but now that would feel cramped and blocky because we've become accustomed to better quality.
Their official network drivers are called Killer and are very malware-looking ("unknown developer", blocked by Windows). It's unbelievable. It looks like some weird spyware junk too.
No idea then. Maybe it's something to do with Windows 10, but it definitely was showing me tons of "software blocked" and "unknown developer" warnings.
Further tangent, but I remember seeing Killer NIC's booth at ComicCon in 2007 with that very cool, very pointy heatsink (I assume that is what it was, anyway...?).
Yeah, the wired stuff back in the day was a little wild in that their idea was basically to put a mini PC on a card to offload work. It never really turned out to be that beneficial for gaming, but it was at least unique. Later in their life they just started OEMing Intel and Qualcomm NICs and sticking an overcomplicated QoS driver layer on top, which wasn't even offloaded anymore. The good news is that if you just use the base driver, you more or less end up with a standard (e.g. Intel) NIC.
Loads better than it being "Acer ZXGT423LV", which is the trend with smartphones and many other devices today. I'll happily take an "embarrassing" name over that mess.
Eh, knowing tech companies, we will end up with the worst of both:
Acer Spectral MeatGrinder RPG Bazooka 16 ZXGT423LV-2024
Which should be unique enough, but then they just reuse the name and only change the code part.
I write this as I look at an ROG Zephyrus G16 (2023) GU630HE (the last part is important; I think I had to check whether it was the model with one of the RAM slots soldered).
The S1 and S2 split was primarily about not bricking older devices with limited resources (32/64 MB of storage, and memory that can only hold so much code). So S1 is effectively a deprecated system that lets you keep using those older players, but they no longer receive system updates (and music services, from what I recall, are in "YMMV" status).
what? I mean I understand what you're saying, but literally what?
the whole point of technology is to surprise and delight customers and abstract these details away! I don't care about megablaaadywhowhos of storage or gigapoops of transfer. why are you threatening to brick my expensive hardware, and why then are you bifurcating the apps?
it's laziness at best
abstract the details away and give me the same delightful experience regardless of my hardware
I don't have a reddit account, and I was able to read it after agreeing that I was over 18. After that, it displayed the diatribe just fine for me. <shrugs>