Hacker News | rob74's comments

When O'Leary accuses others of "scamming and ripping off unsuspecting consumers", what he really means is that only Ryanair should have the right to scam and rip off Ryanair passengers...

...or getting run over by a tank-like SUV with limited visibility around the car (it's tank-like, after all) while cycling.

Keep in mind that AAPL came pretty close to becoming absolutely worthless in the mid-1990s, before Steve Jobs rode to the rescue. Which is to say, you would really need a crystal ball to make such predictions. I could definitely see an "alternate universe" where Apple fared a bit worse and Commodore didn't mismanage the Amiga as much; then Commodore could be in the place where Apple is now...

Yeah, most people would have dumped the stock back then and not HODLed it through the 2000s until now...

Some people found the faith and got on board in 1998, though, when the iMac took the world by storm. If only I had held for a bit longer… :D (I never had $40k to invest, though, so the returns would have been smaller anyway.)

It's fascinating that the biggest CRT ever made had a 43" diagonal, which is at the low end for modern flatscreen TVs. But yeah, I can see why the market for this beast was pretty limited: even with deinterlacing, SD content would have looked pretty awful when viewed from up close, so the only application I can think of was using it for larger groups of people sitting further away from the screen. And even for that, a projector was (probably?) the cheaper alternative...

In the late aughts I worked a summer at a company that was designing an articulating (flat screen) TV mount. I went with the engineers to one of the Intertek testing sessions. We wanted it to be rated for a 60" TV, but I was given the impression that the weight formulas they used for testing were based on CRT screens. The salesperson who came with us was giddy seeing the thing loaded up with 1000 lb of steel plates and not giving way, but the actuators couldn't lift that much, and our advertised rating was no more than 200 lb.

Even at just 43" it still weighed 450 lbs. I bought a 27" CRT some years ago, and even that was a nightmare to transport.

I have one of those Sony WEGA CRT TVs, which were widescreen and even had HDMI.

https://www.mediacollege.com/equipment/sony/tv/kd/kd30xs955....

148 pounds! A total nightmare to get into our car and into our house.

WORTH IT.


I remember having the 36" version in ~1997. I wouldn't want to guess how much it weighed; it was insane. I remember how impressive it was watching The Fifth Element Laserdisc on it.

I had one of the first high-def Sonys on the US market. I worked at a high-end audio/video store in the mid-90s, and they gave it to me cheap as they couldn't get rid of it.

https://crtdatabase.com/crts/sony/sony-kw-34hd1

Even at 34", the thing weighed 200lbs (plus the stand it came with). I lived in a 3rd floor walk up. I found out who my true friends were the day we brought it back from the store. I left that thing in the apartment when I moved. I bet it is still there to this day.


I had the 40" version and I left it in the house when I got divorced. That thing was insane to move. Needed minimum three people to lift it.

By now it's most likely a central component of the building's structural calculations.

They put it on a floating surface, now it's the building's earthquake counterweight.

I'd forgotten how heavy CRTs are. A local surplus auction had a really tempting 30-something-inch Sony CRT for sale cheap, but when I saw it was over 300 lbs I had to pass on it.

I remember having a 27-inch CRT on my desk. The top of the desk bent after a humid rainy season, so I had to fix it by adding multiple metal supports.

A lot of those CRT screens had a pretty low refresh frequency; you were basically sitting in front of a giant stroboscope. That was particularly bad for computer screens, where you were sitting right in front of them. I think they pretty much all displayed at 30Hz. I can imagine how a gigantic screen could get pretty uncomfortable.

I recall a lot of people playing Counter-Strike at 640x480 to get 100+ Hz refresh rates. The lower the resolution, the faster you could refresh. I don't recall the absolute limit, but it would give the latest LCD gaming panels a serious run for their money.

Meanwhile, OLED monitors can go up to 480 Hz.

If you pay extra for that. Meanwhile _any_ CRT could trade off resolution for refresh rate across a fairly wide range. In fact the standard resolutions for monitors were all just individual points in a larger space of possibilities. They could change aspect ratio as well. This can be quite extreme. Consider the 8088 MPH demo from a few years back (<https://trixter.oldskool.org/2015/04/07/8088-mph-we-break-al...>). See the part near the end with the pictures of 6 of the authors? That video mode only had 100 lines, but scrunched up to make a higher resolution.
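A rough Python sketch of that trade-off; the 96 kHz horizontal scan rate is an assumption for a decent late-90s monitor, not a figure from this thread:

    # Sketch of the resolution-for-refresh trade-off: a CRT's real budget is
    # its horizontal scan rate, so fewer lines per frame means more frames
    # per second. The 96 kHz figure is assumed, the 5% blanking is a rough guess.
    h_scan_hz = 96_000
    for visible_lines in (1200, 1024, 768, 600, 480):
        total_lines = visible_lines * 1.05   # rough allowance for vertical blanking
        print(f"{visible_lines:>5} lines -> ~{h_scan_hz / total_lines:.0f} Hz")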

Well, we are discussing a CRT TV that was $40k new a lifetime ago, so perhaps the fact that it costs $599 to get a 480Hz OLED today is not a consideration. To the point though: it is a fallacy to believe that CRTs could arbitrarily shape their resolution. While the input signal could cover a wide range of possible resolutions and refresh rates depending on the bandwidth supported, the existence of aperture grilles or shadow masks imposed a fixed digital reality that limited the maximum possible resolution to much lower values than the typical 4k panels that we have today. The "pixels" didn't become larger on lower resolutions: they just covered more dots on the mask. We can get much better results today with scaling than we ever could on CRTs, as awesome a technology as they were 40 years ago.

Sure, but 99% of that cost was paying for the absurd physical dimensions of that particular television.

> The "pixels" didn't become larger on lower resolutions…

Strictly speaking, the CRT only had discrete lines not pixels. Within a line the color and brightness could change as rapidly or slowly as the signal source desired. It was in fact an analog signal rather than a digital one. This is why pixels in many display modes used by CRTs were rectangular rather than square.

> We can get much better results today with scaling than we ever could on CRTs…

I say it’s the other way around! No ordinary flat–panel display can emulate the rectangular pixels of the most common video modes used on CRTs because they are built with square pixels. You would have to have a display built with just the right size and shape of pixel to do that, and then it wouldn’t be any good for displaying modern video formats.


Seems irrelevant to bring up cost for something that is mainstream-priced today, but sure, let's move on.

> Strictly speaking, the CRT only had discrete lines not pixels.

The electron beam sweeps in an analog fashion, but before it reaches the phosphor layer it can only pass through specific openings in the mask [1]. These openings are placed a fixed distance apart [2], and that distance sets the maximum effective horizontal resolution of the CRT (see the rough calculation after the references below).

> No ordinary flat–panel display can emulate the rectangular pixels of the most common video modes used on CRTs because they are built with square pixels.

Today's panels have achieved "retina" resolution, which means that the human eye cannot distinguish individual pixels anymore. The rest is just software [3].

[1] https://www.youtube.com/watch?v=13bpgc8ZxTo

[2] https://en.wikipedia.org/wiki/Dot_pitch#/media/File:CRT_mask...

[3] https://www.reddit.com/r/emulation/comments/dixnso/retroarch...
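As a back-of-the-envelope illustration of that ceiling — all numbers are assumptions for a typical 19" aperture-grille monitor, not measurements:

    # How the mask/grille pitch caps useful horizontal resolution.
    # Assumed: a 19" 4:3 tube (~366 mm visible width) with a 0.25 mm grille pitch.
    grille_pitch_mm = 0.25
    visible_width_mm = 366
    triads_across = visible_width_mm / grille_pitch_mm
    print(f"~{triads_across:.0f} phosphor triads across the screen")
    # -> roughly 1460, so anything much beyond ~1400-1500 horizontal pixels
    #    just blurs into the mask on a tube like this.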


Any CRT can take it, as far as its driving circuits and deflection mechanisms allow. But yeah, refresh rates of CRTs aren't really tied to price.

All CRT televisions were either 60Hz or 50Hz, depending on where you were in the world.

Yes and no. Half of the screen was refreshing at a time, so it was really flashing at 30Hz. You still had a visible stroboscopic effect. True 60Hz and 100Hz screens appeared in the late 90s and made a visible difference in terms of viewing comfort.

I think you're mixing monitors and TVs together.

CRT TVs only supported vertical refresh rates of 50Hz or 60Hz, which matched the regional mains frequency. They used interlacing and technically only showed half the frame at a time, but thanks to phosphor decay this added a feeling of fluidity to the image. If you were able to see it strobe, you must have had impressive eyesight. And even if they supported higher refresh rates, it wouldn't matter, as the source of the signal would only ever be 50/60Hz.

CRT monitors used in PCs, on the other hand, supported a variety of refresh rates. Only monitors for specific applications used interlacing; consumer-grade ones didn't, which means you could see a strobing effect here if you ran them at a low frequency. But even the most analog monitors from the 80s supported at least 640x480 at 60Hz, and some programs such as the original DOOM were even able to squeeze 70Hz out of them by running at a different resolution while matching the horizontal refresh rate.


For some reason I remember 83Hz being the highest refresh rate supported by my XGA CRT, but I think it was only running at SVGA (800x600) in order to pull that rate.

Some demos could throw pixels into VRAM that fast, and it was wild looking. Like the 60Hz soap-opera effect but even more so.

I still feel that way looking at >30fps content since I really don't consume much of it.


> some programs such as the original DOOM were even able to squeeze 70Hz out of them by running at a different resolution while matching the horizontal refresh rate.

400p at 70 Hz was the default resolution of the VGA, pretty much all the classic mode 13h games ran at 70 Hz.


Yeah I remember I could not use a CRT computer monitor at 60Hz or less for any length of time, as the strobing gave me a headache.

I'm guessing you're talking about interlacing?

I've never really experienced it because I've always watched PAL which doesn't have that.

But I would have thought it would be perceived as flashing at 60 Hz with a darker image?


PAL had interlacing

For anyone this deep on the thread, check out this video (great presenter!) explaining TV spectrum allocation, NTSC, PAL, and the origin of 29.97 fps.

https://youtu.be/3GJUM6pCpew


TIL NTSC: He explained that NTSC stands for Not The Smartest Choice, but I always assumed it meant Never The Same Color.

Memories shattered. Yeah, you're right and I would have watched interlaced broadcast content.

I saw interlaced NTSC video in the digital days where the combing was much more obvious and always assumed it was only an NTSC thing!


Except CRT televisions weren't like that at all.

The only time the electron gun was not involved in producing visible light was during overscan, horizontal retrace, and the vertical blanking interval. They spent the entire rest of their time (the very vast majority of their time) busily drawing rasterized images onto phosphors (with their own persistence!) for display.

This resulted in a behavior that was ridiculously dissimilar to a 30Hz strobe light.


Did they really do that, or did the tubes just run at 2x vertically stretched 640x240 with a vertical pixel shift? A lot of technical descriptions of CRTs seem to be adapted from pixel-addressed LCDs/OLEDs, and they don't always capture the design well.

They did exactly what you say. Split the image and pixel shift. It was not like 30Hz at all.

The limiting factor is the horizontal refresh frequency. TVs and older monitors were around 15.75kHz, so the maximum number of horizontal lines you could draw per second is around 15750. Divide that by 60 and you get 262.5, which is therefore the maximum vertical resolution (real world is lower for various reasons). CGA ran at 200 lines, so was safely possible with a 60Hz refresh rate.

If you wanted more vertical resolution then you needed either a monitor with a higher horizontal refresh rate or you needed to reduce the effective vertical refresh rate. The former involved more expensive monitors, the latter was typically implemented by still having the CRT refresh at 60Hz but drawing alternate lines each refresh. This meant that the effective refresh rate was 30Hz, which is what you're alluding to.
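The same arithmetic spelled out, reusing the numbers from the comment above:

    # 15.75 kHz horizontal scan rate, 60 Hz vertical refresh (NTSC-style).
    h_scan_hz = 15_750
    v_refresh_hz = 60
    lines_per_field = h_scan_hz / v_refresh_hz
    print(lines_per_field)        # 262.5 scanlines per 60 Hz field
    print(lines_per_field * 2)    # 525 lines per interlaced frame,
                                  # i.e. an effective 30 Hz full-frame rate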

But the reason you're being downvoted is that at no point was the CRT running with a low refresh rate, and best practice was to use a mode that your monitor could display without interlace anyway. Even in the 80s, using interlace was rare.


Interlace was common on platforms like the Amiga, whose video hardware was tied very closely to television refresh frequencies for a variety of technical reasons which also made the Amiga unbeatable as a video production platform. An Amiga could do 400 lines interlaced NTSC, slightly more for PAL Amigas—but any more vertical resolution and you needed later AmigaOS versions and retargetable graphics (RTG) with custom video hardware expansions that could output to higher-freq CRTs like the SVGA monitors that were becoming commonplace...

Amigas supported interlace, but I would strongly disagree that it was common to use it.

CGA ran pretty near 262 or 263 lines, as did many 8-bit computers. 200 addressable lines, yes, but the background color accounted for about another 40 or so lines, and blanking took up the rest.

Everything runs at 262.5 lines at 60Hz on a 15.75KHz display - that's how the numbers work out.

The irony is that most of those who downvoted didn't spend hours in front of those screens as I did. And I do remember these things were tiring, particularly in the dark. And the worst of all were computer CRT screens, which weren't interlaced (in the mid-90s, before higher refresh frequencies started showing up).

I spent literally thousands of hours staring at those screens. You have it backwards. Interlacing was worse in terms of refresh, not better.

Interlacing is a trick that lets you sacrifice refresh rates to gain greater vertical resolution. The electron beam scans across the screen the same number of times per second either way. With interlacing, it alternates between even and odd rows.

With NTSC, the beam scans across the screen 60 times per second. With NTSC non-interlaced, every pixel will be refreshed 60 times per second. With NTSC interlaced, every pixel will be refreshed 30 times per second since it only gets hit every other time.

And of course the phosphors on the screen glow for a while after the electron beam hits them. It's the same phosphor, so in interlaced mode, because it's getting hit half as often, it will have more time to fade before it's hit again.
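A minimal sketch of what that means per scanline (the 480 visible lines are assumed just for illustration):

    # The beam still sweeps 60 fields per second, but each scanline is only
    # revisited every other field, so each line's phosphor refreshes at 30 Hz.
    field_rate_hz = 60
    visible_lines = 480                              # assumed, for illustration
    even_field = list(range(0, visible_lines, 2))    # lines drawn in one field
    odd_field  = list(range(1, visible_lines, 2))    # lines drawn in the next
    print(field_rate_hz / 2)                         # 30.0 Hz per individual line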


Have you ever seen high speed footage of a CRT in operation? The phosphors on most late-80s/90s TVs and color graphic computer displays decayed instantaneously. A pixel illuminated at the beginning of a scanline would be gone well before the beam reached the end of the scanline. You see a rectangular image, rather than a scanning dot, entirely due to persistence of vision.

Slow-decay phosphors were much more common on old "green/amber screen" terminals and monochrome computer displays like those built into the Commodore PET and certain makes of TRS-80. In fact there's a demo/cyberpunk short story that uses the decay of the PET display's phosphor to display images with shading the PET was nominally not capable of (due to being 1-bit monochrome character-cell pseudographics): https://m.youtube.com/watch?v=n87d7j0hfOE


Interesting. It's basically a compromise between flicker and motion blur, so I assumed they'd pick the phosphor decay time based on the refresh rate to get the best balance. So for example, if your display is 60 Hz, you'd want phosphors to glow for about 16 ms.

But looking at a table of phosphors ( https://en.wikipedia.org/wiki/Phosphor ), it looks like decay time and color are properties of individual phosphorescent materials, so if you want to build an RGB color CRT screen, that limits your choices a lot.

Also, TIL that one of the barriers to creating color TV was finding a red phosphor.


There are no pixels in a CRT. The guns go left to right, \r\n, left to right, while True for line in range(line_number).

The RGB stripes or dots are just stripes or dots; they're not tied to pixels. There are three RGB guns, physically offset from each other, coupled with strategically designed mesh plates, in such a way that the electrons from each gun sort of moiré into only hitting the right stripes or dots. Apparently fractions of an inch of offset were all it took.

The three guns, really more like fast-acting lightbulbs, received brightness signals for their respective RGB channels. Incidentally, that means they could swing between zero brightness and max a couple of times over 60[Hz] * 640[px] * 480[px] or so per second.

Interlacing means the guns draw every other line, but not necessarily every other pixel, because CRTs have finite beam spot sizes, at the very least.
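Roughly what that rate works out to, ignoring blanking intervals (the real dot clock is somewhat higher — standard VGA uses ~25 MHz):

    # How fast each gun's brightness signal has to swing for ~640x480 at 60 Hz.
    refresh_hz, h_px, v_px = 60, 640, 480
    dot_rate_hz = refresh_hz * h_px * v_px
    print(f"~{dot_rate_hz / 1e6:.1f} MHz")   # ~18.4 MHz of brightness changes per gun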


No, you don't sacrifice refresh rate! The refresh rate is the same. 50 Hz interlaced and 50 Hz non-interlaced are both ~50 Hz, approx 270 visible scanlines, and the display is refreshed at ~50 Hz in both cases. The difference is that in the 50 Hz interlaced case, alternate frames are offset by 0.5 scanlines, the producing device arranging the timing to make this work on the basis that it's producing even rows on one frame and odd rows on the other. And the offset means the odd rows are displayed slightly lower than the even ones.

This is a valid assumption for 25 Hz double-height TV or film content. It's generally noisy and grainy, typically with no features that occupy less than 1/~270 of the picture vertically for long enough to be noticeable. Combined with persistence of vision, the whole thing just about hangs together.

This sucks for 50 Hz computer output. (For example, Acorn Electron or BBC Micro.) It's perfect every time, and largely the same every time, and so the interlace just introduces a repeated 25 Hz 0.5 scanline jitter. Best turned off, if the hardware can do that. (Even if it didn't annoy you, you'll not be more annoyed if it's eliminated.)

This also sucks for 25 Hz double-height computer output. (For example, Amiga 640x512 row mode.) It's perfect every time, and largely the same every time, and so if there are any features that occupy less than 1/~270 of the picture vertically, those fucking things will stick around repeatedly, and produce an annoying 25 Hz flicker, and it'll be extra annoying because the computer output is perfect and sharp. (And if there are no such features - then this is the 50 Hz case, and you're better off without the interlace.)

I decided to stick to the 50 Hz case, as I know the scanline counts - but my recollection is that going past 50 Hz still sucks. I had a PC years ago that would do 85 Hz interlaced. Still terrible.


You assume that non-interlaced computer screens in the mid-90s were 60Hz. I wish they were. I was using Apple displays and those were definitely 30Hz.

Which Apple displays were you using that ran at 30Hz? Apple I, II, III, Macintosh series, all ran at 60Hz standard.

Even interlaced displays were still running at 60Hz, just with a half-line offset to fill in the gaps with image.


I think you are right. I had the LC III and Performa 630 specifically in mind. For some reason I remember they were 30Hz, but everything I find googling it suggests they were 66Hz (both video card and screen refresh).

That being said they were horrible on the eyes, and I think I only got comfortable when 100Hz+ CRT screens started being common. It is just that the threshold for comfort is higher than I remember it, which explains why I didn't feel any better in front of a CRT TV.


Could it be that you were on 60Hz AC at the time? That is near enough to produce a "Schwebung" (a beat-frequency flicker) when artificial lighting is used, especially with fluorescent lamps like the ones that were common in offices. They need to be "phasenkompensiert" (phase-compensated/balanced), meaning they have to be on a different phase of the mains electricity than the computer screens are on. Otherwise even not-so-sensitive people notice it as interference/a sort of flickering. It happens less when you are on 50Hz AC and the screens run at 60Hz, but with fluorescents on the same phase it can still be noticeable.

I doubt that. Even in color.

In 1986 I got an Atari ST with a black-and-white screen. Glorious 640x400 pixels across 11 or 12 inches. At 72Hz. Crystal clear.

https://www.atari-wiki.com/index.php?title=SM124


If they weren't interlaced then they were updating at 60Hz, even in the 80s. You're being very confidently wrong here.

I did 1024x768@85 just fine.

If it supported it, 100 Hz paired with a mouse set for 200 Hz was nice and smooth.

Simplified as in easier to use, or simplified as in fewer language features? I'm all for the former, while the latter is also worth considering (but hard to get right, as all the people who consider Go a "primitive" language show)...

Yes. So Microsoft (which manufactures hardware itself and has close ties to other hardware manufacturers) needed to find... other ways to, er, motivate people to buy new hardware anyway. Which brings us back to the blog post we are commenting on.

Not sure Windows as a subscription service is the end goal though. But maybe we should all wish for M$ to do that, maybe that would be what's needed to finally bring about the Year of The Linux Desktop™.


I don't think selling more hardware is the primary motivation. The motivation is ensuring everyone has TPM 2.0 enabled on their device.

This allows Microsoft to protect parts of their software even from the user that owns the hardware it's running on. With TPM enabled you finally give up the last bit of control you had over the software running on your hardware.


Unbreakable DRM for software, such as for your $80 billion game business or your subscription office suite.

As a bonus, it prevents those pesky Windows API compatibility tools like Wine from working if the application is designed to expect signed and trusted Windows.


The mass exodus to Linux gaming is already causing a push back against kernel level anti-cheat.

People who 5 years ago didn't give a hoot about computing outside of running steam games are now actively discussing their favorite Linux distro and giving advice to friends and family about how to make the jump.


As much as I hope it to be mass exodus, and as someone who switched over to CachyOS as my main OS in Nov 2025, I'm not sure that 3% of the steam user base really qualifies as a 'mass' exodus.

https://www.notebookcheck.net/Linux-gaming-growth-SteamOS-sh...

Going back to my Windows install every now and then to do things feels uncomfortable. Almost like I'm sullying myself! The extent of Microsoft's intrusiveness kind of makes it feel like entering a poorly maintained public space...at least compared to my linux install.

I'm not sure that the majority of people feel this way about Windows 11. They just put up with it in the same way as they do YouTube ads, web browsing without uBlock Origin, social media dark patterns, etc. But certainly, I think there's never been a better time to move to Linux for my kind of user, i.e. the only mildly technologically adept.


Yeah but which 3%? It's important.

There are a lot of Steam gamers with 5 games in their library who log on once a month. There are a few Steam gamers with 5000 games in their library who are permanently logged in. There's folks who play one game obsessively, and folks who tinker around with many games.

I'm willing to bet that the 3% are the kind of people who buy a lot of games.

I'd love to see that "what percentage of games have been bought by people on which platform?" metric. I think it'd be a lot more than 3% on Linux, even if you count Steam Deck as a separate platform.


I agree. It would be fascinating to see how that 3% breaks down. Although, excluding the SteamOS/Steam Deck users, the desktop segment drops to about 2.25%, seeing how 25% of Linux installs are SteamOS.

I think SteamOS being available for PC and promoted by Valve could be a game changer. It provides a trusted and familiar pathway for a different way of doing things. But while it would perhaps reduce Windows installs, I can't see it helping grow a user base of DIY Linux tinkerers, if that is of any importance. I can kind of see it being a bit like how Android makes the majority of phone users Linux users, but I'm not entirely sure what that means for the Linux desktop.


Agree.

I think SteamOS's desktop mode will get used more as people discover it. I was kinda impressed that I could just switch out to a desktop on my Steam Deck, and then used it to play videos while travelling.

The whole "it's better than a console at being a PC, and better than a PC at being a console" thing. It'll be interesting to see if it takes off.


I think you'd lose that bet. The kind of people who buy a lot of games are also the people who are not going to be tolerant of game compatibility issues on Linux; they want to play the game, not futz with their OS.

2 years ago I would have agreed with you, but the game compatibility issues really aren't there any more. Proton has made huge strides, and the Steam Deck has forced a lot of game companies to make sure that there aren't any issues.

> I'm not sure that 3% of the steam user base really qualifies as a 'mass' exodus.

Major tech reviewers are talking about Bazzite. Reddit gaming forums are full of people talking about Win11 vs Linux.

Microsoft only has two strangleholds on PCs - gaming and office apps. For home users they literally have zero lock-in nowadays other than familiarity. No one is writing native Windows apps outside of legacy productivity apps and games. Even Microsoft is writing Windows components in React nowadays.

I moved to Linux earlier this year and literally none of my apps were unavailable. Everything is a browser window nowadays.

15 years ago that would've been crazy, I had tons of native windows apps I used every day.


I know Linux gaming is getting a buzz and I'm happy to see it. I'm honestly surprised it took so long for people like Gamers Nexus to review Linux, but I'm thankful that they did.

But by saying 'For home users they literally have 0 lock in now days other than familiarity.' I think you severely underestimate how powerful familiarity is in anchoring non-tech users to particular platforms. However dysfunctional they can be.

As I mentioned, I moved to linux myself earlier this year. But the first time I tried it was probably around 2004. And I've dipped in and out occasionally but not stuck with it until this year, when I've found it to be a significant improvement on the Windows alternative.

Microsoft's own creation presents a real opportunity for an uptick in Linux adoption. But I do think it still presents sufficient friction and unfamiliarity for average non-tech users to take on. The only significant issue I had with your initial comment was with your reference to a 'mass' exodus, even if it is confined to the gaming community.

Happy to be proven wrong of course. And perhaps to the annoyance of my friends, willing to help anyone I know interested with a linux install.

But I'm looking forward to the Dec 2025 Steam survey, and to the tiny contribution my little install will make to the Linux numbers!


Distros like Bazzite launch into Steam upon boot. Steam is the OS; everything happens through Steam.

Give people Chrome and most won't be able to tell the difference from Windows.

Windows 11 was a large change to the UI, arguably just as large a change as from Windows 10 to any of the contemporary Linux DEs.


I've been playing the most recent POE2 league on my Linux desktop for the past week while my friend on windows is having random crashes.

>I think you severely underestimate how powerful familiarity is in anchoring non-tech users to particular platforms.

What familiarity? Microsoft has changed the look and feel of the OS to the point that it no longer retains that familiarity from version to version.


Unfortunately, Linux requires zero effort to create cheats on; you might as well run no anti-cheat. And the root stuff is overblown, as user-space programs can already read all your files and the process memory of that user. How many people bother with multiple users?

Not all gamers are playing games where cheating is an issue. It's really only the MOBA Call of Battlefield AAA crowd who care about that. That's not the largest group of gamers, and certainly not the largest market for games.

Fortnite and Call of Duty are consistently the #1 and #2 games every year. The others like GTA, Battlefield, League of Legends and Valorant also have anti-cheat that blocks Linux. It's not a minor issue.

The top game tag by sales [0] is #singleplayer, which obviously doesn't care about anti-cheat.

There's a demographic of gamers who only play the one competitive multiplayer game (such as Fortnite or CoD). They don't buy many games, and they're not the most lucrative market for game publishers, but they do keep those titles in business. And yes, for them, anti-cheat is important and they're unlikely to move to Linux.

[0] https://games-stats.com/steam/tags/


The push back on kernel level anti-cheat on security grounds has always felt odd to me. If you don't trust them to run kernel level code why do you trust them to run usermode code as your user? A rogue anticheat software could still do enormous damage in usermode, running as your user, no kernel access required.

Being in kernel mode does give the rogue software more power, but the threat model is all wrong. If you're against kernel anti-cheat you should be against all anti-cheat. At the end of the day you have to choose to trust the software author no matter where the code runs.


It isn't about what I allow them to run on my computer, it's about what they don't allow me to run on my own goddamn computer. You can't run a modded BIOS, a self-compiled kernel, or unsigned drivers with Secure Boot enabled.

The concern isn't that the anti-cheat vendor would do something nefarious, the concern is that it opens up privilege escalation exploits.

If malware does get executed in user mode it could take advantage of the anti-cheat kernel module to make the attack even more damaging to the OS.


it will never be unbreakable, and only needs to be broken once

intel can't even get SGX to work


To the benefit of everyone backing up their media libraries.

Maybe instead Microsoft could allow Windows 11 to install and run on machines that are otherwise capable, and just flash red screens at you all the time (where otherwise ads would show up) that constantly nag that "THIS COMPUTER IS FUCKING INSECURE!" or something. It would be equally annoying, but I'm sure running the latest Windows 11 with TPM 1.2 instead of TPM 2.0 would be more secure than running Windows 10 without bug fixes and security patches.

(But my understanding is there were other things, like bumping the minimum supported instruction sets, which happened to exclude a few CPUs that support the newer instructions but were shipped with chipsets using the older TPM.)


We want to delete the fallback code paths... You'll just get failures from BitLocker instead of install failures, or Windows Hello failures, or ...


Registry keys and autounattend.xml config keys are not clever people finding a way; they're people using stuff Microsoft put there to allow just this, for now. I.e., Windows 11 has not been strictly enforcing these yet; they are just "official" requirements, so when they eventually decide to enforce them in a newer version (be it an 11 update or some other number), they'll be able to say "well, it's really been an official requirement for many years now, and over 99% of installs of Windows 11, which has been the only supported OS for a while now, are working that way". If they had gone straight from Windows 10 to strictly enforced Windows 11 requirements, it would've been harder to defend.

Windows 12 will close the loophole: your CPU will require a signed code path from boot down to application level code. No option to disable Secure Boot or install your own keys. But there needs to be an installed base of secure hardware for this to happen, hence the TPM 2.0 requirements for Windows 11.

Since Windows 12 hasn't even been mentioned yet, I wouldn't worry about what you're describing at all.

You're missing the point, the TPM 2.0 requirement is there to drive adoption, not to actually prevent you from installing Windows 11.

Hardware key storage is a low-level security primitive; both Android and iOS have mandated it for far longer. It enables a lot of scenarios, not just DRM.

For example: without hardware storage, it's not possible to protect SSH keys from malware that achieves root. Only hardware storage can offer the "Unplug It" guarantee - that unplugging a compromised machine ends the compromise.


9front with factotum tells a different story.

If you want to protect keys you get a yubikey or something like that.

And if you want to play sound, you buy a sound card. Computers integrate components that approximately everybody needs. Hardware storage for keys is just the latest example

The main feature of a YubiKey is that it requires human presence to hit the button and access the secret.

Do new computers have such a button? I've failed to locate it.


Touch is one way of demonstrating proof of presence. Biometrics is another. A PIN is a third. YubiKeys typically support touch or a PIN. Windows Hello (which is TPM-based) supports biometrics or a PIN.

Ah yes Android and iOS, they have truly become bastions of user freedom since mandating secure enclaves. That really puts my worries to rest. /s

User freedom is not the only axis by which we judge operating systems.

It is not, but to me personally it is a very important one and it is not one I will give up without a fight.

> With TPM enabled you finally give up the last bit of control you had over the software running on your hardware.

The overwhelming majority of users never had any kind of control over the software running on their hardware, because they don’t know (and don’t want to know) how the magical thinking machine works. These people will benefit from a secure subsystem that the OS can entrust with private key material. I absolutely see your point, but this will improve the overall security of most people.


> The overwhelming majority of users never had any kind of control

Uninterested is vastly different than unable, especially when that majority is still latently "able" to use some software that a knowledgeable-minority creates to Help Do The Thing.

The corporate goal is to block anyone else from providing users that control if/when the situation becomes intolerable enough for the majority to desire it.

Most people don't move away from their state of residence either, but we should be very concerned if someone floats a law stating that you are not permitted to leave without prior approval.


> motivate people to buy new hardware

Open source drivers, and a sense that Linux support will forever be top priority, would be a motivator for me. Most of my tech spend has been with Valve in the past few years. I'd love if there was another company I actually enjoy giving my money to.


May I suggest Framework (https://frame.work/linux).

> So Microsoft which manufactures hardware itself

The only computer lineup MS ever sold directly, to my knowledge, were the Surface things - an absolute niche market.


>finally bring about the Year of The Linux Desktop™.

Do we actually want that?

If Linux ever reached mass adoption, big tech companies would inevitably find a way to ruin it


Governments around the world are finding ways of doing that well before big tech will get to it.

This big push for Age/ID verification & "trusted" operating systems is going to ruin what's left of free (as in freedom), general purpose computing. Governments are getting frothy at the mouth for every device to have remote attestation like google play protect/whatever iOS does.


> So Microsoft (which manufactures hardware itself and has close ties to other hardware manufacturers)

You mean the Microsoft vacuum cleaner ? /s


The mouse is actually a good piece of hardware... as long as you don't make the mistake of plugging it into Windows and letting it install a driver.

I think OP was referring to CDs, which AFAIK don't have DRM.

My link is to the CD DRM!

This is rather misleading. Standard CDs as sold had (and have) no DRM.

The scheme you link to is intended to prevent further copies of CD-Rs but you can copy a CD you bought as often as you like.


Unless the CD comes with a root kit that interferes with that copying. https://en.wikipedia.org/wiki/Sony_BMG_copy_protection_rootk...

I assume someone typed it in (possibly on a mobile device with autocorrect) rather than copy & pasting it (which you would have to do twice, for the URL and for the title).


> But they fail to correct the belief that people naturally form given what is placed before them: that the proffered reconstruction of ancient sculpture is roughly what ancient sculpture actually looked like.

I'm pretty sure many museums with reconstructions of classical statues have a note on this topic somewhere on a plaque beside the statues - but who reads those?


I fail to understand the point of even having those reconstructions there if we are fairly certain they looked nothing like the originals. Making them pure white seems less dishonest.


I do agree, but there is still a valid logic behind what is shown, because it's only using the pigments for which there is direct evidence on the statue. But to stop this confusion, maybe there should be three versions of each statue at these kinds of exhibitions (assuming these are all replica castings and they're not re-painting the originals!): a blank one to appreciate the unpainted form, a reconstruction of the base layer that only has the pigments found in the crevices (what this article is complaining about), and then an artist's impression of what it probably looked like properly shaded (given that we have evidence of painted statues, as shown in the article).

Then you could still have the evidentially "pure" one, but also have a more likely rendering to reduce confusion.


If anyone else were generous enough to offer a free IDE with free AI code completion, they could give VS Code a run for its money, but as far as I know this hasn't happened yet? Zed, for instance, is available for free and very AI-centric, and you can use it with any of the popular LLMs, but you still have to pay for the LLM...


Bring your own AI token is totally fair, given the cost of AI.

The annoying thing is removing a perfectly working IntelliSense default.

If JetBrains removes their on-device (non-AI) code indexer and suggestion systems, then I will no longer be a paying customer, for example. Despite being an All Products Pack user for the last... idk, 15 years?


IntelliCode is being removed, not IntelliSense. IntelliCode seems to have been a (free) version of Copilot using local models.

Intellisense, which is an “on-device (non-AI) code indexer and suggestion system,” is still in place.

