
People have been making this argument to me about Linux for more than 25 years. The most cutting version that I ran across was:

> Linux is only free if your time is worthless!

Something never quite sat right with me about this argument, and your comment finally made me understand what it is: the understanding you gain from tinkering is priceless, and it's exactly the experience that you use to help everyone around you: it turns you into an expert.

So yes, I may just want to turn the key and have my car work. But when it doesn't, I often wish I were the guy who had tinkered with his car, so I could better understand what was wrong and whether I could fix it myself or needed a professional.

I run Linux on all my machines, and my family (both sides) generally uses Macs, but after all those years of tinkering with Linux, they still come to me for help with the Mac machines they insisted would Just Work.

All that out of the way, I agree with your fundamental premise: hackintosh is likely in the rear view mirror for the next generation of tinkerers.




I think there's a difference with Linux, because it's something you own and control and can dive into and see every part of. I hate investing time in proprietary technologies, because I know I can be stopped or locked out. With open source software, simple electronics, old cars, fabrication and woodworking, the time I spend learning feels worthwhile.


Even this "I hate investing time in proprietary technologies, because I know I can be stopped or locked out" is a hard-gained insight. Hackintosh is one of those things that made me understand this. Nothing like spending weeks to get your hackintosh working smoothly with all the hardware just to find out that the next update breaks everything. I've come to see it as a necessary part of the journey


This is my current state of thought. Proprietary software perceives me as an enemy who needs to be locked out of as many features as possible so that more money can be extracted from me while the least possible amount is invested back into the product. The only timeframe where proprietary software is groundbreaking and at the forefront of technology is when it has not yet captured and locked in a large market share.


In my experience, doing a hackintosh actually teaches you that Apple hardware is not that special and macOS works only because they make it easy for themselves.

Then it becomes clear that if you don't have an absolute need for macOS, it is not worth the trouble, since Windows/Linux actually make better use of the hardware with little trouble in comparison. By extension you develop a feeling that desktop Macs are really overpriced and don't have much of an advantage in the Apple Silicon age, since efficiency doesn't get you much on a desktop and the performance delta for a given price is insane.

In fact, buying a PC that is equivalent to a base Mac Studio will cost you 1k euros less, even if you go with "nice but not that necessary" things like 10G networking (especially unnecessary for a personal computer).

But yeah, you also learn that it's better not to waste time trying to conform to Apple's agenda, but that's also true for real Macs in my opinion.


This is a great point. I sort of detest becoming an expert at proprietary stuff, because I know they'll just change it before long. I've lamented about this elsewhere as modern software creating "permanent amateurs". Even those that want to invest in expertise often find their knowledge outdated in a handful of years, and those that don't want to invest can easily justify it by pointing out this effect.


Microsoft, at least before Cloud happened, supported their tech stacks with backward compatibility for decades.


Proprietary or not, tinkering helps you develop an intuition of what might be wrong.


Meanwhile, the article is clear about how proprietary code absolutely prevented the author from understanding why the Wifi and Bluetooth failed with specific apps.


Yeah I mean, whoever made the original statement is just not an OS engineer.


> just not an OS engineer.

Or not just an engineer


I know plenty of people with stamps who don't care to fiddle with their OS or change their own oil. People who work on putting things in orbit and beyond, people who build bridges, people who design undersea robots and airplanes. They're most definitely engineers.


Yeah fair.


Nah I can believe they'd be a chemical engineer or even a software developer that writes iOS apps or something like that.


I wanted to say steam/power engineer, but even they understand the value of tinkering.


This is the reason I still buy older cars. I can't stand owning a car only to find out that I can't work on it myself. Even if I don't have the time or tools needed for a specific job, if it's something I could do on my own, it means the job should be that much easier and cheaper to have a mechanic do.


I fully empathize - and yet, there are benefits from tinkerers/hackers messing around on proprietary hardware/software. Hackintosh - and similar communities - led to projects like Asahi Linux, Nouveau, Panfrost, etc.


> I think there's a difference with Linux, because it's something you own and control and can dive into and see every part of. I hate investing time in proprietary technologies, because I know I can be stopped or locked out.

The problem with this approach is then you get a generation of engineers with tunnel vision thinking the One True Way to achieve your goal is the same way your GNU (or whatever) software did it.

Invest time in learning your technologies, whatever they are. There's valuable knowledge in proprietary stuff just as there is in OSS.


I agree with your point in principle, and yet I installed Ubuntu on my work laptop this January after using Windows professionally for my entire (5 year) career. I've found myself moving in the opposite direction from the person in the root comment, because I find that it's getting harder and harder to find tolerable proprietary software. It feels like everything is glacially slow, laden with ads and tracking, reliant on an internet connection for basic functionality, or some combination of the above.


"There is valuable knowledge worth learning in the technology" != "this is strictly better software on every axis and you should switch to it for your daily work"


As someone that learned to program on BSD and shortly thereafter, Mac OS X and Linux....

I honestly don't know how people use Windows machines as a dev environment 24/7. It would drive me mad. Everything's so wonky and weird. Everything from symlinks to file permissions is just backwards and fucky.


Back in the day it was alright because Microsoft gave you a fairly good dev environment in the form of Visual Studio, with the focus of it being squarely on desktop application development instead of tinkering with the system or running web services. It didn't stop people from doing it anyways but it's part of the reason why everything is so janky. Then the web took over and Microsoft tried for ages to make .net and Windows Server work until they realised they can't tune an OS that was never meant for backend development and just put all their focus on WSL. In the year 2024 there is almost no reason to be doing any non-desktop dev in a Windows environment unless it's on WSL. And you get the benefit of having an actually sane window management system and external display handling unlike MacOS, not to mention how nice PowerToys is.


I mean this in the nicest possible way: 5 years is likely not long enough for the “just work, stupid” desire to really, really, really set in. Nor is a couple of months enough time for the potential rough edges of desktop Linux to set in.


Given that I've been using Ubuntu on the desktop since I was 11, I'm not worried.

The reason I switched was because Windows didn't work. Win11's desktop makes early-2010s KDE look like a smooth, bug-free experience. My laptop (a 10th gen X1 thinkpad) was plagued with driver problems. At least twice a month, I'd have to reboot when I joined a meeting and discovered my mic wouldn't un-mute. Switching to Ubuntu solved both of these problems, and I don't have to deal with an awkwardly bifurcated environment where a couple of my CLI tools run in WSL while everything else is native. Oh, and my Zephyr build times are a good 25% faster now.


After 17 years of using Linux I realized that I was tired of tinkering with shit, so I caved and bought a macbook air. Not even two years later I was back on Linux, because I realized that the amount of tinkering I do on Linux is actually very small; the experience I already paid my time for means that Linux is simply easy for me to use, while MacOS is a pain in the ass in innumerable small unexpected ways. The path of least resistance, for me, is to continue with Linux.


I work in IT, so I’m paid for my time to solve all kinds of issues with Windows. At home, such issues are unpaid work. Linux has the advantage of having issues be mostly of my own choosing. Stick to the golden path and you’ll hardly ever have them. And the easy configuration and recovery options allow you to jump into a new install with minimal hassle.

Everyone will have the same headaches with Windows as Microsoft’s choices are required these days. Millions of people have quite lucrative jobs solving them. I’d rather not bring work home so I run Linux.


I’ve been using Windows throughout my childhood and start of my CS career - now I use Windows for specific software (audio/music) and Linux for developing (about 8 years I guess). I had a 1-year stint with macOS because I was developing an iOS app, and have been the troubleshooter for people with macs at my previous job, so I consider myself somewhat ‘multilingual’ when it concerns OSs.

As a power user, Linux is just so much nicer. I constantly get frustrated, especially with macOS, about stuff that I can't easily change. In Linux my stuff works and if it doesn't it can be made to work (usually). In Windows/Mac it'll often take considerable effort to make the system work the way I want, or it's just not possible.

I think with proprietary software ‘it just works’ is only a thing if you’re happy with the basic experience that is tuned to the average person. If you have more complex needs, you should be using Linux (and if you know your stuff or use the right distro, things will likely also ‘just work’).


FYI, Ubuntu is a heavily advertised distro. It's pretty bottom-of-the-barrel for quality.

If you want a modern Linux distro, try Fedora Cinnamon or something that isn't on the Debian branch.


It is not surprising that you posted this flame bait from a throwaway account.

What is wrong with Debian?


[flagged]


It is _stable_, not outdated. You are practically guaranteed that if you’re running Debian Stable, and live only within the official apt ecosystem, you will not have software-based instability.


Grandma doesn't care about this. They just want their screen to work.


Debian Sid makes a better desktop distro than Ubuntu. The drivers are up to date, the instability is greatly exaggerated and installing nonfree codecs is easy (so easy with virtually any distro that it shouldn't even enter into the equation...)

This said, I prefer OpenSUSE Tumbleweed, which is rolling release yet more stable than Sid. Rolling release + extensive testing + automatic snapshotting gets you the best of all worlds.


Bruh. I use Debian and Arch interchangeably, barely notice the difference.


Yeah, that's why when I update my Arch MacBook Air once every year or two it works well, but Debian dies and needs a reinstall for some unknown reason. Before that, I believed Debian was very stable. My experience shows the opposite.


Frankly there is no value in learning user-hostile proprietary technologies in a way that the owner of said technologies actively wants to discourage and prevent.

Like, learn the proprietary tech in the environments it's intended to be used in, but if you can't use it in that environment I personally wouldn't waste my time with it. With FOSS tech at least you can make the argument that you learn stuff by maintaining it properly, but with a proprietary stack in an unsupported and actively user-hostile environment, the best you are going to do is learn how to maintain a fragile truce with the software gods.


Peeling away all the politics / idealism from your comment, the value proposition between these two options is basically the same, with the difference being that on a proprietary stack there's a higher chance of things breaking in a way that you have low/no likelihood of fixing. It's all well and good that it seems like this makes you personally want to throw up in your mouth a bit or whatever, but you are claiming objectivity that clearly isn't here.


Yeah I'll learn as much as I absolutely have to in order to get my paycheck. Any more and you need to give me a raise.


That's not a good way to make money. It's not how FAANG pays people, and if it is how your employer pays people then you should always be learning so you can change to better jobs.

A funny thing about "never work for free" advice is that a lot of highly paid jobs (investment banking, high end escorts) are about doing tons of client work for free in a way that eventually gets them to pay you way too much when they do pay you.


I learn the interesting stuff, I just don't learn proprietary tech that I really don't ever want to be dependent on for my wages.

In fact most of the essential skills for my job I've learnt in my own time, and continue to learn. I invest my own money in equipment and training courses. I love learning. But only when it's interesting to me, not because it'll make somebody else more money. If it'll make you more money, pay me.


> Frankly there is no value in learning user-hostile proprietary technologies in a way that the owner of said technologies actively wants to discourage and prevent.

Security research. And, uh, applied security research.


This is a gross misunderstanding of the GP's point though. It's not that they are against doing any of these things. In fact, they said they were more than happy to do it in their youth. I am in full agreement with the GP's sentiment as well.

Mucking about and tinkering with things while one has the time, desire, and stuff to learn is a young "man's" game. I did all of that and absolutely learned a helluva lot. It did everything I needed from it. I got a cheaper/better computer than I could otherwise afford. I learned a hell of a lot about not just the hardware pieces I chose, but also why/how certain things in the OS work that I never would have learned otherwise.

But now, I too just don't care. It was interesting, but I'm not that interested about maintaining an OS or how it works. I just want it to work. So for all of those that are willing to do all of that today, I'm all for it.

your comment came across to me as just another one of those "if you don't feel the same way i do, you're wrong". that's not true. people can just be in different places in their life. been there, done that does not mean you can't go there and do it too. we're just focused on different things now


There’s another perspective: even if OP is done, if we shut the door (or let it be shut by companies like Apple) then the currently-young won’t be able to tinker and won’t grow to gain the same knowledge.


They are free to continue that kind of work, it just gets harder. Look at Asahi Linux. While it might not be Hackintosh in the same sense, it is the same spirit. Hackintosh worked because the systems were built on commodity hardware. Now that Apple is using custom chips, they've definitely made it a bit more difficult, but in my experience that just brings out the really talented that step up to the plate to take a swing.


I agree that tinkering is a side effect of curiosity, and that curiosity leads to expertise, which has value.

I parlayed my curiosity in hardware into my first job. (My car-fixing skills alas didn't take me anywhere.) Hardware was fun for the first 10 years of my career, but now, well, it's just not interesting.

I played with Linux as well along the way, but I confess that too has dulled. Building your first machine is fun, building your 10th is less so.

The past couple years I've gone down the solar energy rabbit hole, and I'd love a wind turbine (but I just can't make the economic argument for having one.) If I do end up getting one, it'll be to prove to myself that it was a dumb idea all along.

In some ways we never stop tinkering. But the focus moves on to the next challenge.


> Building your first machine is fun, building your 10th is less so.

Building a Linux box led me back to Apple.

I had been using UNIX at home, school and work for several years, and decided it was time to build my 3rd Linux box. Went to CompUSA out of idle curiosity to see what equipment they had, and the only computer in the store with Internet access was a Mac.

I hadn't used a Mac since the SE/30 days, and I suddenly realized that the NeXT acquisition, which I'd mostly ignored, had changed everything. Why build a Linux box and be locked out of tools like Photoshop when I could have a UNIX workstation that ran commercial software (for, admittedly, significantly more money)?

Never looked back.


> Why build a Linux box and be locked out of tools like Photoshop

That's what VMs are for. You're never really locked out. It may not make sense to go that way if Photoshop is THE thing you work with of course.

> when I could have a UNIX workstation that ran commercial software

Because for lots of software MacOS is a second class system. Partially because there's just no way to test things on it without investing lots of money in hardware, so many people don't.

If you're doing lots of sysadmin / software maintenance style work, MacOS just provides unnecessary pain.


> That's what VMs are for.

For me, the psychic angst of using Windows is much, much worse than any Mac-related inconvenience.


> If you're doing lots of sysadmin / software maintenance style work, MacOS just provides unnecessary pain.

Amazingly a significant amount of the software that you use on a daily basis, perhaps unwittingly, is developed and maintained with macOS and Windows!


I'm working on packaging things for Darwin platform. And helping people deal with homebrew/compilation issues. I'm painfully aware how much is developed by people with no access to or interest in MacOS. And unless something targets windows explicitly (not wsl), you can basically expect issues going in. In a twisted way, I'm one of the enablers of the current situation where things are usable on a mac.

Sometimes you can tell by the simple fact that the git repo contains files in one directory that conflict in naming. Linux has no issues with "Foo" and "foo" coexisting.


Nothing to do with Linux, and everything to do with the case sensitive filesystems common on UNIX.

macOS uses a case insensitive filesystem by default for backwards compatibility with HFS+.

You can turn case sensitive on HFS+ and APFS if so desired for the "Linux" experience, via Disk Utility or the equivalent CLI tool.

And if looking to have some fun on an ecosystem that doesn't expect it, you can equally turn it on on NTFS, via fsutil.
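
For reference, the command-line versions of those switches look roughly like this (the container/path names are placeholders, and you should double-check before pointing them at data you care about):

  # macOS: add a separate case-sensitive APFS volume alongside the default one
  diskutil apfs addVolume disk1 "Case-sensitive APFS" CaseSensitive

  # Windows: per-directory case sensitivity on NTFS (Windows 10 1803+)
  fsutil.exe file setCaseSensitiveInfo C:\src enable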


> > I'm painfully aware how much is developed by people with no access to or interest in MacOS. [...] Sometimes you can tell by the simple fact that the git repo contains files in one directory that conflict in naming. Linux has no issues with "Foo" and "foo" coexisting.

> Nothing to do with Linux, and everything to do with case sensite filesystems common on UNIX.

Of the three major desktop operating systems (Microsoft Windows, Apple macOS, and the Linux family), only Linux has case sensitive filesystems by default. Therefore, it's likely that someone who didn't care about filename case conflicts was running Linux.


Indeed, why get a lesser experience with Linux laptops, when I can use Apple and Microsoft platforms, and use Linux in a VM when I really need to.

The Year of Linux Desktop is delivered on a desktop VM.


If that works for you, great. My default works better with Linux with only occasional other system. Makes me least angry. (Also because Linux is the only system that handles sleep/hibernation for me without issues, ironically...)


I think the awkward part of your first post is that you appear to start with a value judgement that tinkering is for poor people whose time is worthless. That's not remotely fair to either poor people, or rich people who like to tinker. No one's time is worthless. Not your time. Not mine. It's all just time.


Fair enough, and no, I didn't mean to imply that anyone's time is worthless. It's not the value of time that changes, but the amount of it you have.

In a work context a shortage of time (more customers than you can handle) means you need to discriminate, which means you can't make everyone happy. Which usually means differentiating based on value. (Aka, you get more expensive.)

For personal time you also become more discerning. Spend time with spouse, or build another computer, or lie under a car etc. Life has more demands, so there are more choices.

Incidentally, one of those choices is to work less.

The tinkering never goes away, but I prefer to tinker in profitable areas now. (I get to tinker for work.)


All my PCs and servers run Linux, and it's certainly not out of some idealism or anything. I'm fundamentally lazy, but I have a high standard for how things should be. As a result, I tend towards the highest quality, lowest cost (time, money, etc.), and that's Linux for me. Specifically, the setup I run on almost all my machines, which is the most optimal way I have found to write and run software, and play games.

If Windows were easier to use, more stable, less of a hassle, easier to fix, I would use it, but it's none of those (for me). When I have a Windows problem, I can either try magical incantations to fix it, reinstall, or give up, and each of those takes much longer than most things I could possibly do on my Linux systems. Even if my Linux box fails to boot, the drivers break and my SSD doesn't mount, all those fixes together take less time and effort than finding a fix for the most trivial of Windows problems.

The most trivial problem on Windows has been that the right click menu doesn't fully populate on first right click. I reported the issue, and that's all I can do. It's been a year and nothing has changed.

On linux, a less trivial problem (a calculator crashing with a series of very weird inputs) was solved by me opening it in gdb and fixing the code, making a PR and having it merged.

I guarantee a lot of people are on Linux because it's easier, and for no other reason. I don't need it to "just work", because I will break it. I need any possible fix to be possible in bounded time.


Windows has been disconnected from user needs for a very long time. Any logical person would've put a "right click" icon in the Control Panel that would give the user full control of what does and doesn't appear in the menu, their order, etc.


I also use Linux on all my machines but that's because (perhaps after years of tinkering) it is currently the most turn-key laptop/desktop OS. Things just work, they don't break without a good reason, and weird limitations don't randomly pop up.

Windows at work, despite being maintained by professional helpdesk staff, or Macs my family have, with all the ease of use designed by Apple in California, are not like that.

Just the other day I tried to download an mkv file over https on a Mac and I couldn't get it to exceed 2.5 MB/s. Same network, same server, my laptop breezed at over 20 MB/s and Apple took out that walker for a stroll at a very leisurely pace. It didn't come with `wget` either.


If you sincerely believe this, you've tinkered enough that the massive knowledge barrier that is Linux seems like nothing to you.

I would never sit my 70 year old mother down in front of a Linux machine. We're not at "caring that video files download too slowly" - we're at "how do I put a file on a USB".


Put USB stick into computer, click on "Files" in the program chooser, select the USB drive (helpfully listed as "USB drive" even), drag your files there?

Same as on Windows and MacOS really. I don't dispute that Linux has rough edges, but putting files on a USB stick is not one of them tbh.


It really works very well for my father-in-law and he's over 75 now. Debian gives me a peace of mind I would never have with him using Windows.


I have very little Linux sysadmin knowledge and have been using it on my home notebook for 5 years and on my work one for two years now.

Really no issues with the OS.

I was using the excellent 2015 MacBook Pro before, and despite my current hardware not being quite as nice (not bad though), I can't go back to Mac OS. I know I pay a premium to get Linux preinstalled over Windows, but it's not bad.


I do sit my 75-y.o. mother in front of a Linux machine, and it's fine.


> Something never quite sat right with me about this argument, and your comment finally made me understand what it is: the understanding you gain from tinkering is priceless, and it's exactly the experience that you use to help everyone around you: it turns you into an expert.

I have plenty of other things I’d rather tinker with and become an expert on, though. My computer is a tool to let me work with those things. It’s not fun when I have to debug and fix the tool for hours or days before I can even start working on the things I want to work on.


This is me. The range of things I want to tinker with has grown. Various house projects, jiu-jitsu, cooking, etc... are all things I tinker with and learn from. Building computers, I've done and don't feel the need to do again. I even built a Gentoo install long ago when I was learning the nuts and bolts of linux.


Exactly. Why do I want to be neck deep in some XML config hell when I could be playing music?


> Linux is only free if your time is worthless!

This argument is quite out of date. You'll lose a whole lot more time on forced Windows 10/11 updates than you'd spend managing a reasonable Linux installation. ("Reasonable" meaning avoid things like Arch or Ubuntu, and pick decent, natively supported hardware.)


That argument doesn't sound very convincing to me. How would I know that avoiding Ubuntu is reasonable? That still seems to be the go-to distro for many people I know who like to use Linux but aren't Linux experts. How do I know which hardware is natively supported?

With Windows 10/11 I’ve never had any problems, either with pre-built computers or my home-built PC. Hell, running Ubuntu in WSL has been relatively smooth as well.

My experience with Linux as an OS has been fairly good for many years, regardless of the distro. It’s the applications that could be an issue. Feels like it’s only very recently (post Steam deck in particular) that gaming seems to be viable at all. And it’s hard to beat the MS Office package for work. I recently got the idea to have two user accounts on my home computer where I have an account dedicated to working from home, logged into my office 365 account from work.. and it was honestly amazing how suddenly everything was just perfectly synced between my work and home computer.


If you have recently endured Windows Update for Patch Tuesday, you know that you are forced to reboot during this process. This activity will deny you "the five 9s," i.e., 99.999% availability in uptime.

If you have recently performed the analogous activity on a Linux distribution, which is likely either apt update/upgrade or yum update, you will notice that a reboot is not required. These update approaches cannot alter the running kernel, but ksplice and kernelcare offer either free or low-cost options to address that.
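
As a concrete sketch, the whole Linux-side ritual is usually something like this (the reboot-check mechanism varies by distro; the flag file below is the Ubuntu/Debian convention, and needs-restarting comes from yum-utils/dnf-utils):

  # Debian/Ubuntu family
  sudo apt update && sudo apt upgrade
  # on Ubuntu, a kernel or libc update leaves this flag file behind
  [ -f /var/run/reboot-required ] && echo "reboot recommended"

  # RHEL/Fedora family
  sudo dnf upgrade
  # reports whether a reboot is advisable
  needs-restarting -r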

Windows update is enormously painful compared to Linux. There can be no argument about this.


> This activity will deny you "the five 9s," i.e., 99.999% availability in uptime.

Which is something 99% of personal computers don’t care about even slightly. These days restarting your machine is a very inconsequential event, your browser can effortlessly reopen all the tabs you had active, macOS will even reopen all the windows for your native apps.

I don’t mean to defend Windows Update, I just think “you have to restart your computer!” is not a particularly good reason to damn it.


Windows update is agony compared to apt/yum.

A complete patch Tuesday session is twenty minutes of reduced performance, followed by a "don't reboot your computer" of unknown time both before and after the reboot.

Anything is better than that, especially when some updates either reboot immediately or kindly give you five minutes to close everything down (was tmux made precisely for Windows update?).

Exposure to apt/yum really makes Windows intolerable, just for this alone.


> especially when some updates either reboot immediately or kindly give you five minutes to close everything down

I have been a Windows user since XP. Never, not even once did Windows decide to reboot without asking first. Never.

The only way this could've happened is if Windows kept asking you over the span of a week or two to restart to apply the updates and you kept postponing it.

Either way, "Hot Patching" will soon be a thing on Windows so restart won't be required every month [1].

[1] https://www.windowscentral.com/software-apps/windows-11/micr...


I'm on a corporate desk/laptop, and I'm guessing that happens about three times per year.

That puts you in a tmux habit.


> Which is something 99% of personal computers don’t care about even slightly

to the point that I know people that still turn their computers off when they are not using them.


Let's get this out of the way first: "the five 9's" is not a requirement for personal computers. That argument therefore is invalid.

But even then, Microsoft is testing "Hot Patching" windows installation so critical updates install without requiring a reboot [1].

When that comes out, I wonder where the goalposts will be?

[1] https://www.windowscentral.com/software-apps/windows-11/micr...


What are you doing on a desktop computer that can only be off for five minutes a year?

A laptop is even dumber to complain about, because they're (supposed to be) suspended every time you close them.


That would be a firing offense at my company. Company files stay on company hardware. Personal files stay on personal hardware, and never should the two meet.


I would never work at your company. I use my own tools, thank you.


My personal vim config on a company laptop? No problem whatsoever, neither for me, nor the company.

A bittorrent client without preauthorization with IT and security? It's basically asking to get fired.

My vacation photos on a company laptop? Tricky - not a huge deal but not recommended. Better upload them to your cloud backup quickly.


Yup, you get it, exactly! It's not a surveillance state, but don't be stupid, and certainly don't LEAN into it.


Your own tools are your own personal files?

Interesting. How do your vacation pictures help you do your job?


Not the OP, but personal files are not just vacation pictures. I work in R&D and I have my org-mode/roam on various scientific and technical topics going back 15 years or so. I use these for work to benefit my current company, and maintaining two parallel versions of these is rather inconvenient.


Isn’t that exactly what a cloud drive is for? There’s a difference between using your personal notes for business purposes on the one hand, and keeping company property and data on a machine totally outside IT control. That’s just a massive lawsuit waiting to happen, and it’s bad for the employee too - why would you want the liability?


I wouldn't store company data or code outside of approved services, but one might say that my notes, including notes on the people I meet and projects I work on, can constitute proprietary information - so yeah, it is a bit of a grey area still.


I am speaking of company property in only the narrowest sense, i.e. physical objects and IP (and I guess property, but I've been WFH for a decade, so...).


I don't want or accept IT control of my personal machine.


> How do your vacation pictures help you do your job?

This question is why I don't want the company laptop.


Are you required to maintain PCI compliance? Do you touch customer personal info?


That may be sensible if you want or need stronger security and isolation.

However, many companies do support BYOD, especially on mobile where it's a pain to carry two phones around.

There is some support for this. For example, Apple supports dual Apple IDs and separate encrypted volumes for personal and corporate data. Microsoft apps (Outlook) also have some support for separating personal and corporate data.

The benefits of BYOD can include lower equipment costs, lower friction, and potentially higher employee happiness and productivity.


Mobile is a totally different story, to me. The security model allows them to be compartmentalized in the way a desktop never could be.


> How do I know which hardware is natively supported?

You buy preinstalled. Works for me.


Yeah, preinstalled. And I never had issues with Ubuntu breaking in the ways Arch or Gentoo do. Breaking includes trying to install some new thing or upgrade and having random other stuff that has to be googled.


That is patently wrong. I run Fedora on my Framework because it is the most supported and recommended distro for it and I mostly just need a web browser for most of the things I do on it. I've had kernel upgrades break wifi completely, the fingerprint reader doesn't work properly out of the box, 6GHz Wifi isn't supported (though neither is it supported in Windows 10), VLC (which I hate using) is the only media player that supports playing from SMB shares on Linux, Wayland isn't compatible with Synergy type software (and my web browser doesn't work well with xorg), etc.

Most of these things worked without any fuss in Windows and I can't think of any notable Windows issues I had to deal with on the laptop before I installed Fedora.


This is a great linux post because while taking the time to type out distros to avoid is worth it, saying what distros to try is not.


This is 100% false.

I have been running Ubuntu, then Arch, as my daily driver 2004-2017. As I started as a consultant working for Western companies, I thought they would care about me being clean copyright-wise, so I went 100% Linux. This was obviously not so, but what did I know? I deeply regret doing this now. (I was dual booting before.)

With Ubuntu, upgrades every six months or so meant you were better off reinstalling and reconfiguring -- no matter which way you went, it was 2-3 days of work lost to tinkering with the system. With Arch, the whole system doesn't shatter; it's just that this and that don't work, and it's frustrating. Bluetooth and multifunction scanner-printers were at the forefront. In fact, I needed to sell a perfectly working Samsung MFC at one point because Samsung ceased to make drivers, the old ones didn't work with newer Linux, and while open source drivers eventually surfaced, that only happened years later. Let's not even talk multimedia. https://xkcd.com/619/ is ancient but the priorities are still the same.

Neither system was great at connecting to weird enterprise networks, be it enterprise wifi or strange VPNs. At one point I was running an older Firefox as root (!) to be able to connect to the F5 VPN of my client, because the only thing supporting 2FA for that VPN was a classic extension -- and the binary helper disappeared in the mists of time. The only Linux-related discussion was ... the IT head of my client asking how to connect Linux to his VPN now that he had turned 2FA on, and being told it doesn't work. https://community.f5.com/discussions/technicalforum/linux-ed... Well, I made it work, but faugh.

I have been running Windows 10 + WSL since January 2018 and all is well. It reboots sometimes while I am asleep and that's about it. You need to run O&O ShutUp like once in a blue moon. Right now I am on Win 11 as my primary laptop is being repaired; you need to run ExplorerPatcher but that's it. It's indeed been six years, and there was never an update where the OS just didn't start up or a hardware driver decided to call it quits after an upgrade.

Also, updates are not forced; I control my machine, thank you very much, via Group Policy.



I have been a Linux user since 2006, Ubuntu then Arch.

Bluetooth mouse, keyboard, headphones, controller: all work. Intel iGPU works, including hardware accelerated video in browsers. VPN: Pritunl worked without issues, Perimeter 81 initially failed, works with an update.

Wayland, Pipewire, Wine, Proton - the Steam Deck is a widely successful multimedia device. Priorities are the same, and NVK has joined the open source drivers.

Linux does not connect to "enterprise wifi or strange VPN" - ok.


> avoid things like Arch or Ubuntu

which one would you recommend?


Well I'm just a rando, and you didn't ask me, but I agree with the sentiment, so: Fedora. Or openSUSE. I'd be more comfortable giving a newbie Fedora.

I was a Debian devotee for nearly 25 years, but I've found it to be less foolproof and fault-free lately, and it has always lagged behind current package versions in Stable, forcing you to run Testing (or -backports) or even Unstable to get newer versions-- with corresponding potential for breakage.


Debian Stable was very out of date 25 years ago, but ever since mid '00s (after Ubuntu got popular) it improved by miles. Debian Stable is akin to Ubuntu Stable LTS. Ubuntu Stable non-LTS is a 6 month snapshot from Debian Testing, does not get supported for long. If you run Debian Unstable, you're probably running something akin to a rolling distribution. What is best all depends on your goal and purpose of the task. Personally, I very much like the Debian ecosystem and would prefer any Debian(-based) OS. However these days, Docker can trivialize a lot (and also mitigates your mentioned issue), ZFS and other filesystems allow to rollback in case of issues (useful on a rolling distribution, but also on Debian Unstable), and hypervisors allow snapshotting, multiple OSes, and all that, too.

For a server I'd recommend Proxmox (especially since ESXi is now only for enterprise). From there, have fun with different OSes.

Proxmox on a desktop is a bit meh, but possible. There's a lot of useful Linux desktop OSes out there. For example if you want to perform pentesting you can use Kali Linux. The one which interests me most from a security standpoint however, is Qubes OS (Fedora-based, sadly, but you can run anything on top of it). For gaming, SteamOS is neat (Arch-based, these days) and could even be fun to have a kid play around with Linux, too.

As for macOS, I played around with Hackintosh a couple of times in the past with success. But I never liked it much because you'd lag behind on security patches, and every new update would be praying it'd work. I did get it to work under Proxmox though, that was fun, but had to install a second (dedicated) GPU for that. The latest M-series ARM-based Macs work very well; the only disadvantage is the fat price for RAM and SSD upgrades (often even soldered!). That part is terribly sad.


This is absolutely false. I run dual-boot Windows and Linux on hardware that has 100% Linux support. Windows just works, the same cannot be said for Linux unless all you do is use a browser and listen to Spotify.


There are pain points on both. Audio on Linux is still annoying if your system isn't very vanilla, while Windows sucks at bluetooth, configurability, and has a lot of annoying anti-user "features".


Windows does not “just work”. On my work computer my programs randomly rearrange themselves after lunch, windows always has trouble switching between my audio devices, random slowdowns. Windows is pretty shit these days tbh. It’s pretty much like Linux was 10 years ago.

However, I rarely have issues on Linux anymore, mostly because if something is broken on Linux, I can fix it.

Frankly, I hate that I'm forced to use Windows at work. I feel like I need to constantly deal with BS Windows annoyances. When I go home and work on Linux it's like breathing a sigh of relief. My desktop actually feels fast and efficient.


> On my work computer my programs randomly rearrange themselves after lunch, windows always has trouble switching between my audio devices, random slowdowns

> I rarely have issues on Linux anymore, mostly because of something is broken on Linux, I can fix it.

Perhaps your Windows knowledge is not up to the level of your Linux knowledge? It might be that a Windows expert could fix every issue you’ve listed and more.


I'm a long time macOs user at home (pre-X).

I've worked daily in a Windows enterprise environment for 15+ years (which means that when it won't work I usually "just have" to get help from a colleague).

I've been in charge of a debian/postgresql cluster for 10+ years, which I managed to keep upgraded on a reasonable schedule.

But yet, since Windows updates on my home gaming PC stopped working two months ago for some utterly opaque random reason, I feel totally clueless about how to even begin to debug this crap.

There seems to be absolutely no clear working procedure out there to fix that, only people with the same problem shouting out to the void. All them poor souls trying byzantine procedures that have been duplicated ad nauseam from stack overflow to windows help forums through reddit and back.

The consensus seems to be to reinstall Windows from scratch (choosing among a handful of methods whose risks/benefits look unclear).

That really pisses me off, but I guess it's the user's fault because "my Windows knowledge is not up to the level..."


That’s very possible, but I don’t want to invest time gaining knowledge in a proprietary platform. Microsoft already owns most of the default stack programmers use these days. I don’t want to contribute my energy to entrenching them further.


> It might be that a Windows expert could fix every issue you’ve listed and more.

So in other words, it doesn't "just work."


Wasn’t that Apple’s tagline?


The original comment claimed that Windows "just works" while Linux doesn't. Which can't be more false in 2024.


Let me take a guess:

You have exclusively used Debian-family distros.

Try a desktop distro like Fedora. Debian-family is a server distro that got famous after Canonical/Ubuntu did marketing really hard.

Ubuntu is the Apple of Linux, they are famous from marketing, not quality.


I have used all distributions. They all have their own pain points. Debian-based distributions are actually the most painless in my experience.


> unless all you do is use a browser and listen to Spotify

So what exactly isn't working?


These have been pain points for me. Not saying they're impossible to solve on Linux, but it's nontrivial especially compared to Windows

Change trackpad scrolling speed

Set up suspend-then-hibernate

GPU drivers (I have a box with an AMD APU and no idea how to actually utilize it)

Many games (Proton is amazing and a huge leap forward, but on average it's still more work than gaming on Windows, e.g. fiddling with different versions of Proton or finding out that a game's anti cheat will ban you for using Linux)

Higher res video streaming (I think this is usually a DRM issue?)

Full disclosure: I'm posting this list because I'm hoping that someone will tell me I'm wrong and that Gnome actually has an easy way to set the trackpad scroll speed


> Change trackpad scrolling speed

If you're on X11, I think you'll have to use xinput to set it manually.

If you're on Wayland, in KDE at least this is available in the standard settings application.
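
If it helps, the X11 route looks roughly like this (the device name is whatever xinput reports for your touchpad, and which properties are actually settable depends on the driver):

  # list input devices and find the touchpad
  xinput list
  # show the properties the driver exposes for it
  xinput list-props "SYNA0000:00 Touchpad"
  # tweak one of them, e.g. natural scrolling
  xinput set-prop "SYNA0000:00 Touchpad" "libinput Natural Scrolling Enabled" 1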

> Set up suspend-then-hibernate

On KDE at least that's just one of the options in the power settings ("When sleeping, enter:" has "Standby", "Hybrid Sleep" and "Standby, then hibernate").
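
Outside of any particular DE, the systemd-level equivalent is something like this (the delay is just an example value):

  # /etc/systemd/sleep.conf
  [Sleep]
  HibernateDelaySec=60min

  # then suspend with
  systemctl suspend-then-hibernate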

> GPU drivers (I have a box with an AMD APU and no idea how to actually utilize it)

Worked OOTB for me, do you have amdgpu drivers installed? What exactly isn't working?

> Many games (Proton is amazing and a huge leap forward, but om average it's still more work than gaming on Windows. eg fiddling with different versions of Proton or finding out that a game's anti cheat will ban you for using Linux)

I find that Proton mostly just works for me, but indeed EAC is a problem that I don't know how to solve (and also don't really care about since I'm not into playing public multiplayer games).

> Higher res video streaming (I think this is usually a DRM issue?)

You should check if HW Acceleration is enabled in your browser, but IIUC Netflix will indeed refuse to provide higher quality streams to Linux (and also Windows depending on your browser), you might be able to resolve it by googling a bit, maybe using a browser with DRM support and switching out your user-agent?

> I'm hoping that someone will tell me I'm wrong and that Gnome actually has an easy way to set the trackpad scroll speed

Gnome is notorious for removing user choices, so I wouldn't be surprised if this was impossible on Gnome/Wayland. Xinput might work on Gnome/X11. Switching to KDE should work on Wayland ;)


Alas, I'm using Gnome. There's a setting for changing scroll speed with a USB mouse but not for a laptop's track pad. I don't see anything for standby-then-hibernate either.

>Worked OOTB for me, do you have amdgpu drivers installed? What exactly isn't working?

Based on their compatibility list[1], it doesn't look like amdgpu supports my hardware (Richland chipset). Most distros I've tried don't even boot unless I add "amdgpu.dpm=0" in GRUB.

[1]: https://wiki.gentoo.org/wiki/AMDGPU#Hardware_detection
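
For anyone else stuck on the same workaround, making the kernel parameter stick is usually just editing the GRUB default line and regenerating the config (this is the Debian/Ubuntu flavour; Fedora and friends use grub2-mkconfig or grubby instead):

  # /etc/default/grub
  GRUB_CMDLINE_LINUX_DEFAULT="quiet splash amdgpu.dpm=0"

  # regenerate grub.cfg, then reboot
  sudo update-grub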


> Alas, I'm using Gnome.

That is unfortunate, but at least that's not a difficult problem to fix ;)

On Wayland at least input stuff is IIUC solely on the compositor, so if they don't want you to control scroll speed, you won't.

For power, maybe the instructions on the Arch Wiki can help? https://wiki.archlinux.org/title/Power_management/Suspend_an...

> it doesn't look like amdgpu supports my hardware (Richland chipset)

I see, looks like your card is too old for the official open source amdgpu support, meaning you should either install the unofficial open source ati drivers as per https://wiki.archlinux.org/title/ATI or try the official proprietary drivers from AMD (which I assume will be too outdated to function on a modern kernel?).


Thanks! Turns out, I don't really need those things.


Not OP, but the fact that I have an easily accessible text file on my desktop with the exact commands to run in my terminal to recompile the graphics driver when upgrading packages breaks graphics again should speak volumes. I don't really mind, because running 3 commands in the terminal a few times per year is not particularly difficult for me. I could see it being difficult for non-devs though.

What does get annoying is when such an OS upgrade breaks the wifi drivers and I have to setup a bluetooth hotspot on my phone to access the github repo and fetch the latest driver version for the wifi dongle.
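
For what it's worth, the rebuild ritual for an out-of-tree driver checked out from GitHub is usually some variation of the following (the repo path and module name here are just placeholders; repos that ship a dkms.conf automate this on kernel upgrades):

  cd ~/src/wifi-dongle-driver   # wherever the repo is checked out
  git pull
  make clean && make
  sudo make install
  # reload the module (name is a placeholder)
  sudo modprobe -r 88x2bu && sudo modprobe 88x2bu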


> You'll lose a whole lot more time on forced Windows 10/11 updates

Utter fantasy.

They complete whilst I sleep, taking zero of my time at all.


At this point I feel like Linux may be more likely to just work than a windows machine. I just had the unfortunate experience of setting up windows 11, and the number of ‘please wait while we get things ready for you’ was truly astonishing.


It's not. You can go and pick up any computer that is currently on the market, doesn't matter if it's 300 or 3000 dollars; as long as it is a(n IBM) PC, it will run Windows.

Will it always be flawless? No. Will it always work perfectly out of the box? No. But it will work and generally you have a good chance of it working as you wish assuming you are fine with Windows and what MS does with it.

I bought an Asus Zephyrus G15 (2022) specifically because it was recommended to me because it is supposed to be great for Linux and it's probably the worst Linux experience I have ever had. As the first piece of hardware that I specifically picked for Linux support.

Because most DEs don't do fractional scaling but all high end laptops have too much DPI to not have fractional scaling.

Nvidia is still not providing proper Linux drivers.

Asus can't program to save their lives but the tools that replace the Asus stuff on Windows are still better than the stuff that is replacing the Asus stuff on Linux (asusctl/ supergfxctl vs G-Helper).

I once had a machine where the nvme drive was simply not working. That was when Kernel 5 came out. It broke on Fedora but worked in Mint until Mint got Kernel 5.

During my last Linux adventure, KDE just died when using WhatsApp as a PWA (where I live, WhatsApp is essential software to have a social life).

And even after years of Wayland being around, it's still impossible to have apps that aren't blurry in most DEs because X11 is still around.

You're complaining about software updates and user friendly loading screens. The issues that drive people away from Linux and to Windows are literally unfixable to 99% of the techies that try Linux. I'm not fixing an nvme driver in the Kernel. That's not my area of expertise. But I still need my machine to work and on Windows, it does.

Rufus lets you create an install USB that skips most of the Windows 11 nonsense btw.


Good for you.

I’ve had literally zero of any of those issues in my past 4 years of using Ubuntu.

I had a hell of a lot in the past, so I trust I can judge when it’s reached “better than windows” level.


I think that everyone knows that's a pretty ridiculous statement. Installing Windows 11 is basically putting in a USB stick, waiting about 8 minutes, clicking a few things and typing out your login and password. I love Linux, first started playing with it about 20 years ago now. There's not a single dist I've ever seen that is that simple. Just a basic fact, sorry.


Now, that is a ridiculous statement. Installing Windows has never once been the smooth experience you describe. It's been long wait times, dozens of reboots, and never ending cycles of Windows Updates. Always has been for the last 20+ years.

Today, it's made even worse by the fact that MS is intentionally driving Windows UX into the ground in exchange for short term profits. Installing Windows isn't "clicking a few things." It's going out of your way to disable piles upon piles of anti-features MS throws at you, whether it be spyware, bloatware, or the hyper-aggressive nags to get you to act against your will. The lengths die-hard Windows users go to to "de-bloatify" their Windows installations these days are absurd.

It's true that Windows had a superior end user UX over Linux 2 decades ago. But that has changed with improvements on the Linux side and poor, poor decisions on behalf of MS.


You're greatly over-exaggerating how much effort it takes for a power user to set up Windows. I had to do it the other day on a Dell MiniPC (sadly couldn't use Linux since I needed HDR) and it's just the following.

1. Set up USB stick in Rufus with all the setup skips enabled
2. Select install options, skip key, next next next
3. Wait for it to install
4. Say no to MS account, put in username, password, and security questions
5. Wait for a reboot and setup
6. Connect to internet, run Windows update, reboot when done
7. Uninstall the few bloatware apps in the start menu; most of them are UWP so the uninstall button does it immediately, takes no more than 5 minutes
8. Disable web search from group policy
9. Install Windows Terminal, Powertoys, and another web browser

I could easily automate steps 7, 8, and 9 through powershell and winget if I wanted to. The total install time was less than 10 minutes plus the time it took for Windows Updates to install and I have a pretty clean environment.

In comparison, with Fedora running on Gnome I'd have to spend a solid amount of time messing with dconf settings to get fractional scaling to show up and for my touchpad to scroll at the correct speed + installing extensions to get a UX as good as Powertoys has by default, and on KDE I would need to spend the same amount of time messing with settings and installing KWin scripts to get functional tiling (although that might have got better since I last tried it).

Oh and on MacOS I would be up and running in almost no time, because there's no way to fix the absolute dumpsterfire of a UX it has so I don't even bother.

So all options kinda suck, Windows just sucks in its own ways.


From your comment it sounds like you affirmed GP's claims...


> There's not a single dist I've ever seen that is that simple. Just a basic fact, sorry.

You having that experience does not make it a basic fact.

I didn't even have to do the actual installation, as it was a prebuilt machine. The only thing I had to do was the 'clicking a few things and typing out username and password' part.

Comparing the two between Ubuntu and Windows, I’m forced to conclude that Ubuntu has the easier version, or at least faster. And windows has the advantage/disadvantage of needing my MS account to set up an operating system.


> There's not a single dist I've ever seen that is that simple.

It is that simple with Ubuntu and similar distros. It has been that simple for many years.


> Just a basic fact, sorry.

Ubuntu, Linux Mint and Elementary OS and I guess a few others will beg to differ. And it takes way less than 8 minutes.


As I found out this week, making the Windows 11 USB stick is far harder than it ought to be if you don’t have Windows already.


If you use UEFI, all you need is to copy the files from the ISO over to the USB stick.

Am I missing something?

(And the same applies to UEFI capable Linux-distros)


Windows installer images have some files too large for most tools to understand, and I believe the USB stick needs to be exFAT formatted too. Virtually any tool for making a USB stick would fail in various ways on macOS.


> Windows installer images have some files too large … I believe the USB stick needs to be exFAT formatted

That’s true. I forgot!

While it’s not part of the UEFI spec many (most?) consumer BIOSes will be able to UEFI boot from NTFS as well, so formatting as that might also be an option.

Both that and exfat should be easy to do on Linux. No idea about MacOS.

Which brings me to

> too large for most tools to understand

This I don’t understand. What tools? What tools do you need beside “cp”?


I don’t know if you can just copy the files over. It seems you also need to make the USB stick bootable?

There’s dozens of guides on doing this on a Mac, they all seem outdated. I found a tool called WinDiskWriter on GitHub and it was the only GUI tool that worked.

Suspect there’s more to it than just cp. There’s wimlib too for handling the larger files on the install ISO.
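
For the curious, the wimlib route from a Linux shell is roughly the following (device paths are placeholders; the stick is FAT32 so the firmware can boot it, which is why the oversized install.wim has to be split):

  sudo mkfs.vfat -F 32 /dev/sdX1
  sudo mkdir -p /mnt/usb /mnt/iso
  sudo mount /dev/sdX1 /mnt/usb
  sudo mount -o loop Win11.iso /mnt/iso

  # copy everything except the >4 GB install.wim
  sudo rsync -a --exclude sources/install.wim /mnt/iso/ /mnt/usb/

  # split install.wim into <4 GB chunks the installer can read
  sudo wimlib-imagex split /mnt/iso/sources/install.wim /mnt/usb/sources/install.swm 3800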


> It seems you also need to make the USB stick bootable?

That’s for legacy MBR boot. It has no function with UEFI boot.

Same for Windows as it is for Linux.


Interesting. This is the first traditional UEFI machine I’ve had. Anything prior to this has been Seabios/Coreboot in recent times.


TLDR: UEFI just checks in NVRAM for pre-existing boot-configs (stored paths to EFI-executables for installed operating systems) and if doing "dynamic" booting from some random media, it checks if the the volume has a EFI-executable for the given architecture (for instance in \BOOT\bootx64.efi for Intel x64), and if it does, it loads that file.

UEFI usually boots straight into native long mode without any weird 8086 compatibility modes being employed (which the OS then has to unroll), so it's simpler for the OS to deal with.

It can also serve as a multi-boot menu on machines which has several OSes installed.

It often comes with a MS-DOS like "UEFI Shell" you can boot into... To manage UEFI itself. So if something doesn't boot, you can just boot into the shell instead and try to fix things from there.

It may sound complex, but once you get into it, it's really much easier to work with than legacy MBR boot and all the "magic" things you have to do there to get things booting.

I definitely recommend reading up on it.


> hackintosh is likely in the rear view mirror for the next generation of tinkerers.

Part of this might be that making Hackintoshes is so much harder now, but part of it might also be that OOTB desktop Linux is luxuriously good these days compared to where it used to be. Ubuntu and Pop!_OS linux are absolutely on par with MacOS for a user who meets the (admittedly higher) entry requirements for using Linux.


Your comment makes me think of my 3d printing journey. A lot of printers require maintenance and tinkering just to keep them functional. To an extent, since they are targeted towards “makers” who like to play with these things, that’s fine.

But sometimes the thing you’re trying to build is of central importance, and you want the machine to stay out of your way. Tinkering with the machine takes away time you could be exploring your ideas with a machine that’s already fully functional.


Sometimes the holiday is the destination. Sometimes the fun is in the getting there, not being there.

Tinkering can be fun. But these days I mostly want results, achievements etc. I want to tinker toward a successful goal, not just tinker for tinkering's sake.


This argument makes a lot of sense. I get more upset than I probably should about car issues, likely because I never spent the time to tinker with them, so I feel rather helpless… and I don’t like feeling helpless.

In my youth I did a lot of tinkering with computers and it has paid dividends. It gave me a career.

These days though, I want to be able to tinker on my own schedule. I want my primary computer, phone, and car to “just work”. That means any low level tinkering needs a second thing. That can work fine for computers, because they’re small and relatively cheap. The idea of having a project car isn’t something I ever see myself doing, as it’s big and expensive.

I can still tinker on some things with my primary computer without it being a problem. Tinkering on writing software, running servers, or whatever, isn’t going to kill my ability to do other things on the computer. A lot of tinkering can be done without tinkering with the OS itself.


To be honest, none of that stuff has been true for 15+ years anyway.

Linux just works now. You put in the Ubuntu/Debian/Arch/whatever USB, you install it, it just works.

I can't remember the last time anything broke on any of my desktop machines and it wasn't my fault for intentionally doing breaky things.


> the understanding you gain from tinkering is priceless

You pay with time. It's priceless if you are a romantic or lack foresight (because what you did with your total time will be way more important than what is left of it). Otherwise it will always be the most expensive thing you have (and we must still be able to spend it without care, because what would life be otherwise).

> But when it doesn't, I often wish I was that guy that had tinkered with my car

Don't. Instead build a network of experts you trust and make more money doing what you do best to pay them with. Trying to solve the world on your own is increasingly going to fail you. It's too complicated.


Disclaimer: This became more of a rant than I intended. I've become pretty unhappy with the general quality of the "professionals" I've interacted with lately.

I just can't agree with this take. It sounds that simple, but it's not.

I happen to enjoy learning and fixing.

It would take me a long time to build that trust. Nobody cares about my things and my family's safety like I do.

Most people are a long way from making as much money as an expert would charge them.

In the last couple of years, I have had some terrible times when I call for help.

When the dealership is charging $200/hr to have a kid plug in the car and follow a flowchart, I'll just take a look myself.

Plus one time they left my fuel pump loose and I had to pay (in time and money) for an extra round trip with Uber, and the fuel it sprayed onto the road. They didn't fix the original problem, which cost me another round trip.

Another time, I had technicians (experts) out to look at my leaking hot water tank 4 times before they decided it was time to replace it. I wasted the time calling, babysitting, coordinating, figuring out how to shower without hot water, etc.

If this is the average "expert" count me out. I'll do it myself. Plus, throwing money at a problem isn't near as fun.


> When the dealership is charging $200/hr to have a kid plug in the car and follow a flowchart, I'll just take a look myself.

Regretting that you never invested the time to become an intuitive handyman is a very different category from "let's see if there's a video on yt to help me fix that in 5 minutes". My message is definitely not "don't get your hands dirty" but "be practical". Doing the yt/google/chatgpt thing to get an idea is mostly practical.

> If this is the average "expert" count me out.

You disclaimed, no problem — but I did write "build a network of experts you trust". Just calling someone and being annoyed that they are not good (and I agree, most of them are not) is not that. It's going to take time and money, but decidedly less so, because you get into the habit of doing it, you learn, you see red flags, network effects are real (people know people) and relationships on average last long enough. That is my experience, at least, but I have no reason to believe I would be special here.

> Plus, throwing money at a problem isn't near as fun.

That's true, in my case, only for very few problems. Most problems I would rather not solve myself.

I'll admit: All of this is a concession to reality, at least my perception of it. Learning is fun. I would really love to be good at a great many things. It's just increasingly unreasonable to invest the time necessary, because things get more complicated and change more quickly.

Staying good at a few things, learning whatever is most important next, and getting better at throwing money at the rest, will have to do.


I'm enjoying this thread. I want to add that building a network of experts has other costs too.

Sticking to a network will limit the variety of people you get to meet, all else being equal. Local maxima.

It also isn't practical in some circumstances; if I travel for work or move cities every few years, the local network for mechanics gets lost. The cost of keeping the network would be staying in one place.

So, these are all options.


>> The cost of keeping the network would be staying in one place.

One man's bug is another man's feature:). You describe staying as a bug, I've lived in the same house for 24 years, and, for me, it's definitely a feature. I'd positively hate moving to another suburb, never mind city.

And yes, I've developed relationships with local service providers. My plumber, my electrician, my mechanic, all know me by name. I've found the people I can trust and they eliminate those hassles from my life.

But, and this is my point, I'm not you. My context, my goals, my desires are all different to yours, and that's fine. We're all in different places, being different people, and that's OK. It doesn't have to be "us versus them". We might enjoy different things, and have different perspectives, but that's OK.


There's levels to tinkering though. When I was running Ubuntu, a lot of the tinkering came down to searching for what config files to update. Sure, that freedom is nice if you care to use it, but it's mostly just searching, configuring, experimenting. This is hardly fun or instructive.

A deeper form of tinkering is actually working on the code. I think an instructive example is writing your own X window manager configuration with xmonad. You get to see exactly how a whole window manager works.


These days, if you have the skills and tools to swap a transmission you have to tow it into a dealership and beg them to flash the transmission so it will work in your truck. If you want to avoid that you better know where to find the strategy code and match it up before purchasing another transmission. Same goes for touch screens and a whole slew of essential parts. While we weren't looking the rug was completely pulled out from underneath us. Now your family mechanic is beholden to the dealership.


> People have been making this argument to me about Linux for more than 25 years. The most cutting version that I ran across was:

> > Linux is only free if your time is worthless!

But that is exactly why I quit Linux and returned to macOS. I used to run Linux on cheap 2nd-hand ThinkPads, and for 3 years on a MacBook as my main system. But after yet another upgrade destroyed all network connectivity, I quit.

macOS isn't perfect, but it works in the most important areas and I can tinker with small stuff when I feel like it.


Ubuntu is so easy to use. I enjoyed using Arch before, but got to a point where I also just wanted my PC to work without any tinkering. Ubuntu is very good at that.


Your argument is excellent and made me evolve my point of view about Mac. I use Mac for efficiency, and yet, I was wrong about what kind of efficiency I’ve been developing. Tinkering is so important, even if just for the fun of it.


Do you not have any hobbies to "waste time" with? I would assume that most Hackintosh enthusiasts do this as a hobby, not for a living or even to save money on hardware.


In my day we used to tinker with proprietary 8- and 16-bit home computer systems. The Way of Linux (TM) is not the only path to enlightenment.


Linux is for work. I wouldn’t consider running anything else (Windows, macOS, FreeBSD, etc.) for my services.


Consider FreeBSD, because it’s great.


Just for my router


You need to understand the bias of many HN commenters. They are running businesses, aspire to run businesses or employed by businesses that are monetizing the work of tinkerers and packaging it for a mass market where they can sell higher volumes or mine more personal data. There are a lot of people who will recommend spending massive amounts of time and money learning and renting proprietary services over learning fundamental concepts and owning your own stack. I just ignore them along with the crypto bros before them and the AI pumpers now. Renting proprietary closed services to people who don't know better is their bread and butter.


I really like the way you put that.


That was a beautiful analogy.


A frustrating, freely accessible experience being priceless is not mutually exclusive with your time being worthless.

But I’m sure your point will inspire someone.


For you, me, other people on HN who generally make a living by understanding computers, definitely.

For a layman who just needs to connect to WiFi, edit some documents and print them without having to update a kernel? No.


> For a layman who just needs to connect to WiFi, edit some documents and print them without having to update a kernel? No.

When was the last time that was actually needed, in a way more troublesome than Windows system updates?


Even as a dev with 3 environments, I haven't had to tinker with my kernel since I left Gentoo something like 15 years ago; Ubuntu takes care of it.


>Something never quite sat right with me about this argument, and your comment finally made me understand what it is: the understanding you gain from tinkering is priceless, and it's exactly the experience that you use to help everyone around you: it turns you into an expert.

Yeah, but an expert in what? There are only so many hours in a day. Like if you care about learning about some rando sound driver, or why all your photos come out pink under Linux but not Windows[], that’s great. Go knock yourself out.

But if you want to do something that’s not rando debugging, then maybe it’s not for you. Like, I like Unix. It lets me do my work with the least amount of effort. What I don’t like is being a sysadmin. Some folks do, and that’s awesome. But that’s the reason why I got rid of desktop Linux 20 years ago.

[] Both of these are actual lived experiences. I do not care about you chiming in about either of these.



