The abbreviations I wrote are unambiguous. When I first learned about Unix, I basically guessed - as I assume most first-timers do - that /etc is simply the location of miscellaneous files ("et caetera").
And that's before we even get to the fact that a bunch of the abbreviations are utterly non-intuitive to first-timers.
/bin - binaries - nobody born after circa 1980 calls them that anymore. Executables, applications, apps, etc.
/boot - short for "bootstrap", from a lame Baron Munchausen joke (pulling yourself up by your own bootstraps) dating to 1970. Should probably be /startup.
/dev - dev is SUPER commonly used for "development". Easy enough to solve as /devices.
/home - ok-ish, probably one of the better-named directories actually in there. I'm shocked it's not /ho or /hm.
/lib - reasonable. Though these days in the US it might trigger political feelings :-p
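If it helps, here's the set of names the thread is kicking around, as a minimal Python sketch; the expansions are the conventional folk readings discussed above, not official FHS definitions, and the dict name is just something I made up:

```python
# Conventional expansions of the classic Unix top-level directories,
# as discussed above. These are folk readings, not official definitions.
UNIX_DIR_NAMES = {
    "/etc":  "miscellaneous config files ('et caetera')",
    "/bin":  "binaries, i.e. executables/applications/apps",
    "/boot": "files needed to start ('bootstrap') the system",
    "/dev":  "device files, not 'development'",
    "/home": "users' personal directories",
    "/lib":  "shared libraries",
}

for path, meaning in UNIX_DIR_NAMES.items():
    print(f"{path:6} -> {meaning}")
```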
> The abbreviations I wrote are unambiguous. When I first learned about Unix, I basically guessed
They're completely ambiguous to someone who doesn't speak English.
> /mnt - the whole metaphor of "mounting" is... debatable
What? Have you never heard of mounting a picture on a wall? Mounting an engine? That's the metaphor.
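To make the metaphor concrete, here's a minimal sketch (Python calling the standard mount/umount commands via subprocess; the device and directory names are hypothetical, and it needs root to actually run):

```python
import subprocess

# "Mounting" attaches a filesystem to an existing directory, much like
# mounting a picture on a wall: the directory (/mnt/usb) is the fixed
# point, and the filesystem on the device is the thing being attached.
subprocess.run(["mount", "/dev/sdb1", "/mnt/usb"], check=True)

# The device's files are now visible under the mount point.
print(subprocess.run(["ls", "/mnt/usb"], capture_output=True, text=True).stdout)

# Detach ("unmount") it when done.
subprocess.run(["umount", "/mnt/usb"], check=True)
```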
> Anyway, a lot of people have done this criticism better than me and it's boring at this point.
Your original complaint was about "src", suggesting calling it "source", which is still ambiguous by your own standard. Source of what? How is someone going to know what "source" means if they can't even be expected to have heard of booting a computer? Who is the audience for this change?
Some of your suggestions aren't meritless, but your jumping-off point certainly was.
Yes, Eastern Europe is very safe. For example, Ukraine had very few natural disasters and floods over the decades. Most disasters were man-made, such as famine, Chornobyl and war. If Nature doesn't get you, humans will...
There are large parks, and much of London is leafy. The limiting factor is wind. So far it's extremely unusual in the UK to have very dry conditions with strong winds, so fires are much less likely to spread.
> Then, a reverse of the above at 6am the following day.
I understand that's not the case, but I want to imagine your AppleTV coming back to life at 6am every morning, resuming the show mid-sentence at 100% volume. Instant alarm clock.
Prototyping tech is one thing; making it a widely adopted success is another. For instance, Apple was the first to bring WiFi to laptops in 1999. Everyone laughed at them at the time. Who needs a wireless network when you can have a physical LAN, ey?
> For instance, Apple was the first to bring WiFi to laptops in 1999. Everyone laughed at them at the time. Who needs a wireless network when you can have a physical LAN, ey?
"The original model, known as simply AirPort card, was a re-branded Lucent WaveLAN/Orinoco Gold PC card, in a modified housing that lacked the integrated antenna."
> On the other hand, people who laughed at them removing the 3.5mm jack can still safely laugh away.
Then laugh at Samsung and their flagship line of phones as well, since they haven't had headphone jacks for a while now. "After Note 10 dumps headphone jack, Samsung ads mocking iPhone dongles disappear" (2019):
>> On the other hand, people who laughed at them removing the 3.5mm jack can still safely laugh away.
> Then laugh at Samsung and their flagship line of phones as well, since they haven't had headphone jacks for a while now. "After Note 10 dumps headphone jack, Samsung ads mocking iPhone dongles disappear" (2019):
I totally do. One of the problems with Apple is the industry seems to mindlessly ape their good and bad decisions. Their marketing has been so good, many people just assume whatever they do must be the best way.
At the time I felt like Apple was getting rid of the 3.5mm jack as a potential bottleneck for future iPhone designs (one of the limiting aspects of the form factor), but there still doesn't seem to be anything design-wise to justify it, even several years later. It is very clear now that it was merely to encourage AirPods adoption.
I would say this was obvious to the more cynical of us from the very beginning. Unless you are trying to go portless (for water resistance, perhaps?) or make a very thin phone, there are very few benefits to removing the jack… except driving AirPods sales, of course.
I mean, to go thinner than the 6/6s I can see the 3.5mm jack causing trouble. Part of me is still sad that they then bounced back and went the other direction when it comes to iPhone thickness.
I'm more suggesting that bad decisions should be legislated against fast and early, so other companies aren't encouraged to follow in Apple's footsteps. If every company had their own Lightning connector, there would be no choice but to force them all to converge. The original sin is letting it happen at all in the first place.
Who decides which ideas are good and bad? I assume you wouldn’t want regulators to have retroactively forced Apple to keep floppy disk drives in their computers. Or CD-ROM drives? It’s just the “obviously bad” ideas that should be banned, right?
Do you have a crystal ball that lets you know ahead of time which choices are good and bad? Even in retrospect I’m not sure Apple made the wrong choice with the Lightning connector. It’s a better connector than micro-USB in just about every way, and micro-USB was the only standard alternative at the time. Apple’s experience with Lightning was rolled into the design process for USB-C, which as I understand it they were heavily involved in. USB-C might not exist as it does today without Apple’s experiments with the Lightning connector.
Even if we pretend you’re better at picking winners and losers in the tech world than Apple and Samsung, do you think regulators are going to be as canny as you are with this stuff? US politicians don’t seem to understand the difference between Facebook and the internet. Are you sure you want them making tech choices on your behalf?
If you ask me, I think regulators would make a dog’s breakfast of it all. If they were involved we’d probably still have laws on the books mandating that all laptops have parallel ports or PCMCIA slots or something. The free market can sure take its time figuring this stuff out. But competition does, usually, lead to progress.
I would gladly laugh, but it's nearly impossible to buy a good phone now. TBH I don't care that much about my phone not having a 3.5mm jack (even when I need to use wired headphones, which is rare now, I can use a USB adapter), but there are basically no phones without this stupid hole in the display, or with a good dedicated (not under-screen) fingerprint scanner (because who needs that when you can have face recognition, right?). All top-line phones are like $1500 now, but are still treated as disposable products that are expected to be replaced every 2 years. Batteries are not removable, yet the devices are not actually (reliably) waterproof.
And maybe I'm wrong, but somehow it feels like each improvement of that sort was actually pioneered by Apple. In the dreamworld of free-market enthusiasts this should have bankrupted Apple or made the iPhone a very niche consumer device, but in the real world everything just became the iPhone. There are some rare exceptions, but these are either outright experimental and gimmicky (because being different is their identity), or just bottom-of-the-line products with "intentional defects" that are supposed to make you choose the more expensive option.
And the experience for phone use cases will be better, because walking around with wired headphones in gives you nasty microphonic effects (cable noise transmitted up the wire into your ears), and the cables get tangled up.
The two aren't mutually exclusive. Those who wish to use Bluetooth headphones can happily use them, while those who prefer wired can continue to use them. There's no reason smartphone manufacturers shouldn't support both.
They do: you can get the little dongle, and it has audio quality superior to that of most audiophile DACs on the market.
But that 3.5mm port takes up a lot of room that could be used for more battery, backup antennas for when the user's hand is covering one of them, vibration motors etc.
I'm pretty sure you could find justification even for why top models don't support microSD card expansion. The problem with this line of argument is that if they wanted to, they could support both without any issues. The real reason is money. It's more profitable to have Bluetooth only when you also make AirPods, and not include storage expansion when you sell built-in memory options at a 400% markup.
Interesting that you suggest laughing at their decision to remove the headphone jack, when it was actually just the first step in an industry-wide shift, with other companies doing the same since.
Was that really the case? I remember they were mocked for e.g. offering WiFi only, FireWire only, etc., while the alternatives they removed were far more common.
In the consumer space at least, WiFi was nowhere to be seen on a typical PC when Apple adopted it. Same with USB. So while it technically originated and existed elsewhere, there was no serious traction on it prior to Apple's adoption.
What you say is also true: many people weren't ready to ditch the old when Apple decided to deprecate it.
This has been true for ages. They were the first to ditch the floppy disk drive, and later the CD drive, in their computers. Both choices were very controversial at the time.
They were also the first to reach USB-C nirvana: they were shipping laptops with Thunderbolt in 2011 and moved to USB-C in 2016, giving you four full-capability ports at a time when most laptops had one, sometimes. It took another five years before most laptops adopted even one as standard, and premium laptops sometimes had two.
(People look down on the move to USB-C, which I don’t quite get; everyone seems to fawn over USB-C in every context but MacBooks, amirite!? Yes, it’s nice to have an HDMI port, but fundamentally, if you buy into the vision that USB-C does everything and you still want a bunch of legacy ports (vs. Thunderbolt video, Thunderbolt networking, etc.), then obviously you’re going to have to have dongles, and people supposedly buy into that vision in other contexts. Apple’s implementation of that vision was fundamentally at least a decade ahead of the curve. If you’re going to do that, you want lots of ports and you want every port to do everything: not “this one is the only one that can charge fast”, “that one doesn’t have video output”, “if you use both ports they drop to some weird lower capability because you’re dividing the controller”. Those complaints are exactly the things people don’t like about the base-tier M-series processors today, and Apple’s previous models solved that problem long before anyone else did.)
Hell, until very recently a lot of the time the competition didn’t even have Thunderbolt/PCIe tunnelling… you got 10 Gbps USB-C and a grab-bag of charging and display features, and you’re gonna like it. That’s still the case with motherboards, and it’s literally only with this year’s releases that we’re finally getting USB4/Thunderbolt as standard on high-end boards. That’s more than a decade from when Apple started putting Thunderbolt on laptops, and almost a decade from the era of 4x full-spec TB3 ports.
… and a reminder that, in classic USB fashion, USB4 still doesn’t even guarantee PCIe tunneling support. So really it can still be just a normal 10 Gbps USB-C connection in a silly mustache and trench coat, even on the next-gen stuff. What’s the term for doing an okay, moderately competent but not exceptionally good job while your competition repeatedly shoots itself in the face, again? But it’s by design: the intent is deceiving and manipulating the customer into buying last year’s junk, and it’s working as intended for USB-IF’s real customers and stakeholders.
> People look down on the move to USB-C, which I don’t quite get
People loved Thunderbolt for replacing FireWire. They hated Apple's choice because these USB-C MacBooks shipped with precisely zero USB-A ports and relegated every user to carrying around a dongle.
The year is 2024; we're almost a decade out from Apple going all-in on USB-C, and the predominant peripheral connector is still Type-A. I don't like it either, but plugging our ears and pretending it's not a problem is silly and only makes consumers mad.
> … and a reminder that, in classic USB fashion, USB4 still doesn’t even guarantee PCIe tunneling support.
That is in fact the correct default to use. Ever heard of Thunderspy? https://thunderspy.io/
> Intel helped make this move possible, but it doesn’t manufacture laptops. Apple took the heat for “donglegate”.
And rightfully so. They took Intel's technology and told an unprepared and uninterested industry to switch or die. Naturally, very few manufacturers switched over and Apple's all-or-nothing strategy made more people mad than happy.
Having four full-capability Thunderbolt ports is awesome. It doesn't really fix the fact that none of them can easily connect to a wired keyboard or mouse.
> On the x86 desktop, usb-c is still surprisingly rare.
My motherboard only has one TB connector; everything else is Type-A too. Most of the bandwidth is broken out over SATA or PCIe internally, and frankly I don't regret it one bit. 99% of the time, there is nothing plugged into that Thunderbolt port.