The early Atoms had pretty good performance per watt compared to Intel's other offerings. The whole 'netbook' and 'nettop' market segment was pretty much enabled by the Atom chips, and similar machines are still around nowadays. The E-cores found in recent Intel generations are also very Atom-like.
About a year after netbooks came out, the iPad was in the wild and it destroyed any chance of these ever catching on. Sure, they were cheaper, but the user experience on a tablet was just so much better. (And tablets got cheaper fast.)
I basically only see them referenced mockingly these days, but man, I loved the netbook era. A 200 dollar computer dual-booting Ubuntu and Windows XP (just to play Counter-Strike 1.6 and Age of Empires) was a dream come true for high school me.
I got the original iPad as a graduation present, and as futuristic as it was, it quickly lost its lustre for me thanks to Apple's walled garden.
Took a few more years until I was rocking Debian via Crostini on the first Samsung ARM Chromebook to scratch that low-cost Linux ultraportable itch again (with about triple the battery life and a third the thickness as a bonus).
I feel like the 2012 Atoms made some sense. What baffles me is that Atom was complete shit until 2020. Intel sold laptop chips in 2022 that didn't support FMA or AVX2 because they used an Atom-derived E-core design that lacked them.
I mean, the NeXT, Atari ST and Mac computers around that time were all m68k-based... And the Atari ST was the cheapest by far, since it was competing in the home computer market.
The Atari ST and similar machines like the Amiga and compact Macintoshes other than the SE/30 were not its competition, any more than the Sega Genesis was. Its immediate competition included Sun and SGI workstations (as well as other workstations) and the Mac II series - and for specific tasks, loaded 386DX and 486DX PCs. Sun was pivoting at that time to the SPARC platform and SGI to the MIPS platform, both away from Motorola 68K.
There were some high-end Ataris and Amigas (Atari TT030, Amiga 3000, etc.), but they came out a bit later. There was even the A3000UX, which ran a Unix port!
Still, I agree. The 68K workstation was essentially obsolete by the time NeXT shipped. Sun was shipping early SPARC systems around the same time. The writing was on the wall. No wonder they didn't stick with their own hardware for very long.
Jon Rubinstein was said to have been cooking up a NeXT computer prototype based on Motorola 88k chips; it would have been a serious contender in the workstation market had it been realized sooner. Sadly, it ended up getting canceled right around the time NeXT became a software-only shop.
Honestly, Motorola is entirely to blame for losing out on the workstation market. They iterated too slowly and never took Intel seriously enough. I say this as I wistfully eyeball the lonely 68060 CPU I have sitting on my desk for a future project...
That would've been cool! The NeXT hardware was interesting. I have a Turbo slab in my retro collection.
Yeah, it seems Motorola lost their lead with the 68040. Intel was getting huge clock speed gains with the later 486/DX2, DX4, etc. From what I recall, a similarly clocked 040 was faster than a 486 on most benchmarks, but there was simply no way to compete with Intel's high clocks.
snake_case is just more readable overall. PascalCase and/or camelCase are okay in moderation and add a kind of emphasis, which is why e.g. Rust uses PascalCase for types (other than core built-in ones) and enum case constructors. SCREAMING_SNAKE_CASE is ok for even rarer things that should stand out - C/C++ uses it for preprocessor macros, Rust for program constants and global variables. Ada_Case is rare and mostly shows up in languages with case-insensitive identifiers, like Ada itself.
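To make that concrete, here's a throwaway Rust sketch (the names are invented, purely to put the conventions next to each other):

    // Toy example: made-up names, just to show the naming conventions side by side.
    const MAX_RETRIES: u32 = 3;                // SCREAMING_SNAKE_CASE: constants/globals

    enum ConnectionState {                     // PascalCase: type names...
        Connected,
        Disconnected,                          // ...and enum case constructors
    }

    fn retry_delay_ms(attempt: u32) -> u64 {   // snake_case: functions, locals, fields
        let base_delay = 100u64;
        base_delay * u64::from(attempt.min(MAX_RETRIES))
    }

    fn main() {
        let states = [ConnectionState::Connected, ConnectionState::Disconnected];
        for (attempt, state) in states.iter().enumerate() {
            if let ConnectionState::Disconnected = state {
                println!("retrying in {} ms", retry_delay_ms(attempt as u32));
            }
        }
    }

rustc even lints you (non_snake_case, non_camel_case_types, non_upper_case_globals) if you stray, which is why the convention is so uniform across crates.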
AIUI, even Raspbian is only supported by a somewhat hacky downstream kernel to begin with. It takes time to achieve proper upstream support, and even then the community only succeeds because of how popular the Raspberry Pi hardware is.
I mean, "Objective C without the C" is just Smalltalk. It exists already. But that doesn't help you if you want any amount of backwards compatibility with the existing ObjC ecosystem. So you're kinda forced to go with the low-level approach.
What would stop Apple's hypothetical "Objective-C without the C" from talking to existing Objective-C code? After all, Swift can use UIKit just fine. Even mixing C++ and ObjC is reasonably easy.
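For what it's worth, you don't even need Apple's blessing for that kind of interop: the ObjC runtime is a plain C library, so anything with C FFI can message existing classes. A rough, untested Rust sketch, assuming the third-party objc crate (macOS only):

    // Untested sketch; assumes objc = "0.2" in Cargo.toml, macOS only.
    use objc::runtime::Object;
    use objc::{class, msg_send, sel, sel_impl};

    fn main() {
        unsafe {
            // The ObjC runtime is just C underneath, so sending messages to an
            // existing Objective-C class works from any language with C FFI.
            let cls = class!(NSObject);
            let obj: *mut Object = msg_send![cls, new];
            let hash: usize = msg_send![obj, hash];
            println!("NSObject hash = {}", hash);
            let _: () = msg_send![obj, release];
        }
    }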
In a sense, MacRuby was trying something similar, but the dependency on a GC doomed it.
There's a limit to how "lean" a safe, low-level language can be. Rust is leaner and simpler than Swift but not by much, and pretty much all of its outward features are "load bearing" to a far greater extent than Swift's.
(People have tried to come up with simpler languages that still preserve safety and low-level power, like Austral - but that simplicity comes at the cost of existing intuition for most devs.)
> It seems Swift 6.2 is still unfinished and is acting more like Java. Eternal evolution of language. While it is popular among tech and HN crowds to have new language and framework to play around and work
You can have both. Rust feels "mature" and "finished" if you stick to the stable featureset, yet it still brings compelling new features over time. But that can only be achieved by carefully managing complexity and not letting it get out of hand with lots of ad-hoc special cases and tweaks.
> The E-cores are so weak that it just about takes 2 of them to match the performance of an AMD core.
That's not "weak". If you look at available die-shot analyses, the E-cores are tiny compared to the P-cores: they take up well under half the area and draw even less power. P-cores are really only useful for the rare purely single-threaded workload; E-cores win otherwise.
We're not comparing to Intel's P-cores but to AMD's cores. Eight AMD cores fit in 70.6 mm² on a high-performance process, and take up a fraction of that space on a high-density process (see the 192-core Zen 5c chips).
> In the 80s the idea of a library of functionality was something you paid for, and painstakingly included parts of into your size constrained environment (fit it on a floppy). You probably picked apart that library and pulled the bits you needed, integrating them into your builds to be as small as possible.
If anything, the 1980s is when the idea of fully reusable, separately developed software components first became practical, with Objective-C and the like. In fact, it's a significant success story of Rust that this sort of pervasive software componentry has now been widely adopted as part of a systems programming language.
You're talking about a different 80s. On workstations and Unix mainframes, beasts like Smalltalk and Objective-C roamed the Earth. On home computers, a resident relocatable driver that wasn't part of ROM was an unusual novelty.