Computers are still cool. I still have my collection of Unix workstations from 20 years ago, all rotting away in the garage: SGI Indigo, SPARCstations, HP servers, DECstations, IBM machines. The only thing that made them cool back then was that they were different! They were not x86, they had weird/different CPUs, and gcc -S produced very different output. They ran weird Unix OSes such as IRIX, HP-UX, AIX, BSD, SunOS & Solaris, which let you see a different take on things. Sometimes they brought their own advantages: Solaris had really cool software, while the only amazing thing about the SGI was the monitor and the color, just beautiful. These were fun to network, hook up and share with Unix fiends so we didn't get into trouble using other folks' computers.
Yet I find that I'm still experiencing that same sort of fun: the Raspberry Pi, Arduino, GPUs, Xeon Phi, Google Coral, all sorts of cheap FPGAs - if you want hardware, you got it! Tons more are now available, with documentation. If you want software, you got it: Linux, the BSDs, and you can still run NetBSD on your toaster. If you want programming languages, there are more available now than ever. Even getting access to a solid Lisp or Prolog system back then was a pain; today you can get many choices for free. Then you have the latest and shiniest things like Rust, Go, JavaScript/TypeScript, Haskell, OCaml, Idris, Elm, Dart, etc.
Computers have never been this freaking cool! The craziest thing is that a $5 Raspberry Pi Zero that fits between two fingers can run circles around some of those cool computers of yesteryear. The past was fun then; the future is much more fun, IMO.
On the positive side, I will still pull out something like an Arduino for fun. It is possible to explore the depths of the machine and do things that are rather unique. Cheap FPGAs, 3D printing, and the like are much the same.
On the other hand, you have things like the Raspberry Pi 400. It was fun to watch a teardown or two to see what's inside, and the vintage form factor is cool too. Yet the moment it came to software, the interest dwindled, since it's just another Linux box.
When it comes to software, I feel that the lack of scarcity also hurts. Take those programming languages. Once upon a time there was the anticipation of getting something new, and there was incentive to learn the details. Now it's all just there. It's often hard to find motivation to learn anything but the surface details unless it's a truly unique language.
This isn't meant to diminish what has been accomplished. In the hands of the right people, they are powerful tools. In the hands of someone who just wants to play, they are much less cool.
On the motivation front, that's a personal thing. When choices are limited and things are new, there's that exciting factor in learning. But it seems most of us have seen the same thing in different variations, and then we're drowning in choices, which does reduce the motivation factor. But this is more subjective than objective. There's nothing stopping you from creating your own OS for the Pi. Linus did it in 1991 for x86 with very few resources; anyone motivated can do so today.
This takes me back to the perspective that "wealth is relative".
Sure, these Arduinos and RPis may not be as powerful as the latest Intel and AMD processors, but lol, kings and queens hundreds of years ago didn't even have access to these - not even to a goddamn calculator!
Also, required computational power and perceived power are relative.
I have a small OrangePi Zero fixed under my desk which runs Syncthing, Transmission, DNSMasq and other small daemons, which really makes my life easier and my network faster. That little box sits at 0.00/0.00/0.00 load while doing a lot of helpful stuff.
A good and very useful computer doesn't have to have a ton of bandwidth, monstrous amounts of power or bag-of-holding level storage to be exciting, interesting and useful at the same time.
And that board is like 10x faster than my router, which does similar things with a 560 MHz single core MIPS processor, 128 MB of RAM and USB attached storage.
Oh, I fondly remember my Pentium MMX 200 MHz with a Voodoo 2 and a whole 64 MB of RAM. It even ran Windows XP after upgrading to 128 MB of RAM! I was pretty sad when I found out it would not run GTA III :D
Back in the day, everything related to computers was cool. It was a small priesthood for the nerdiest among us.
Now that Harvard MBAs are "going into tech" there is a whole lot of uncool computer-adjacent activity.
But computers are cooler than ever. I can roll my own object detection algorithm to tell me if my dog is eating my shoes while I'm out. You can do so much cool stuff with them.
Back in the day, Richard Feynman thought using computers to print tables of arctangents was cool. Look what we can do now! It is straight up wizardry (or sorcery).
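To give a flavor of how little code that dog-watching idea takes these days, here's a minimal sketch - not a finished project - assuming a webcam snapshot saved as frame.jpg, torchvision's pretrained COCO detector, and my own guesses for the "dog" class id and score threshold:

```python
# Minimal sketch of "is my dog in frame?" using an off-the-shelf pretrained
# detector. Assumptions: a snapshot at frame.jpg, COCO class id 18 for "dog"
# in torchvision's label mapping, and a 0.7 confidence threshold.
import torch
from torchvision.io import read_image
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(pretrained=True).eval()

frame = read_image("frame.jpg").float() / 255.0   # CHW tensor in [0, 1]

with torch.no_grad():
    pred = model([frame])[0]                      # dict of boxes, labels, scores

DOG = 18                                          # assumed COCO "dog" category id
dog_seen = any(int(label) == DOG and float(score) > 0.7
               for label, score in zip(pred["labels"], pred["scores"]))
print("dog in frame!" if dog_seen else "no dog")
```

Point the same loop at the shoe rack and you have the alert; the hard part used to be the detector, and now it's a download.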
I think this article has a bad title, but I agree with its actual point, which is that back in the 1980s and early 1990s, desktop computers topped out at one level and workstations were a discontinuous jump above, unreachable as long as you were limited to commodity components and supply chains. Their hardware and software was Just Better (Than Home Systems Of The Same Era[1]) and it was certainly a lot cooler. I mean, NeXTcube vs Macintosh Quadra? Please!
That said, this is a Kids These Days rant about how there is a smooth transition now, because the workstation as-was is a dead concept, replaced by desktop systems just a few price points more expensive than yours but which are still, fundamentally, the same kind of system.
I think that is why Raspberry Pi and the like are the new cool. They are different, and they can also be what you want it to be. They spark the imagination, and I think that’s the kind of magic of what the author describes. A system thats unique (and capable) is kind of the lure.
From a software perspective I would have preferred for workstations to remain their own thing. "home" computer software just doesn't take certain attributes seriously enough and with the way microsoft is treating the "pro" version of windows these days for example this lack of seriousness is bleeding over to the professional space.
That's part of why old-style workstations died: Everyday desktop computers rose to meet them, and suddenly they couldn't claim to have any technical advantages to justify their price tags. I'm not saying the Quadra in specific was on a level with the NeXTcube in specific on all fronts, but over the course of the 1990s that leveling process helped sweep away the workstation as a completely independent class of system.
> But computers are cooler than ever. I can roll my own object detection algorithm to tell me if my dog is eating my shoes while I'm out. You can do so much cool stuff with them.
On the other hand, if you remove enough limits from art or computation, nearly everything becomes a feasible possibility and that's boring! The best stuff always happens when you're constrained, and the end product just barely, yet still magically, works - maybe with a few hiccups, but well enough nevertheless.
> On the other hand, if you remove enough limits from art or computation, nearly everything becomes a feasible possibility and that's boring!
I agree. What I am trying to figure out is why.
About 25 years ago, I was enthusiastic about exploring 3D computer graphics. These days, I could hardly care less. Challenge isn't really the issue here; if anything, excellent 3D graphics are more difficult to create now simply because the bar has been raised so much higher. I suspect the difficulty of doing something that feels innovative may be a big part of the problem. While this doesn't take away from the personal challenge of learning something new, it does diminish the sense of accomplishment. Or maybe something else is going on.
I don't know, there is nothing else I can think of that opens so many different capabilities with a little effort. If it was effortless then yeah it would be boring, but as is it still feels like actual wizardry when I make my computer do something cool, and these days the spellbook is a whole lot larger.
No. They're just experiencing dense nostalgia. But they're also pointing at a much deeper issue with computing nowadays.
I guess I'm close to the author age-wise and can relate to their feelings. I feel the same thing from time to time and have reflected upon it.
They're missing the time when capacity and capability were visible from the outside and somewhat overt.
You needed a powerful computer with SMP capabilities? You needed two actual CPUs, a special board, a lot of RAM, an appropriate power supply, cooling and a case. Now a phone or a lowish-end SBC has more processing power than that beast.
You wanted a lot of fast storage? Get SCSI drives with big ribbon cables and terminators, set them up, install the controller with its own dedicated XOR processor (for RAID), and listen to the disks spin up and sing with head-seek and platter-resonance sounds. Today a run-of-the-mill USB 3.0 stick can transfer my first hard disk's capacity in under a second.
You wanted good networking? You had to get a good NIC with an on-board offloading engine, lots of components and LEDs, and shove it into your PC. Now? A measly battery-powered Huawei modem sucks data at 100 Mbit Ethernet speed and shoves it down a USB cable without you ever noticing. A simple onboard NIC can transfer 100 MB/s for days and the whole system barely notices the streaming data.
This visibility and explicitness of tech also brought simplicity with it. The most complicated thing was x86 protected mode, and it wasn't that hard to fathom. Nothing was optimized beyond comprehension, and with an in-order CPU you could, with some experience, visualize everything from keypress to screen in your mind.
Now a simple webpage sits behind a web server that is really a load balancer in front of a K8S cluster running 10+ containers to keep everything scalable, all deep inside a Linux system, and you pass through at least three networks (Internet->DC LAN->K8S->Docker) to reach the page you're looking for.
Your cloud files are behind a REST API which terminates at a library which queries an object storage front end which retrieves your file from an object on a disk array god knows where (forget the details about array controllers, RAID, networking, etc.).
Even learning basic K8S is a multi-week effort now.
This is what makes computing un-cool today. Migrating from a Raspberry Pi to a production system is not straightforward anymore. Nothing is simple anymore. Nothing is easily graspable anymore. Building abstractions to make things simpler has gone too far; the middle layers are just too many and too thick. It's too time-consuming to try to learn everything just out of curiosity now.
I grew up in the age of 8-bit computers: C64s, Ataris, Spectrums, Amstrads, etc. It was cool. Everything was really constrained, and it was super exciting when something broke through a constraint.
But now I think it's even cooler; there is just so much we can do with all the computing devices available to us. We are connected with so much human creativity these days that it really is overwhelming. Things like ray tracing and VR are still pushing at edges that make things exciting; then there are things like Raspberry Pis, micro:bits, Arduinos, etc., which are super cool, and I still think top-end graphics cards are cool. I think because there really is so much cool stuff, it starts to get hard to appreciate it.
I agree. Just how much gargantuan computing power is available to us today is not something I really grasped until recently, even though in theory I knew it intellectually.
Watching Ben Eater's YouTube series, where he constructs an 8-bit breadboard computer, gave me a new perspective on just how integrated and complex my previously-considered-humble, even pedestrian, computer really is.
What made those computers cool was that you could do things with them that you could NOT do with the run-of-the-mill machine. Yes, today is a bit boring, but we've all got supercomputers on our desks. This is awesome.
I get what this guy is driving at. Back in the day, there were large step changes in desktop capabilities that are just a gradient of capabilities now.
For me, it's about my home lab. There I build capability that's not quite as common in desktops (10/40GbE networks, object storage arrays, clusters, etc.). And it's not that expensive with all the second-life kit, as long as you don't count the power costs!
Yeah, I call BS on that. This bubble us techies live in paints all this as sort of boring. Try going to the countryside or into another industry and suddenly you are looked upon as some sort of god - it's almost cringy the way they treat you when you spend 5 minutes fixing something that half the village tried to fix over the past year and then gave up on.
> Try going to the countryside or into another industry and suddenly you are looked upon as some sort of god - it's almost cringy the way they treat you when you spend 5 minutes fixing something that half the village tried to fix over the past year and then gave up on.
Related to this, I was recently a bit saddened after helping some family members get their new WiFi and Apple TV set up, because after I was done setting it up, one of them was like, “you should work in tech support, you are so good at this”.
Now the reason it saddened me was that, while I appreciated that they appreciated the help, and that she meant nothing but a compliment by saying that, the thing is that doing tech support would literally kill my soul. And I realized that to a lot of people there is no discernible difference between doing software development and doing tech support. It's all just working with computers, right? And they know that I like to work with computers.
But where I actually derive joy from computers is in creating things. Writing code, optimizing my algorithms and data structures, improving the speed and robustness of software that I write, making software that is as ergonomic and sensible as I can. Both when it's APIs that other developers or I myself will use, and when it's user interface stuff that end users will use. Those kinds of things are what give me satisfaction.
To sit on a phone line and guide someone through a thousand steps of troubleshooting and finally in 90% of cases having them reboot a machine, that’s just.. not even close to being enjoyable.
I always like to help people, and if someone asks me, whether they be friends or family or others, I will help them. But when I do that, I do it for a different reason. I do it because it is easy and I know it helps them. Not because I enjoy doing it for its own sake.
And I don’t mean to talk down at people doing tech support or anything. Nor do I want to seem arrogant. What I am getting at is just that it’s not the same. Not at all the same.
I didn’t want to get into all of that then and there of course, so I just carefully said that tech support wouldn’t really be for me because what I like most is to create stuff.
I did tech support and I definitely liked it. Climbed up the ranks quickly. Really the first job I was exceptionally good at. The problem was not only that the calls kept coming from fresh users who were just as uninformed as the ones I got on my first day, but worse, that every month my coworkers got more clueless than the previous batch.
I can stand clueless (and largely thankful) end customers. What I cannot stand is people who should be better at what they're doing and simply don't care.
Other professions have exactly the same issue. People don't understand the nuances of other professions, especially professions they've never encountered.
I loved doing tech support, but the problem is it doesn't scale. You can't really make much of a living at it because you can only help one person at a time with their specific problem, and they can't afford to pay you hundreds of dollars to do it.
A while ago I was at a bachelor party where someone who was a therapist by trade talked about a patient who'd written an app to help him map products to their locations in the store, so that he could do his groceries like a travelling salesman.
They were truly amazed by this, while my reaction was "wait, I could totally do something like that".
I get caught off guard by such situations quite often.
And knowing exactly zero about bio labs, even as a SW dev with plenty of experience, you guys are gods to me. If you manage to come up with solutions to my health problems, I will kiss the ground before your feet.
I totally agree. Also, I am worried that the current triopoly is a stable local maximum: it now requires huge resources to create a new computing environment (as in, a combination of hardware and software) that could compete with any of the currently available ones.
This is sad for many reasons, not the least of which is the fact that the current offerings are all unsatisfactory.
This happens to pretty much everything as it advances and becomes better. Try to start a car company today without huge investments and a team of people.
I fondly remember a summer job in the IT department of a gas exploration company where we were unboxing and setting up a load of SPARCstation 10 machines, each with three(!) monitors. Monitors so huge and splendid that they came with a credit-card sized remote control (which when not in use was stored in a slot in the monitor casing) and a warning not to damage the screen on your belt buckle when lifting it. So cool.
We also did a series of RAM upgrades to some long-forgotten Silicon Graphics machines. Whatever they were, the cases were large and purple, and everything inside looked like it was designed to survive a nuclear attack. When powered back on, the machine played some wonderful, indulgent sound from deep inside. So cool.
When all that was done it was time to get back to the PC users to complete their upgrades from 4MB to 8MB of RAM. Less cool, but at least the users were happy at the end of it.
Those would probably be SGI Onyx boxes, fyi. I don't know any other machine they had that was purple.
Never saw one. The closest I got to the glory days of SGI workstations was the Indys which at my first real employer had been repurposed as auxiliary web servers. Even then, they were sweet machines.
"GPT-3 uses half-precision floating-point variables at 16 bits per parameter... At around $130,000, DGX-1 is short on VRAM (8×16 GB), but has all the other components for a solid performance on GPT-3." (DGX-1 is the previous chassis NVIDIA offered for hosting V100s.)
So I tentatively think the answer is yes, a single DGX-A100 probably can run GPT-3 comfortably. You'll still want a cluster of them to train it, though, because they note training takes about 355 V100-years.
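For a rough sanity check, here's my back-of-the-envelope arithmetic - assuming ~175B parameters, fp16 weights only, and the publicly listed VRAM configurations; activations and framework overhead are ignored:

```python
# Back-of-the-envelope memory check for GPT-3 inference.
# Assumptions: ~175B parameters, fp16 = 2 bytes per parameter, weights only.
params = 175e9
bytes_per_param = 2

weights_gb = params * bytes_per_param / 1e9
print(f"fp16 weights alone: ~{weights_gb:.0f} GB")   # ~350 GB

# Total VRAM of the chassis mentioned above, per their public configurations:
chassis = {
    "DGX-1 (8x V100 16 GB)": 8 * 16,
    "DGX A100 (8x A100 40 GB)": 8 * 40,
    "DGX A100 (8x A100 80 GB)": 8 * 80,
}
for name, vram in chassis.items():
    fits = "fits" if vram >= weights_gb else "does not fit"
    print(f"{name}: {vram} GB -> {fits} the fp16 weights without offloading")
```

By that rough measure it's the 80 GB-per-GPU configuration that holds the full fp16 weights in VRAM on a single box; take the exact numbers with a grain of salt.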
I can only comment that this is at odds with the information at the link I supplied, and I cannot tell you for certain whether that information is actually accurate.
The link only mentions requirements to run inference at "decent speeds", without going into details about what they consider to be decent speeds.
In principle you can of course run any model on any hardware that has enough RAM. Whether the inference performance is acceptable depends on your particular application.
I'd argue that for most non-interactive use cases, inference speed doesn't really matter and the cost benefit from running on CPUs vs GPUs might be worth it.
If you find tech frugality, low-end computing etc interesting, then all the other essays on this guy's pages are also worth a read. Great and thoughtful blog, especially given that The World Is About To End Pretty Soon. Thanks a lot for posting the link.
Democratization leads to that: computers were restricted, sophisticated tools surrounded by crypticism and mystique. Now your phone obeys your finger and you can buy a new one with pocket money.
I think the boring part of current computers is not in the increase of speed, but the reduction of ports and richness of interfaces. Everything is Windows/Linux + mouse or a slab of black screen without buttons.
Back then we had parallel ports, VGA, RJ11, ExpressCard, the speaker buzzer, etc. Now hardware ports are way more complex and less fun to mess with. At least it can be for the greater common good of compatibility and efficiency.
What was really cool back then was ISDN to your home.
I was doing work with SGI back then and had an Indy at home. It was a nice computer, but there wasn't that much software that really made a difference day-to-day. Sure the debugger could render 3D graphic representations of data in memory, but who cares?
ISDN was a huge upgrade over dial-up. Latency and packet loss were much, much lower than on dial-up, which made accessing the early web a totally different experience.
We got ADSL, which was just as impressive. Unlimited data for one monthly fee, no more hogging the phone line. And now nobody bats an eye at gigabit fiber straight to their home.
Ahh, I did work experience at SGI in the UK and one of my tasks was to process the BABT approval invoice for the Indy's ISDN modem. Four days of fairly boring tasks, and one day of wonder spent in the Reality Centre in the week before it officially opened.
I dreamed of getting an Indy, too. The low-end version was "only" a few thousand. It seemed so within reach (never mind that it was impossible for a starving university student) and so much more fascinating than the moldering 486 at my parents' house.
> Sure, the machine might be faster than your current one, but except for the rare few cases when you actually utilize all that power, it won't provide a profoundly different user experience compared to what an iMac will deliver at a fraction of the cost.
The same could be said for a stronger car. It still only gets you from A to B. Except for the few times when you are not stuck in traffic, there is hardly any benefit from the bigger engine. And if you talk about the ergonomics of seats and comfort, well, nobody stops you from investing in better office gear.
The author has a point. Cool cars are faster, but that's not where coolness comes from.
The same goes for houses, computers and stuff. The cool computers have an indescribable 'thing' that you notice and like when it's there and miss when it's not.
I'm sure I've heard people talk about it before. 'The art of motorcycle maintenance' talks about 'quality'. Richard P. Gabriel talks about the 'quality without a name' in one of his books.
If by “cool” you mean exclusive and expensive, then good news: it’s still the same today! Obviously not for things like low resolution 3D graphics, web browsing, or word processing, but that’s because those things have been democratized already and the world has moved on. There are plenty of new frontiers though, you’ve just got to rip off the nostalgia goggles and embrace them.
Viable VR headsets for home use only came out last year (Oculus Quest). Nvidia's latest graphics card is sold out. People got rich mining cryptocoins on their computers (granted, 10 years ago, but still).
There are still many interesting things to do with computers.
Also, back in the day, people who liked computers were likely to become outcasts in society. They were not the cool kids.
This article kind of reads like a car magazine version of cool. Man, we had big huge V8's! We were slicing for any edge we could get! Change was in the air & we were proud to be on top! Our MHz were mighty, our caches large!
It still strikes me as ultra-consumerist, the right to nerd out via niche knowledge, geeking out over displacement & carbs & transmissions, pride in the machine.
What seems utterly absent is that operators used to be cool. There used to be wizards, masters of the machine, who could make it dance. We used to tune ourselves in, better harnessing the capabilities about us. But consumerism has won out. The pride in the numbers has taken over, & pushes on & on, to such great heights that the numbers are irrelevant, gratuitous. The close connection with the machine, the making of it into an extension of ourselves, has faded, has not been maintained; we no longer have those role models, and those operators of supreme power no longer figure in our imaginations.
Nowadays, almost all our experts are harnessed, lashed to massive industrial machines. We no longer have the local, scattered computing experts. There are those operating at and inside FAANG scale, those operating their sizable or medium-scale systems, and everyone else, and few of these are hyper-empowered individuals, few are wizards: all are either workers within their greater, vaster machines, or users of the expert technology they happen upon.
Computers are still cool. We are less so.
> there was a world of curious and wonderful applications written with nothing but SGI hardware in mind: web authoring, video editing, image manipulation and graphics creation unavailable on any other platform
More pointless consumeristic exclusivity.
> Yet there isn't, today, an equivalent of the SGI Indy, or the Sun SPARCstation, or the DEC Alpha, or any of the other professional workstations. The only thing that's on offer is more of the same user experience, only slightly faster.
Meh. You _can_ go spend a couple thousand on a 32- or 64-core Threadripper or Epyc. You can go stuff your system full of TBs of RAM, or dozens of GB/s of NVMe. You can go add 100Gbit Ethernet. You can go put in a colossal GPU. These will allow you new powers, but they are not reserved for some special class of machines. They are part of the same machines, with the same power as other machines, just with greater specs, greater numbers. Your claim to exclusivity, your ability to feel special, is diminished in this PC-compatible era. I'm not sorry to see these competencies subsumed, made generally available, no longer the preserve of elite, select niches and particular breeds of machines.
I do think there are still some magical bits of computing. To be honest, Intel's latest Ice Lake mobile chip is unlike anything else the world has seen. It doesn't have a lot of connectivity, only two ports of connectivity built into the chip, but each port can do 40Gbps of mixed USB or DisplayPort or PCIe, and that's magical. USB4 is good for 80Gbps of DisplayPort. We need to get on this kind of magic, this level of cool. Finally, the consumers have non-junk connectivity. 1Gbit ethernet finally made apparent as the tiny little straw it is, a joke, and USB4 does support the system-to-system connectivity to drive home what a sham we've been living with, an unending non-upgrading set of standard peripherals that haven't improved much at all.
But mostly, I think computers being cool comes down to an issue of consumerism. Of everyone experiencing the same mundane, application-centric, small-operating-system world. A world where the cloud has taken over, even for consumers, where we want our stuff accessible anywhere, anytime, from any device, and where what we do on any given box is hardly relevant and, at this point, well patterned & unoriginal, undifferentiated, all pretty much the same. The OS has no chance to differentiate, because all of the power, all of the use, all of the systems have kept effervescing higher & higher, away from us, into that great mighty cloud above. How can we be cool, when everything that's happening is so very far away?
> How can we be cool, when everything that's happening is so very far away?
You are conflicted here. On the one hand there's a whole new world available to tinkerers and poor people alike:
low cost SBCs and rich ecosystems of knowledge, peripherals, open source software packages, and maker communities. OTOH consumerism turns once cool and magical things into mundane commodities for the masses.
Being cool is no longer expressed by being able to wield the magic wand of exclusive and arcane insider knowledge or access to hardware.
Being cool means creating your own semi-autonomous robot. It means providing teaching tools to the less fortunate at affordable prices. It can be achieved by realising your creative potential in new and unexpected ways (smart mirrors, automated hydroponic gardens, interactive art, ...).
Many of the things you mentioned regarding applications moving away from us (and into the "cloud") were never even cool to begin with. Streaming is just TV in new clothes. Social media is just the digital equivalent of high school hallway gossip. [edit]I understand you never mentioned these particular applications by name, but it's what most people actually use the cloud and computers for these days[/edit]
The difference between cool and consumerism can be little things: a digital picture frame is boring consumerism. A digital picture frame that has a motion sensor, a camera and facial recognition that only turns on if you move towards it, identifies you, tries to guess your current mood and picks an image accordingly, however, would be something I'd consider to be pretty cool.
The best part about it is that you can probably build something like that for less than $100 and it'll work offline using OSS components only.
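The wake-on-motion plus face-detection part really is just a handful of lines with OpenCV - a rough sketch, where the mood model is a made-up placeholder and the threshold values are guesses:

```python
# Sketch of the "smart picture frame": wake on motion, look for a face,
# then hand off to whatever recognition/mood model you like.
# Uses only OpenCV; guess_mood() is a hypothetical placeholder, not a real API.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)              # the frame's camera
_, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # crude motion detection: mean absolute difference between frames
    motion = cv2.absdiff(gray, prev).mean()
    prev = gray
    if motion < 5:                     # nobody moving -> keep the display off
        continue

    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        # here you'd run your recognition/mood model and pick a picture, e.g.
        # mood = guess_mood(frame, faces[0])   # placeholder, swap in your own
        print("face detected: wake the display and pick an image")
```

Run that on any cheap SBC with a camera module and you're most of the way there; the interesting part is choosing what "picks an image accordingly" means for you.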
So consumerism is both a blessing and a curse - it enables products to be affordable for a general audience; at the same time it takes away the "magic aura" surrounding incredible technology and its applications.
The existence of computers that are out of reach for everyone except massive corporations and universities is a very weird definition of cool. Computers are cooler than ever. They're so cool they subsumed much of popular culture including the uncool people, but don't be fooled the cutting edge of cool is still happening on computers.
When looking at the Armari Magnetar [1] I'm torn between being wowed because it's the fastest workstation and being put off by the ugly case. I'd wish for something matte black, maybe with the aspect ratio of the monolith from Kubrick's 2001, or something like the design and materials of the NeXT.
This could have been titled, "when operating systems (or computers) were different from each other".
Now it's just a CPU, peripherals, a flavor of Unix, some mild window-management customization, and one of two browsers. Even Windows is finally getting on the Unix bandwagon (not counting Xenix....)