
It’s funny how I used to think of operating systems, databases, and cloud systems as these kinds of arcane things that were somehow apart from the rest of software.

I could not begin to grasp how they could deal with running programs, access control, or any of these things that once seemed so foreign and mysterious to me.

And realising that it’s actually just software, that there’s no black magic to it, just lower-level APIs, less pre-built batteries-included stuff to rely on, and probably a whole lot more complexity, that was… at once mind-blowing, exciting, and kind of a let-down.

And it has enabled me to have more realistic expectations from these tools.

It doesn’t, however, take away any of the brilliance and effort that went into these things, or lessen the credit due to those behind them. I can only begin to imagine the complexity involved in a full-fledged modern OS.



If you wanted to learn, I really recommend Operating Systems: Three Easy Pieces (OSTEP). I thought it was excellent and pretty easy to follow. https://pages.cs.wisc.edu/~remzi/OSTEP/


There are many things in OSTEP that I found really eye opening. One thing that stuck with me was that at some point, when discussing virtualization, it referred to an OS as a virtual machine, and that really changed the way I look at operating systems now.

> That is, the OS takes a physical resource (such as the processor, or memory, or a disk) and transforms it into a more general, powerful, and easy-to-use virtual form of itself. Thus, we sometimes refer to the operating system as a virtual machine.


Oh that’s a great quote. Really good framing.


When Ken Thompson was asked what he thought of virtual machines, his response was: "they are such a good idea I wrote one. You may have heard of it: Unix."


Thanks!


It's turtles all the way down. Layers of abstraction stacked on layers of abstraction.


Until you get to the physics of transistors and logic gates :)

Though I suppose in a sense, even subatomic particles are abstractions over quarks. It’s just that those are no longer abstractions we control.


I realized the brilliance of the name Quartz for the graphics framework. It’s what happens to “rendered” silicon.


Quartz glass is strong and very clear, it implies sharp, clear images.


Yup. APIs and constraints all make engineering sense. Turtles, all the way down... except when you get to quantum mechanics, that stuff is whack.


>QM is whack

Very much so. But I also think it always had to be that way. A universe of infinite classical regression (atoms made of atoms made of atoms...) would be more insane than whack.


If I had to guess, I'd say infinite regression is how the real world works, and quantum mechanics is how our reality's simulator lazily computes interactions.


Would that make things like wormholes similar to a heartbleed attack?


Why I like Arduino.

Why I kind of like the old IBM PC environment: x86 + a few BIOS calls, you don't even need DOS after you load game.exe.


Yeah, NetWare used to boot DOS and then load its kernel over it. I feel like computers used to be cooler.


Well, that is how it worked on Windows as well (up until Windows ME): load DOS first and then Windows.

You could still do it that way if you want. Windows 11 needs a boot loader. No reason you cannot write a DOS based one.

The comment about game.exe above probably refers to the days of “DOS Extenders” that games were based on. You would launch a game from DOS and, instead of just running a 16-bit DOS executable, it would switch into 32-bit mode and use a “DOS extender” to run the game.

“DOS extenders” are really just operating systems, though, that use DOS as a boot loader. Popular options for DOS games and programs were DOS/4GW, Phar Lap, and Quarterdeck (from memory). There were many others.

Windows 95 was really a “DOS extender” too.


The game.exe bit was kind of referring to how a lot of games would just completely ignore DOS even before the days of protected mode and 32 bits; I'm going back to the 8088 days.

But yeah, same idea.

Actually, now that I remember it, my first Linux was a copy of ... I think Dragon Linux? It used DOS as a bootloader, it ran off the existing DOS FAT filesystem so you didn't have to mess with partitions, and it dealt with the filenames-too-short issue by keeping a dot file in each directory with a list of the 'real' Linux long names and how they matched the DOS fake short names.

But even Windows and Linux have layers and layers of abstractions. Trying to do anything through X11 or Win32 API calls, compared to the before-times where you just wrote directly into RAM at A0000 to draw pixels... there is something about that that is very interesting.
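For flavour, here's roughly what that looked like; a minimal sketch assuming a 16-bit DOS compiler such as Turbo C (dos.h provides int86() and MK_FP()), shown only to illustrate the idea:

```c
/* Set VGA mode 13h (320x200, 256 colours) via the BIOS, then draw by
 * writing bytes straight into the framebuffer at segment A000h. */
#include <dos.h>
#include <conio.h>

int main(void)
{
    union REGS r;
    unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
    int x;

    r.x.ax = 0x0013;          /* INT 10h, AX=0013h: set mode 13h */
    int86(0x10, &r, &r);

    for (x = 0; x < 320; x++) /* one white (colour 15) scanline on row 100 */
        vga[100 * 320 + x] = 15;

    getch();                  /* wait for a key so the line is visible */

    r.x.ax = 0x0003;          /* back to 80x25 text mode */
    int86(0x10, &r, &r);
    return 0;
}
```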


Interesting. An odd rabbit hole. I've never heard of this kind of thing existing before Ubuntu Wubi (though I don't know why that surprises me; it's not like there was no need for such a thing before that):

> DragonLinux was a distribution of Linux that had the ability to be installed on a loopback file on an existing FAT16 or FAT32 partition.

https://sourceforge.net/apps/wordpress/dragonlinux/

In the readme linked there, it says something different:

> DragonLinux is a Linux distribution which runs on top of windows with no partitioning needed as long as the Windows OS is sitting on a FAT32 partition. DragonLinux is fully supported on Windows ME and below. Work for support with Windows 2000 and Windows XP is in the works.

> DragonLinux is a UMSDOS distribution, compared to a Loopback filesystem of vr2r1... That change was made to eliminate the 2GB disk space boundary with the previous version, and to simplify installation and expansion (ie the file system grows as needed).

> The Full Version is a fully loaded version with GNome and several other smaller window managers, and the full line of tools for GNome. The lite version is a full console version with everything from the full version excluding the X environment, therefore being able to be run on older machines.

Looking this up more brings me to the umsdos project, a filesystem that apparently runs on top of another filesystem:

https://tldp.org/HOWTO/UMSDOS-HOWTO.html and https://en.wikipedia.org/wiki/FAT_filesystem_and_Linux

This brings me to loadlin:

> loadlin is a Linux boot loader that runs under 16-bit real-mode DOS (including the MS-DOS mode of Windows 95, Windows 98 and Windows Me startup disk). It allows the Linux system to load and replace the running DOS without altering existing DOS system files.

https://en.wikipedia.org/wiki/Loadlin

And then there's also grub4dos

https://github.com/chenall/grub4dos


> I've never heard of this kind of thing existing before Ubuntu wubi

WUBI wasn't the same at all.

WUBI makes a single big file and formats it with a Linux filesystem.

`umsdos` kept Linux files as DOS files directly in the FAT16 filesystem, with a hidden file in each directory containing additional Linux metadata. No Linux filesystem anywhere.

You could even run a DOS defragger and your Linux would survive. :-)

I used a distro called Pygmy Linux that worked this way.


It still kind of blows my mind that basically all this reduces to assembly and then machine language.

I’m of the age where my first computing experiences were on green screens, PET computers, networked VAX machines at the colleges my parents taught at and so on.

I was aware of what assembly was back then, and the idea that the game or basic business program I was using was just a sequence of MOVE, ADD, eax, and so on. I just couldn't quite get how you could be smart enough to use that to do something useful.

And everything we see today is basically just a bunch of moves and adds and pokes a layer or three below the surface.
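To make that concrete, here's a toy example; the assembly in the comment is only roughly what a compiler might emit for x86-64 (System V calling convention), and the exact output varies by compiler and flags:

```c
/* A tiny C function, with a comment showing approximately the handful of
 * moves and adds it boils down to after compilation. */
int add_three(int a, int b, int c)
{
    return a + b + c;
    /* Roughly (x86-64, optimised):
     *   lea  eax, [rdi + rsi]   ; eax = a + b
     *   add  eax, edx           ; eax += c
     *   ret
     */
}
```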


It hits you in the face: how do you eat an elephant? A bite at a time. One API, one abstraction at a time.


The thing that made me realize this was compilers!

"Wait, code is just text?"

"Always has been"

I still have to remind myself of this sometimes when I think "woah, how does this work?" and then I try to step through how it might be built.


I don’t know that code was always text, but I’m thankful to have been born in the days when programming assembly on punch cards was no longer the only way.


The early days of programming were also text, in the form of pencil on paper; then the program was transferred to the machine in various manual ways, such as flipping toggle switches, followed by punch cards. But the actual process of writing programs was always text. Even before electronic computers, people wrote down algorithms on paper to be executed by themselves, or by rooms full of people whose job title was ‘computer’.


My first experience of programming was a 1970s 8085 kit with a hex keypad and a few digits of 7 segment LED display. The only book was Intel's technical reference manual for the 8085, and my first big project (a full year university project) involved conceiving the right program, writing it on paper in assembly language, hand assembling it, entering it as hex (a few hundred bytes). It worked first time because it pretty much had to work first time so I was very very careful.


Yeah, it's all a bunch of programs doing their thing. One draws a GUI, another talks to the network adapter, and so forth. And when you click a launcher icon, a loader program is started that (more or less) copies a program into RAM, and when all is set up, it tells the OS (and the hardware) to run that, too.
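In POSIX terms, the "loader program" part is more or less this sketch, using fork() and execv(); the kernel's own loader does the actual copying-into-RAM when exec is called:

```c
/* Minimal launcher: create a child process, then ask the kernel to replace
 * it with the target program. The kernel maps the binary into memory and
 * starts running it. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = fork();
    if (pid == 0) {
        char *argv[] = {"/bin/ls", "-l", NULL};
        execv(argv[0], argv);   /* only returns on failure */
        perror("execv");
        exit(1);
    }
    waitpid(pid, NULL, 0);      /* wait for the launched program to exit */
    return 0;
}
```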


Yes, a modern OS is a tangled mess of microservices.


They just don't communicate using JSON. I can see it now: Kernel JSON (aka kjson) support in the Linux kernel to attract Node developers.


Seriously, I wish /proc was available as a tree of JSON documents as opposed to all these different ad hoc plain text formats. It would make using /proc a lot easier and more reliable.
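For comparison, this is the kind of ad hoc scraping the current text formats force on you; a small Linux-specific sketch that pulls MemTotal out of /proc/meminfo with sscanf:

```c
#include <stdio.h>

int main(void)
{
    FILE *f = fopen("/proc/meminfo", "r");
    char line[256];
    unsigned long kb;

    if (!f) { perror("fopen"); return 1; }
    while (fgets(line, sizeof line, f)) {
        /* lines look like "MemTotal:       16326372 kB" */
        if (sscanf(line, "MemTotal: %lu kB", &kb) == 1) {
            printf("MemTotal = %lu kB\n", kb);
            break;
        }
    }
    fclose(f);
    return 0;
}
```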


Could be unnecessarily costly unless the files are written ahead-of-time. I propose some kind of GraphQL-like interface to only read what you need.


/proc files are generated on-demand as you read them.

Generating JSON couldn’t be significantly more costly than the current ad hoc plain text formats are.


Just take the Unix philosophy and replace the word "file" with "JSON file".


I think that’s one of the advantages of coming into computing in the early 80s. Computers like the Apple ][ were completely understandable by a single person (and it’s still my mental model for how a computer works 40 years later).


For me, this was the moment when computers were no longer “magical” to me, and I found comments to that effect (even from those who knew more than I did and were more senior) odd and kinda culturally cultish.



