
From the "If I could" files, I would have liked to spent 5 years on an AS/400, trying to make it work for whatever company I was working for.

The best way to learn this stuff is simply to apply it, trying to solve problems.

Going from a high school PET to a college CDC NOS/VM Cyber 730 to an RSTS/E PDP 11/70 was a very educational cross-section of computing that really opened my eyes. If I had gone to school only a few years later, it would have been all PCs, all the time, and I would have missed that fascinating little window.

But I never got to go hands-on with an IBM or an AS/400, and I think that would have been interesting before diving into the Unix world.



The OS for the AS/400 is really remarkable as a "path not taken" by the industry, and remarkably advanced. Many of the OO architecture ideas that became popular with Java were baked into the OS.

https://en.wikipedia.org/wiki/IBM_AS/400

and of course it started out with a virtual machine in the late 1970s.


> Many of the OO architecture ideas that became popular with Java were baked into the OS

I disagree. OS/400 has this weird version of “OO” in which (1) there is no inheritance (although the concept has been partially tacked on in a non-generic way by having a “subtype” attribute on certain object types), and (2) the set of classes is closed, and only IBM can define new ones.

That’s a long way from what “OO” normally means. Not bad for a system designed in the 1970s (1988’s AS/400 was just a “version 2” of 1978’s System/38, and a lot of this stuff was largely unchanged from its forebear). But AS/400 fans have this marked tendency to make the system sound more advanced and cutting-edge than it actually was. Don’t get me wrong, the use of capability-based addressing is still at the research level on mainstream architectures (see CHERI) - but the OO stuff is a lot less impressive than it sounds at first. Like someone in the 70s had a quick look at Smalltalk and came away with a rather incomplete understanding of it.
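To make that concrete, here is a rough Java sketch (all names invented for illustration - this is not an actual OS/400 API) of what that model amounts to: a closed, vendor-defined set of object types, with “subtype” as a mere attribute rather than a subclass relationship:

    // Hypothetical sketch of OS/400-style "OO": the set of object types is
    // closed (only the vendor defines them) and "subtype" is just a code,
    // not a subclass. All names here are invented for illustration.
    import java.util.Map;

    public class ClosedObjectModel {
        // A fixed enumeration of object types -- users cannot extend it.
        enum ObjectType { PROGRAM, FILE, LIBRARY, DATA_QUEUE, USER_PROFILE }

        // Every object carries a type tag and an opaque subtype code.
        record SystemObject(ObjectType type, String subtype,
                            Map<String, Object> attributes) {}

        public static void main(String[] args) {
            // "Subtyping" is an attribute, not inheritance: two kinds of
            // file are both FILE, distinguished only by a code.
            var physical = new SystemObject(ObjectType.FILE, "PF", Map.of());
            var logical  = new SystemObject(ObjectType.FILE, "LF", Map.of());
            System.out.println(physical.type() + "/" + physical.subtype());
            System.out.println(logical.type() + "/" + logical.subtype());
        }
    }

Contrast that with a real class system, where users define new types and behavior travels with them.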

> and of course it started out with a virtual machine in the late 1970s.

If you consider UCSD Pascal's p-code or BCPL's OCODE, it was far from a unique idea in the 1970s. It is just that many of those other ideas ended up being technological dead ends, hence many people aren’t aware of them. I suppose ultimately the AS/400 is slowly turning into a dead end too, it has just taken a lot longer. I wouldn’t be surprised if in a few more years IBM sells off IBM i, just like they’ve done with VSE.


I'll say this. There is more than one side to "object orientation".

A better comparison would be between the AS/400 architecture and Microsoft's COM. That is, you can write COM components just fine in C as long as you speak Hungarian. This kind of system extends "objects" across space (distributed, across address spaces, between libraries and applications) and time (persistence), and the important thing, I think, is the infrastructure to do that, not particular ideas such as inheritance.
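To illustrate what I mean by infrastructure: the heart of COM is not inheritance but interface negotiation - IUnknown's QueryInterface. Here's a loose Java sketch of that pattern (the Java names are mine, not COM's actual API):

    import java.util.Optional;

    // A loose Java analogue of COM's central move: you never ask "what
    // class is this?", you ask "can you hand me this interface?".
    interface Unknown {
        <T> Optional<T> queryInterface(Class<T> iface); // cf. IUnknown::QueryInterface
    }

    interface Persistable { byte[] save(); }                 // extension over time
    interface RemoteCallable { byte[] invoke(byte[] req); }  // extension over space

    class Component implements Unknown, Persistable {
        public byte[] save() { return new byte[0]; }

        public <T> Optional<T> queryInterface(Class<T> iface) {
            // No class hierarchy needed: the object reports what it supports.
            return iface.isInstance(this)
                    ? Optional.of(iface.cast(this))
                    : Optional.empty();
        }
    }

    class Demo {
        public static void main(String[] args) {
            Unknown obj = new Component();
            obj.queryInterface(Persistable.class)
               .ifPresent(p -> System.out.println("saved " + p.save().length + " bytes"));
            System.out.println("remote? "
                    + obj.queryInterface(RemoteCallable.class).isPresent());
        }
    }

The plumbing - discovery, marshalling, persistence - is the interesting part, and none of it depends on inheritance.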

When I started coding Java in 1995 (before 1.0) it was pretty obvious that you could build frameworks that could do that kind of extension over space and time, and I did a lot of thinking about how you'd build a database designed to support an OO language. Remember, serialization didn't come along until Java 1.1, and it and RMI were still really bad, and really cool ideas built on top of them often went nowhere, see

https://en.wikipedia.org/wiki/Tuple_space#JavaSpaces

there was the CORBA fiasco too. What's funny is that it just took years to build systems that expressed that potential, and most of them are pretty lightweight, like what Hazelcast used to be: distributed data structures like IBM's 1990s coupling facility, but so easy. (Not knocking the current Hazelcast; you can probably still do what I used to with it, but I know they've added a lot of new stuff that I've never used.) Or the whole Jackson thing, where you can turn objects into JSON without a lot of ceremony.
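The Jackson bit really is that short. A minimal round-trip with Jackson's real ObjectMapper API (the Point class is just a stand-in):

    // Requires the com.fasterxml.jackson.core:jackson-databind dependency.
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class JacksonDemo {
        // A plain value class; Jackson reads and writes the public fields.
        public static class Point {
            public double x;
            public double y;
            public Point() {}   // Jackson needs a no-arg constructor
            public Point(double x, double y) { this.x = x; this.y = y; }
        }

        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            String json = mapper.writeValueAsString(new Point(1.5, -2.0));
            System.out.println(json);  // {"x":1.5,"y":-2.0}
            Point back = mapper.readValue(json, Point.class);
            System.out.println(back.x + ", " + back.y);
        }
    }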

The more I think about it, objects have different degrees of "reification". A Java object has an 8-16 byte header to support garbage collection, locks, and all sorts of stuff. That's an awful lot of overhead for a small object like a complex number type, so they are doing all the work on value types to make a smaller kind of object. If objects are going to live a bigger life across space and time, those objects could get further reification, adding what it takes to support that lifetime.
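Back-of-the-envelope, assuming a 16-byte header (the real figure is JVM-dependent; HotSpot is typically 12-16 bytes):

    // Rough arithmetic on Java object overhead for a complex number type.
    public class OverheadDemo {
        static final class Complex {
            final double re;  // 8 bytes of payload
            final double im;  // 8 bytes of payload
            Complex(double re, double im) { this.re = re; this.im = im; }
        }

        public static void main(String[] args) {
            int payload = 16;  // two doubles
            int header  = 16;  // assumed header size (varies by JVM)
            System.out.printf("payload %d B + header %d B -> %.0f%% overhead%n",
                    payload, header, 100.0 * header / (payload + header));
            // Project Valhalla's value classes aim to shed this header so a
            // Complex can be flattened into arrays and fields like a primitive.
        }
    }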

I worked on something in Python that brought together the worlds of MOF, OWL and Python, similar to the Lisp meta-object protocol

https://ai.eecs.umich.edu//people/dreeves/misc/lispdoc/mop/i...

where there is a concept of classes and instances built on top of the base language, so you can more or less work with meta-objects as if they were Python objects, but with all sorts of additional affordances.


Yes, AS/400 / IBM i is the other IBM OS I like to play with (I have an actual AS/400e at home), and in a lot of ways I consider it to be the polar opposite of MVS on the mainframe:

Where MVS seems to be missing very simple abstractions that I took for granted, AS/400 abstracts way more than I'm used to, differently, and most importantly far away from the very, very "file-centric" view of today's systems that was derived from UNIX. It indeed shows you what computing could have been, had AS/400 been more open and had those ideas spread farther.

Before I got to know AS/400, I thought UNIX was great, and that it rightfully took over computing. Now, not so much, and I've started to see how detrimental the "everything is a file" concept that UNIX brought into the world actually was to computing in general.


> From the "If I could" files, I would have liked to spent 5 years on an AS/400,

pub400.com still exists and probably will for 5 more years. Not sure to what extent you can make it work for a company, but you can at least do learning projects on it.



