
The display, RAM, and other peripherals are consuming power too. Short of running continuous high CPU loads, which most people don't do on laptops, changes in CPU efficiency have less apparent effect on battery life because the CPU is only a fraction of overall power draw.


And Neal Stephenson acknowledged it was obsolete in 2004:

"I embraced OS X as soon as it was available and have never looked back. So a lot of 'In the beginning was the command line' is now obsolete. I keep meaning to update it, but if I'm honest with myself, I have to say this is unlikely."

https://slashdot.org/story/04/10/20/1518217/neal-stephenson-...

But people still dredge this quarter-century-old apocrypha up and use it to pat themselves on the back for being Linux users. "I use a Hole Hawg! I drive a tank! I'm not like those other fellows because I'm a real hacker!"


Neal gave up on Linux because he wasn't a developer. He couldn't take advantage of the freedoms it provided, and it worked the same for him as any proprietary OS would. I.e., he had the excuse that programming is hard, a specialty that requires much practice. This is an ongoing issue with free software and is why it is niche... it primarily appeals to software developers, as they are the only ones who can take advantage of the freedoms it provides and the ones who truly sacrifice that freedom when they use a non-free OS.


Yeah, this is basically my take too. I had a hardcopy of this sometime between '99 and '02, and read it several times.

At the time I was an embedded developer at Microsoft and had been a Windows programmer in the mid 90s. It was pretty clear that there was some Dunning-Kruger going on here. Neal knew enough about tech to be dangerous, but not really enough to be talking with authority.


Given what OS X has become, it's un-obsoleted itself again.

It's kind of ironic that you're using a post from 20 years ago to invalidate an essay from 25 years ago, about an OS that's been substantially dumbed down in the last 10 years.

Bad corporate blood will tell.


I also "embraced OS X as soon as it was available". My first Linux install was Yggdrasil, but I cut my teeth on SPARCstations. I respected two kinds of computer: Unix workstations, and the Macintosh.

So when Apple started making workstations, I got one. I've been a satisfied customer ever since.

I have no idea whatsoever what dumbing down you're referring to. The way I use macOS has barely changed in the last ten years. In fact, that's a major part of the appeal.


Seconded. Make it 20 years.


Just got a Windows 11 machine. Had to, to run Solidworks. Have it next to a new M3 iMac. Both are configured with the same apps. Despite my not having used Windows in 10 years, these machines behave identically. But Windows 11 is snappier. And you can actually find things when you don’t know where they are!

I was amazed.


With all the ideological hate against Microsoft/Windows (even among its long-term users, it seems), everybody seems to miss the part where Windows 11 is actually pretty good; I would say it's in some ways superior to macOS nowadays, especially with PowerToys.

For starters, it is much less annoying from a security/notification standpoint: you can tell it to fuck off and let you do your thing if you know what you are doing.

macOS isn't too bad yet but is clearly lagging behind. Apple is unwilling to meaningfully improve some parts and seems to refuse to sherlock some apps because doing so clearly goes against their business interests. They make more money earning the commission on additional software sales from the App Store, a clear conflict of interest. They got complacent, just like Valve with all the money from running its marketplace.


> For starter, it is much less annoying from a security/notification standpoint, you can tell it to fuck off and let you do your things if you know what you are doing.

In many corporate environments you can't.


macOS is behind. And this is speaking as someone who probably owns one of everything Apple makes, Apple stock, and was exclusively Mac for the last 10 years.

I have less than 50 hours of use on my Windows 11 machine, a midgrade Lenovo P358 rig I bought renewed because it had plenty of memory and an Nvidia T1000 card. Yet it taught me that the test of an operating system is how quickly you can navigate around, and how well it can find things, given only clues. Windows 11 is just snappier, quicker, than the latest macOS running on a new M3 Mac.


This is also my experience. Worse, it is snappier on a 10-year-old computer (top of the line, though) than on an expensive 24GB M2 MacBook Pro.

There is some software that I find nice and convenient in macOS but it has gotten really hard to justify the price of the hardware considering the downsides.


I’m not knocking Windows; I’m just kind of hooked on the Mac trackpad and a few little things about the Mac that I prefer now. I use a networked Windows machine semi-regularly and there’s nothing wrong with it. I just remember the days of BSODs and viruses and random shutdowns to install updates that couldn’t be stopped in the middle of the workday, and 1000 other little niggles that make me choose a Mac. I’m sure contemporary Windows machines, if configured right, are totally fine, better even - my housemate keeps touching my Mac screen to scroll because she has a Windows laptop that comes with a touchscreen, and I can see how that would be handy on the Mac.


I will never understand the sentiment that macOS has been “dumbed down.”

It’s a zsh shell with BSD utils. 99% of my shell setup/tools on Linux just work on macOS. I can easily install the gnu utils if I want 99.9% similarity.

I very happily jump between macOS and Linux, and while the desktop experience is always potentially the best on Linux (IMO nothing compares to hyprland), in practice macOS feels like the most polished Linux distro in existence.

Do people just see, like, some iOS feature and freak out? This viewpoint always seems so reactionary. Whereas in reality, the macOS of the past that you’re pining for is still right there. Hop on a Snow Leopard machine and a Ventura machine and you’ll see that there are far, far more similarities than differences.


I stopped using Mac because of the planned obsolescence, which is a huge problem with machines even 4 years old. Can't even update those BSD utils anymore because you can't update the OS, because you don't have the latest hardware, because you don't want to spend 2k more to get back the basics you had already paid for.


MacOS Sequoia supports every computer 4 years old. Almost no apps or CLI programs will absolutely require it unless they’re specifically written to use some brand-new feature. I’m running it on my 2018 Mac Mini.

If you wish Apple supported computers longer, fine. I’d personally disagree because I’ve had wonderful luck with them supporting my old hardware until said hardware was so old that it was time to replace it anyway, but would respect your different opinion. Don’t exaggerate it to make a point though.


It's ridiculous that you would claim this isn't a problem.

I'm typing this on a 12 year old MacBook Pro running Debian whose hardware is perfectly fine, but which hasn't been supported by Apple in years.

FWIW, Debian supports it fine, though NVidia recently dropped support for the GPU in their Linux drivers.

I'm going to miss it when it dies, too. Plastic Lenovos just can't compare.


I never said any such thing. I said it’s not a problem for me but that others may have a different opinion. And you could still use an older OS on that Mac; the ones it shipped with don’t magically stop working.


You're not a typical macOS user. The typical users I know do not know how to use Finder, let alone shell commands, to navigate the file system. Personally I have no issues developing on Mac, Linux, or Windows because I'm at an advanced level. But for the same reason, I prefer Linux or even Windows because those provide more freedom to the developer.


You are getting downvoted by the fanboys, but this is exactly my experience too. There is a type of macOS user that is expert (in technology in general), but those are by far the minority (and it is shrinking).

For the most part, the macOS user is of the religious-zealot type, and they barely know how to do the basics, far worse than your average seasoned Windows user, even though in principle macOS should be easier to handle (in practice that's not exactly true, but still...).

People here who seem to think otherwise really live in the reality distortion field, and it seems to be linked to the mythical Silicon Valley "hacker". At first, I drank the kool-aid on that definition, but it actually seems pretty disrespectful to "real" hackers; but whatever, I guess.


In what way has it been “dumbed down?” I use modern MacOS as a Unix software development workstation and it works great- nothing substantial has changed in 20 years other than better package managers. I suppose they did remove X11 but it’s trivial to install yourself.


Not GP, but usually when people talk about the "dumbing down" of macOS, they refer to new apps and GUI elements adopted from iOS.

macOS as an operating system has been "completed" for about 7 years. From that point, almost all additions to it have been either focused on interoperation with the iPhone (good), or porting of entire iPhone features directly to Mac (usually very bad).

Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then. If Apple were to build a desktop OS today, there's no way they would make it the best Unix-like system of all time.


> Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then.

This also applies to Windows, by the way (except it’s more like 20-30 years ago).


Whereas Linux never stopped coming up with new ideas, but doesn't have the manpower to implement them.


systemd!

(Currently struggling with the way systemd inserts itself into the DNS query chain and then botches things.)


The how-hard-can-it-be-and-who-cares-anyway approach to replacing basic system components. Love it.


It likes to fail over to the secondary server, doesn't it?


There are so many known systemd-resolved bugs [1][2] that I can't tell which one was breaking both of my simple Ubuntu desktop machines. Systemd-resolved sets itself up as the sole DNS resolver and then randomly reports it can't reach any DNS servers.

[1] https://github.com/systemd/systemd/issues?q=is%3Aissue+is%3A...

[2] https://www.reddit.com/r/linux/comments/18kh1r5/im_shocked_t...
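For anyone hitting this, resolvectl can at least show what systemd-resolved thinks is going on (this assumes a distro where systemd-resolved is the active resolver):

```shell
# Show the DNS servers systemd-resolved is using, per link
resolvectl status

# Run a query through the stub resolver to see where it fails
resolvectl query example.com

# Flush the cache before re-testing
sudo resolvectl flush-caches
```

None of this fixes the underlying bugs, but it narrows down whether the stub resolver or the upstream server is the one failing.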


Yes and... the tools are now highly distro-specific. I don't want to allocate my study time to resolvectl, I want to allocate it to programming, but my home server requires me to be a beginner again in something that was easy a decade ago. And I am not getting anything of value for that trade.


It likes to botch things.


Which is why I gave up on it. Was tired of something in my workflow breaking every 6 weeks because “ooh shiny”


> Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then. If Apple were to build a desktop OS today, there's no way they would make it the best Unix-like system of all time.

Many of those ideas came from NeXT, so more like 30 years ago.


I don't see how any of that is an issue... basically you can now run iOS software, which is great, and there are some interface and design elements from iOS - which frankly has a great interface - and they're improvements I like.

I agree there is some conceptual inconsistency- which I see on almost all OSs nowadays, but Windows 8 being the most egregious example, where you are mixing smartphone and traditional desktop interface elements in a confusing way.


Yeah, that's fair and I concur, but GP is right.

However, and unfortunately, I feel your last statement is spot tf on! Our only hope, I guess, is that they have incurred enough tech debt to be unable to enshittify themselves.

For those not in the know, Apple is an OG hacker company; their first product was literally a blue box! Why this matters, why GP is correct, why Linux peeps get in a tizzy, and what Stephenson was getting at with the batmobile analogy, is that traditionally, if hackers built something consumer-facing, they couldn't help themselves but bake in the easter eggs.


With each new version it has become increasingly hostile to installing new software, particularly open-source software that hasn't been "signed" by a commercial developer, throwing up huge warning windows suggesting that anyone daring to run such stuff is taking a huge risk. And many of the standard UNIX locations have become locked down making it impossible to install stuff there. It's clear that Apple would like to see a world where everything is installed on a Mac via their App Store and everyone writing Mac software are official paid developers as with their phones.


I don’t understand this sort of comment. The warning windows aren’t “huge”. In practice is clicking through the dialog any more cumbersome than typing sudo and entering your password? In reality is the dialog any less appropriate for the average Linux desktop user?

Is locking down the System folder any more problematic than AppArmor, and any less useful for system integrity? Putting everything from brew under /opt follows UNIX conventions perfectly fine, definitely more than using snaps in Ubuntu for basic command-line utilities. And installing whatever you want on macOS is just as easy as it is on Ubuntu.

This sort of complaint just gets so boring and detached from reality, and I’m not saying that you don’t use macOS but it reads like something from someone who couldn’t possibly be using it day-to-day. For me it’s a great compromise in terms of creating an operating system where I can do anything that I would do in Linux with just as much ease if not more, but also not have to provide tech support on for my elderly parents.


I wouldn't mind in the least if it was a matter of using sudo. That's a logical elevation of privileges. MacOS already does this at points, asking you for your password (which if you are an administrator is basically running sudo for you). These warning messages and locking down the /usr hierarchy (even with sudo) are different as they aren't asking for more access but merely to spread FUD about open access software (yes, you can use brew if the program you want is in it, but that is just adding another garden even if less walled, and it works because someone in the Homebrew project is signing the binaries).

I have used UNIX/Linux on a daily basis for over 30 years, and OSX/MacOS daily for over 15 years. I know how UNIX systems work and where things traditionally are located. And until a few years ago MacOS was a reasonable UNIX that could be used more or less like a friendly UNIX system -- but it is becoming increasingly less so.


Having to go into the terminal to run xattr in order to remove the quarantine attribute is a lot to ask of a non-technical user.
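For reference, the relevant macOS tool is xattr; a typical sequence looks like this (the app path is just an example):

```shell
# List extended attributes on a downloaded app bundle;
# quarantined downloads carry com.apple.quarantine
xattr /Applications/Example.app

# Remove the quarantine attribute recursively, since it is
# set on every file inside the bundle
xattr -dr com.apple.quarantine /Applications/Example.app
```

Easy enough once you know it, but exactly the kind of thing a non-technical user will never discover on their own.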


You are moving the goalposts. Not only are there some "security" features that you can't disable and that are of dubious actual usefulness, like the sealed system partition, but they make it much harder to actually hack around the system and modify stuff as you see fit. It has also complicated the installation and use of a range of software, which is more annoying than it should be.

The openness and freedom to modify it like an open UNIX was a major selling point; losing all that for "security" features that mostly appeal to the corporate world is not great. Those features also need to be proven useful, because as far as I'm concerned, it's all theory; in practice I think they are irrelevant.

The notification system is as annoying and dumb as in iOS, and the nonstop "security" notifications and password prompts are just a way to sell you on the usefulness of biometrics; which Apple, like the big morons they are, didn't implement in a Face ID way on laptops/desktops, the place where it made the most sense to begin with. Oh, but they have a "nice", totally-not-useless notch.

Many of the modern apps are ports of their iOS versions, which makes them feel almost as bad as webapps (worse, if we are talking about webapps on Windows), and they are in general lacking in many ways, both from a feature and a UI standpoint.

Apple Music is a joke of a replacement for iTunes, and I could go on and on.

The core of the system may not have changed that much (except that your data is less and less accessible, forcibly stored in their crappy, obscure iCloud folders/DBs with rarely decent export functions), but as the article hinted very well, you don't really buy an OS, just like nobody really buys solely an engine. A great engine is cool and all, but you need a good car around it to make it valuable, and this is exactly the same for an OS. It used to be that macOS was a good engine with a great car around it, in the form of free native apps that shipped with it or 3rd-party ones. Nowadays, unless you really need the benefits of design/video apps heavily optimized for Apple platforms, it increasingly is not a great car.

Apps around the system aren't too bad but they are very meh, especially for the price you pay for the privilege (and the obsolescence problem already mentioned above).

It's not really that macOS has regressed a lot (although it has in some ways, in the iOSification process) but also that it didn't improve a whole lot, while the price and other penalty factors increased a lot.

But I doubt you can see the light, you probably are too far in your faith.


These are great features IMO, as a unix savvy 'power user.'

A system should be heavily locked down and secure by default unless you really know what you are doing and choose to manually override that.

Modern MacOS features add an incredible level of security- it won't run non-signed apps unless you know what you're doing and override it. Even signed apps can only access parts of the filesystem you approve them to. These things are not a hassle to override, and basically make it impossible for hostile software to do things that you don't want it to.


I stopped using it a few years ago, but IMO it was definitely being dumbed down and not respecting users any more. Things like upgrades resetting settings that I went out of the way to change - Apple has a "we know better than you" attitude that's frustrating to work around.


Which settings? I am a long term MacOS (and Linux) user and have not noticed such problems.


The "Allow Apps from Anywhere" setting, amongst others.

Linux works better for me, anyway.


Just using it at a barely advanced level for 20 years or so, as I do, the other comment was correct in that it is the changes that have seemingly been made to make it more familiar to iOS users and “idiot proof”.

Mainly slowly hiding buttons and options and menus that used to be easily accessible, which now require holding a function key, re-enabling in settings, or using the terminal to bring them back.


Off the top of my head:

- The settings app is now positively atrocious, "because iPhone"

- SIP is an absolute pox to deal with.

- "Which version of Python will we invoke today" has become a fabulous game with multiple package managers in the running

- AppCompat games.

- Continued neglect for iTunes (which is now a TV player with a "if we must also provide music, fine" segment added - but it still thinks it should be a default client for audio files)

- iCloud wedging itself in wherever it can

Yes, all of those can be overcome. That's because the bones are still good, but anything that Apple has hung off those bones since Tim Cook took over is at best value-neutral, and usually adds a little more drag with every new thing.

Don't get me wrong, I still use it - because it's still decent enough - but there's definitely a trajectory happening.


Settings - I preferred the rectilinear layout, but I don't see why making it linear makes it atrocious.

If you don't want SIP, it will take you a few minutes to reboot and switch it off permanently (or perhaps until the next OS upgrade). This is really the only one in the list which has to be "overcome", and personally I think that SIP enabled by default is the right choice. Anyone who needs SIP disabled can work out how to do that quickly - but it is years since I've had a reason to do it even temporarily, so I suspect the audience for this is small.

Multiple package managers and Python: that sounds like a problem caused by running multiple third party package managers.

If you want games, x86 or a console is the preferred choice. An issue for some, decidedly not for others. I'd much rather have the Mx processor than better games support.

iTunes - I can't comment, I don't use it.

iCloud - perfectly possible to run without any use of iCloud, and I did for many years. I use it for sync for couple of third party apps, and it's nice to have that as an available platform. It doesn't force its way in, and the apps that I use usually support other platforms as well.


OS X started going downhill as soon as they replaced Spaces and Exposé with Mission Control.


Consolidating Spaces and Exposé is not one of the things they did that hurt Mac OS X.


I could not disagree more strongly.

Losing them as separate features, and more importantly taking Spaces from a 2D array of desktops to a 1D array of desktops, ruined it substantially.


> There was a competing bicycle dealership next door (Apple) that one day began selling motorized vehicles--expensive but attractively styled cars with their innards hermetically sealed, so that how they worked was something of a mystery.

Neal said the essay was quickly obsolete, especially in regards to Mac, but I'll always remember this reference about hermetically sealed Apple products. To this day, Apple doesn't want anyone to know how their products work, or how to fix them, to the point where upgrading or expanding internal hardware is mostly impossible.


The difference between Apple and IBM is the latter lost control of their platform and those who inherited, arguably Intel and Microsoft, had no interest in exerting absolute control. (If they tried, it likely would have backfired anyhow.)

As for Apple, their openness comes and goes. The Apple II was rather open, the early Macintosh was not. Macintosh slowly started opening up with early NuBus machines through early Mac OS X. Since then they seem to be closing things up again. Sometimes it was for legitimate reasons (things had to be tightened up for security). Sometimes it was for "business" reasons (the excessively tight control over third-party applications for iOS and the incredible barriers to repair).

As for the author's claims about their workings being a mystery, there wasn't a huge difference between the Macintosh and other platforms. On the software level: you could examine it at will. At the hardware level, nearly everyone started using custom chips at the same time. The big difference would have been IBM compatibles, where the chipsets were the functional equivalent of custom chips yet were typically better documented simply because multiple hardware and operating system vendors needed to support them. Even then, by 1999, the number of developers who even had access to that documentation was limited. The days of DOS, where every application developer had to roll their own hardware support were long past. Open source developers of that era were making a huge fuss over the access to documentation to support hardware beyond the most trivial level.


I think it's still quite relevant. There are still people who enjoy a more DIY OS. Nowadays they use Arch or something. That doesn't make them any better or worse than anyone else. Some people enjoy tinkering with their OS; other people just want something that Just Works(tm), and there is nothing wrong with that.

There is an implicit superiority in the text which is just as cringey now as it was at the time, but I think it's still a good analogy about the different preferences and relationships different people have with their computers.


> I keep meaning to update it, but if I'm honest with myself, I have to say this is unlikely."

IME he hates to revisit anything he's already written so this claim is polite but implausible.


"Obsolete" is too strong a word, I think. OSX isn't an evolution of the Macintosh's operating system; That'd be Pink, which was even mentioned, and it crashed and burned. OSX was far closer to a Linux box and a Mac box on the same desk, therefore the only change really needed is to replace mentions of Unix or specifically Linux with Linux/OSX as far as the points of the piece are concerned. If Jobs had paid Torvalds to call OSX "Apple Linux" (Or maybe just called it Apple Berkeley Unix) for some reason this would be moot.

I also primarily use Windows and don't have a dog in the fight you mentioned. I might actually dislike Linux more than OSX, though it has been quite a while since I've seriously used the one-button OS.


> OSX was far closer to a Linux box and a Mac box on the same desk

Setting aside the "more BSD/Mach than Linux", OS X pressed a lot of the same buttons that BeOS did: a GUI system that let you drop to a Unix CLI (in Be's case, Posix rather than Unix, if we're going to be persnickety), but whose GUI was sufficiently complete that users rarely, if ever, had to use the CLI to get things done. Folks who love the CLI (hi, 99% of HN!) find that attitude baffling and shocking, I'm sure, but a lot of people really don't love noodling with text-based UIs. I have friends who've used the Mac for decades -- and I don't mean just use it for email and web browsing, but use it for serious work that generates the bulk of their income (art, desktop publishing, graphic design, music, A/V editing, etc.) -- who almost never open the Terminal app.

> though it has been quite a while since I've seriously used the one-button OS

Given that OS X has supported multi-button mice since 2001, I certainly believe that. :)


> Given that OS X has supported multi-button mice since 2001, I certainly believe that. :)

And since MacOS 8 before that...


>Given that OS X has supported multi-button mice since 2001, I certainly believe that.

Just a joke, mate.


macOS shares zero lineage with Linux, which itself shares zero lineage with the UNIX derivatives. It would make zero sense for Apple to call macOS "Apple Linux" when it doesn't use the Linux kernel. Mac OS X is closest to a NeXTStep box with a coat of Mac-like polish on top. Even calling it "Apple Berkeley Unix" wouldn't make sense, because the XNU kernel is a mishmash of both BSD 4.3 and the Mach kernel.

Linux and the UNIX derivatives are not even cousins. Not related. Not even the same species. They just both look like crabs, a la https://en.wikipedia.org/wiki/Carcinisation.


I used to have a coworker, a senior dev of decades of experience, who insisted that MacOS was a real Linux, "just like BSD". Sigh.

Of course, this belief probably had no downsides or negative consequences, other than hurting my brain, which they probably did not regard as a significant problem.


The crabs analogy isn't a good one, because they evolve independently to play well in a common environment. GNU/Linux is a rewrite of Unix that avoids the licensing and hardware baggage that kept Unix out of reach of non-enterprise users in the 80s and early 90s.


Your statement seems very strong. Is Mac OS X not based on Darwin? Are you defining "lineage" in some way to only mean licensing and exclude the shared ideas (a kernel manipulating files)? Thanks for the "carcinisation" link.


To be more precise: this simplified history suggests a shared "lineage" back to the original UNIXes of all of BSDs (and hence Darwin and OSX) and of the GNU/Linux and other OSes https://unix.stackexchange.com/a/3202

So, "zero shared lineage" seems like a very strong statement.


>OS X is closest to a NeXTStep

This is already in the piece. Why waste time repeating it?


Because the person they replied to used a term like "Apple Linux", which is ridiculous. So somebody on the internet was wrong and needed to be corrected.


> somebody on the internet was wrong and needed to be corrected

https://xkcd.com/386/


I used it, and on purpose. If you missed why, you should go back and read again before replying.


Apple was in some ways on the right track with an OS that had no command line. Not having a command line meant you had to get serious about how to control things.

The trouble with the original MacOS was that the underlying OS was a cram job to fit into 128 KB, plus a ROM. It didn't even have a CPU dispatcher, let alone memory protection. So it scaled up badly. That was supposed to be fixed in MacOS 8, "Copland", which actually made it out to some developers. But Copland was killed so that Apple could hire Steve Jobs, for which Apple had to bail out the NeXT failure.


Yes, I, too, have read the OP before.


Linux is not just another word for Unix or Unix-like and Mac OS X/macOS has never used nor shipped with the Linux kernel.


Sure, but what does "rewarding security", as the author suggests, look like in a way that is genuinely meaningful? The direct metric would be a low number of security holes or bugs in the product, but then you run straight into the problem that many holes/bugs are not found until much, much later, if ever. Perhaps code review failed to notice it, perhaps QA didn't cover that case, perhaps security scanning tools missed it, perhaps no black- or white-hat hacker ever bothered to try to break it, etc. Without a meaningful metric, what will likely happen is that people get rewarded for some kind of security theater.


Then go the opposite route. South Korea fines companies thousands of dollars every day a vulnerability isn't fixed. Security is one of those areas where negative reinforcement works better than positive reinforcement.


Mind providing a source? (Tried to Google it but didn’t find any relevant info.)

I can think of multiple situations where a vendor from SK has left things unpatched for months, and sometimes years.


Sure, I'd be fine with that, but that's going to have knock-on effects on developers, because they're the ones writing the code and therefore the vulnerabilities/bugs. Software engineering would turn into something like civil or aerospace engineering or medicine, where practitioners are required to be certified in various ways, where they or their employers carry liability insurance for the bugs they write, and where they endure onerous processes/audits that their employers and insurers demand of them to reduce the risk of bugs. That I'm fine with too, since there's so much crap code being churned out, but most software developers probably wouldn't be.


"Thousands of dollars every day" does not a negative reinforcement make. That is not even a rounding error for even mid-sized companies.


Then use 1% of revenue or $2K per day, whichever is greater.


So after 4 months, the company would lose more than their entire revenue?
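Back-of-the-envelope (treating "1% of revenue" as 1% of annual revenue, charged per day unfixed):

```shell
# 1% of annual revenue per day, over roughly four months (120 days)
awk 'BEGIN { rate = 0.01; days = 120; printf "%.0f%% of annual revenue\n", rate * days * 100 }'
```

So yes: a vulnerability left open for about four months would cost more than a year's revenue, which is presumably the point.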


Why not?

A $20k car can do far more than $200k in damage.

We don’t limit liability to the price of the vehicle.


The equivalent would be a $20k Ford resulting in a $1,762,000k fine.


Yeap, should deter building vulnerability riddled solutions.


Treat fixing a security issue or implementing a security component the same as implementing a feature for the purposes of raises and promotions.


Ouch, now that really stings to wake up to. Eva aside, Nadia, Gunbuster, and FLCL all stand out among the great anime of the '90s-'00s. Even lesser-known series, like KareKano, which Anno had a hand in, really have a vivaciousness to them that just isn't that common. RIP Gainax.


It stings a little, but most of the Good People have gotten around to other studios anyway. As I understand it, Gainax was really just an IP holding company for the most part in recent times.


>Even lesser known series, like KareKano

Ahh, the memories... I'll never forget their style of running out of funds mid-series and then switching to the anime version of PowerPoint-style transitions to tell the remainder of the story! :D

And who can forget Mahoromatic, with the quintessential other anime quirk: the budget runs out, so the important people just die and the credits roll with sad music... the end. :D

I discovered in early high school, after investing considerable time into a few series only to be burned by bad endings, that I should try something other than anime. It's enjoyable, but man, I can't trust these guys. They suck you in and then leave you hanging too often.

At least Evangelion was good, I guess. Guess that's what happens when you have more continual income coming in for a franchise?

What's old is new again: Amazon, and to a lesser extent Netflix, are now mass-producing content with this trap, but at least we can make our voices heard. Back then, in the late '90s and early '00s, we could not as easily complain to the company.


Honestly, I'm super excited for this. An important detail of note is that Studio Khara will be responsible for selling off and distributing the IP to interested parties. Realistically, most of these properties will go to Khara itself, Trigger, and studios like SHAFT that worked closely with Gainax in the early days.

So if anything this is how many of those beloved projects will get out from under Gainax's death grip and get something new other than a pachinko machine.


Well, their biggest success is pretty much done to death now. NGE has had a series, multiple OVAs, multiple movies, multiple revamp movies... there's really not much else to get out of it unless you want to tell the story of how they got into that mess to begin with, which might be the most interesting part of all of this to its core fanbase given their age. Maybe tell the story of the project up until Shinji's mom dies or something.


There's all kinds of fun things that could be done with the franchise. An isekai where an Otaku gets reborn as Shinji and tries to do it right would be great.


I mean, the plan with Eva, according to Anno, is to do something like what Gundam does.

i.e. you have your main timeline (Universal Century), which focuses on different groups on opposing sides of the conflict during different periods.

But then you have your many other offshoot series (00, IBO, GWitch, SEED, Turn A, etc.) that are free of the main established canon while keeping the same core elements of the series.


They announced that, but all the major IPs they had were already transferred to Khara and Trigger a couple of years ago (that's why the TTGL movies reappeared in cinemas a couple of years ago). It's likely just some of the older / less popular stuff, which isn't easy to assign owners to, that will need to be redistributed.


Don't forget Samsung's fabs in South Korea.

Wikipedia maintains a list of semiconductor fabs worldwide: https://en.wikipedia.org/wiki/List_of_semiconductor_fabricat....


I'd suggest "The Mechanical Universe", an excellent series of video lectures developed at Caltech that covers undergraduate physics.

https://www.youtube.com/playlist?list=PL8_xPU5epJddRABXqJ5h5...

It was created in the '80s and broadcast on PBS and it still holds up well today.


Rapid reading and comprehension¹ and exposure to a breadth of vocabulary and ideas are the professional equivalent of having superpowers. So many developers have crappy writing skills and weak reading comprehension, and have to struggle harder to develop their careers because of it.

¹ The barren wasteland that is online discourse doesn't count for obvious reasons.


I'm not so negative about online discourse. Participating in online arguments, and writing tons of comments has immensely improved my writing skills and my ability to translate my ideas into text in a clear and concise manner.

Sure, it's not the peak of literary achievement, but hey, you gotta walk before you can run.


I also like to say that years of rules-lawyering in D&D and MTG have given me a leg up in contractual-language debates. It's kind of fun, even.


Solid article. My one quibble is that it should mention that, when the choice is made to use an untried technology, it needs to be acknowledged and communicated that this carries some level of risk and an increased chance of not meeting the project's goals.


Don't commercially available, software-controlled USB hubs exist already? The Yepkit YKUSH 3 comes to mind immediately (https://www.yepkit.com/product/300110/YKUSH3) and is the only device I've seen that explicitly says it disconnects both the data lines and the power lines when switching. A casual search also finds https://www.usbgear.com/managed-hubs.html
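As an aside, on Linux many per-port-power-switching hubs can be driven with the open-source `uhubctl` tool (its README maintains a compatibility list). A minimal sketch, written as a dry run so nothing is actually executed; the hub location "1-1" and port 2 are placeholder values:

```python
import shlex

def uhubctl_cmd(hub_location: str, port: int, action: str = "cycle") -> list[str]:
    """Build a uhubctl invocation: -l selects the hub by its location
    string, -p the port number, -a the action (on/off/cycle/toggle)."""
    return ["uhubctl", "-l", hub_location, "-p", str(port), "-a", action]

# Dry run: print the command instead of running it. On a machine with a
# supported hub attached, pass the list to subprocess.run() instead.
print(shlex.join(uhubctl_cmd("1-1", 2)))  # uhubctl -l 1-1 -p 2 -a cycle
```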


Sad news indeed. Descent, FreeSpace 1+2, Red Faction, and the Saints Row games are all fond memories of mine.

