I remember how we were fuming at the A4000 opting for IDE over SCSI, at the other ways it was hampered, and at how ugly it was. The A4000 was the PC-ification of the Amiga without the upsides - things like SCSI helped offset the anaemic higher-end M68k CPUs. An A3000 + faster CPU + built-in flicker fixer + AGA would've been vastly superior. Basically Dave Haynie's A3000+ prototype.



And an MMU... and CPU daughter cards on the Zorro bus with local RAM for things like "distributed" Lightwave rendering. It would have made performance better for the rather price-insensitive pro segment.

Commodore basically catered to no market segment.

Edit: the A1200 could have been better (twice the speed for calculations) if it had just a tiny sliver of fast RAM instead of only chip RAM. And so on. It should also have had a VGA connector; it wouldn't have cost more than a few cents. People used TVs because they couldn't find multisync monitors. Commodore, a fractal of bad business decisions.


The A1200 lacking fastmem out of the box does seem like a strange decision, but in this case I believe Commodore actually listened to game developers. Having more chipmem instead meant that games could effortlessly load more graphics, music and sampled audio.

As for the VGA connector, there was a cheap Amiga RGB->VGA adapter available. Connectivity wasn't the problem. The issue was that VGA monitors couldn't cope with a 15 kHz PAL/NTSC signal. Many didn't work with the 50 Hz PAL refresh, either. In order for a VGA connector to be meaningful, hardware would've been needed to address this, adding to the cost of the machine - and ruining the 50 Hz sync of a massive, pre-existing games library.
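The rough arithmetic, using the standard PAL and VGA timings (nothing specific to this machine):

    PAL:  625 lines/frame x 25 frames/s = 15,625 lines/s  -> ~15.6 kHz horizontal, 50 Hz fields
    VGA:  640x480 @ 60 Hz                                 -> ~31.5 kHz horizontal, 60 Hz vertical

A fixed-frequency VGA monitor scans roughly twice as fast as the Amiga's native output, so without a scan doubler (or a multisync monitor) it simply can't lock onto the signal.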


If it had even 32 kilobytes of fast mem (it had 2 megs of chip, same as the original PlayStation), it would have made a huge difference for games.

Back in the day, people didn't even know the Amiga 1200 could output VGA resolutions. Also, you needed a special adaptor cable, which I only saw home-made versions of. If it had a VGA connector, people would have known they could run productivity software on a regular monitor. Having a TV for gaming and a monitor for word processing would have been fine. There's precedent: Atari supported both regular monitors (TV-like frequencies) and the monochrome monitor for productivity.

(IIRC you had to press both mouse buttons at boot or something like that to get VGA frequencies. All of this could have been made automatic if a VGA monitor was plugged in, or something along those lines. The BOM cost would have been almost zero. No strategic thinking at all from Commodore leadership, no understanding of the market.)


32 k of fastmem probably wouldn't have made much difference at all. A single routine or two could maybe run faster, but data for them would still have to be mostly in chipmem. Plus, you'd need to add space and traces for it on the motherboard. Once you've done that, 32 k seems a bit pointless. Then you might as well have added a SIMM socket, too. The initial doubled speed and increased video bandwidth compared to the A500 was already a major step up for the type of Amiga games that were popular when the A1200 was designed.
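For the curious, the chip/fast distinction shows up directly in how a program asks the OS for memory. A minimal sketch in plain AmigaOS C (the sizes are invented, but AllocMem() and the MEMF_ flags are the real exec.library interface):

    #include <exec/memory.h>
    #include <proto/exec.h>

    int main(void)
    {
        /* Bitplanes, sprites and samples are read by the custom chips,
           so they must sit in chip RAM. */
        UBYTE *gfx = AllocMem(40 * 256 * 5, MEMF_CHIP | MEMF_CLEAR);

        /* CPU-only data (game logic, tables) could live in fast RAM,
           where the 68020 doesn't compete with chipset DMA for the bus. */
        UBYTE *tables = AllocMem(32 * 1024, MEMF_FAST | MEMF_CLEAR);
        if (!tables)
            tables = AllocMem(32 * 1024, MEMF_CLEAR); /* fall back to any RAM */

        /* ... game runs here ... */

        if (tables) FreeMem(tables, 32 * 1024);
        if (gfx)    FreeMem(gfx, 40 * 256 * 5);
        return 0;
    }

On a stock A1200 the MEMF_FAST request simply fails and everything lands in chip RAM; with only 32 k of fast RAM it wouldn't fail much less often, which is the point above.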

Whether or not people knew of the scandoubled modes back in the day, well, my friends and I certainly did, and all the A1200 reviews I've read mention them. Having both a TV and a VGA monitor sort of defeats the purpose of a cheap, compact entry-level machine. Atari users typically had _either_, not _both_: the monochrome monitor was for the DTP and music studio markets.

There was no mode switching in the early startup menu, apart from being able to toggle between NTSC and PAL. Selection of the AGA scandoubled modes was made in the Workbench preferences. Adding some kind of auto-sensing hardware would have added to the cost of the machine and required a rewrite of Workbench to cope with this without interfering with screenmode preferences (and what if you indeed had both a TV and a VGA monitor hooked up at the same time?).

In hindsight, I think the A1200 was a decent solution to a hard problem: constructing a cheap, worthwhile upgrade while remaining compatible with as much existing software and hardware as possible.


> Back in the day, people didn't even know the Amiga 1200 could output VGA resolutions. Also, you needed a special adaptor cable, which I only saw home-made versions of.

The Commodore monitors released at the time for use with the A1200/A4000, the 1940 and 1942, had a permanently attached 15-pin VGA cable. I know because I have one for my A1200.

These monitors were dual-sync 15/31 kHz devices and were perfectly usable as VGA monitors with a PC. Probably not a lot of PC users bought them.

But if you bought the 1940 for use with the Amiga 1200, you couldn't plug it into the computer without a 15->23 pin adapter. I'm sure it came with the adapter in the box, but still...


Have you read Brian Bagnall's series of books on Commodore? It really digs into the full universe of badness that was Commodore management in excruciating detail.


No, but thanks for the tip - I'll get ahold of them.


I recall reading that Commodore had a glut of PC cases and tooling they wanted to use up, hence the weird case for the A4000 with the mouse ports cut out inconveniently on the side.



