If it had had even 32 kilobytes of fast mem (it had 2 megs of chip RAM, the same as the original PlayStation), it would have made a huge difference for games.
Back in the day, people didn't even know the Amiga 1200 could output VGA resolutions. Also, you needed a special adaptor cable, which I only ever saw home-made versions of. If it had had a VGA connector, people would have known they could run productivity software on a regular monitor. Having a TV for gaming and a monitor for word processing would have been fine. There's precedent with Atari supporting both regular (TV-frequency) monitors and the monochrome monitor for productivity.
(IIRC you had to press both mouse buttons at boot, or something like that, to get VGA frequencies. All of this could have been made automatic if a VGA monitor was plugged in, or something along those lines. The BOM cost would have been almost zero. No strategic thinking at all at Commodore leadership, no understanding of the market.)
32 k of fastmem probably wouldn't have made much difference at all. A single routine or two could maybe have run faster, but their data would still have had to sit mostly in chipmem. Plus, you'd need to add space and traces for it on the motherboard, and once you've done that, 32 k seems a bit pointless; you might as well have added a SIMM socket, too. The doubled CPU speed and increased video bandwidth compared to the A500 was already a major step up for the type of Amiga games that were popular when the A1200 was designed.
Whether or not people knew of the scandoubled modes back in the day, well, my friends and I certainly did, and all the A1200 reviews I've read mention them. Having both a TV and a VGA monitor sort of defeats the purpose of a cheap, compact entry-level machine. Atari users typically had _either_, not _both_: the monochrome monitor was for the DTP and music studio markets.
There was no mode switching in the early startup menu, apart from being able to toggle between NTSC and PAL. Selection of the AGA scandoubled modes was done in the Workbench screenmode preferences. Adding some kind of auto-sensing hardware would have added to the cost of the machine and required a rewrite of Workbench to cope with it without interfering with the screenmode preferences (and what if you indeed had both a TV and a VGA monitor hooked up at the same time?).
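For what it's worth, here's a minimal sketch of how a program picks one of those scandoubled modes under AmigaOS 3.x: mode selection goes through graphics.library's display database, which is exactly why any auto-detection would have needed OS support rather than just a hardware tweak. The 640x480, 16-colour and DblPAL choices below are purely illustrative, and it assumes a compiler setup that auto-opens the library bases.

    /* Sketch only: pick an AGA scandoubled mode and open a screen on it.
     * Assumes AmigaOS 3.x (graphics.library V39+) and auto-opened
     * library bases (SAS/C or libnix style). */
    #include <proto/dos.h>
    #include <proto/graphics.h>
    #include <proto/intuition.h>
    #include <graphics/displayinfo.h>
    #include <graphics/modeid.h>
    #include <intuition/screens.h>
    #include <utility/tagitem.h>
    #include <stdio.h>

    int main(void)
    {
        ULONG modeID;
        struct Screen *screen;

        /* Ask the display database for the closest mode on the DblPAL
         * monitor -- one of the 31 kHz scandoubled monitors a VGA-class
         * display can sync to. Width/height/depth are illustrative. */
        modeID = BestModeID(BIDTAG_NominalWidth,  640,
                            BIDTAG_NominalHeight, 480,
                            BIDTAG_Depth,         4,
                            BIDTAG_MonitorID,     DBLPAL_MONITOR_ID,
                            TAG_DONE);

        if (modeID == INVALID_ID) {
            printf("No suitable scandoubled mode in the display database.\n");
            return 10;
        }

        /* Open a screen on that mode ID. */
        screen = OpenScreenTags(NULL,
                                SA_DisplayID, modeID,
                                SA_Depth,     4,
                                TAG_DONE);
        if (screen != NULL) {
            Delay(250);      /* keep it up for ~5 seconds (50 ticks/s) */
            CloseScreen(screen);
        }
        return 0;
    }

The ScreenMode prefs editor essentially does the same dance on behalf of the Workbench screen, which is why an auto-sensing scheme would have had to hook into that preferences machinery rather than just flip a bit in hardware.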
In hindsight, I think the A1200 was a decent solution to a hard problem: constructing a cheap, worthwhile upgrade while remaining compatible with as much existing software and hardware as possible.
Back in the day, people didn't even know the Amiga 1200 could output VGA resolutions. Also, you needed a special adaptor cable, which I only ever saw home-made versions of.
The Commodore monitors released at the time for use with the A1200/A4000, the 1940 and 1942, had a permanently attached 15-pin VGA cable. I know because I have one for my A1200.
These monitors were dual-sync 15 and 31 kHz devices, and were perfectly usable as VGA monitors with a PC. Probably not a lot of PC users bought them, though.
But if you bought the 1940 for use with the Amiga 1200, you couldn't plug it into the computer without a 15->23 pin adapter. I'm sure it came with the adapter in the box, but still...