
At the Vintage Computer Festival last week, Bill Mensch mentioned to the audience that no one ever hears about the 65C02 and 65C816’s use in defibrillators and pacemakers - life-critical applications - unless he tells them!

Does anyone know of good write ups or explanations of what makes the 6502 so reliable and what competition it had in being chosen for medical applications?




Simpler is an advantage in that world: if you can understand the functioning of your device down to the cycle level, then you have a much better chance of delivering something that will work reliably.


One of the things that I loved about the Apple ][ was that it was possible for one person to completely understand everything about that computer from the hardware to the software. I've never had that level of complete understanding of any system I've used since.


Yep, similar experience here. My first computer was a Tandy / Radio Shack Color Computer. It had a 6809 processor (an 8/16-bit precursor to the 68000) at 1.8MHz, 4k of RAM (upgradable to 64k), and 16k or 24k of ROM holding a quite expansive Microsoft Extended BASIC interpreter (supposedly the last ROM OS & BASIC that had assembly written by BillG himself).

I taught myself BASIC, assembler, graphics programming and game programming on that machine over about four years of hacking on it (including hand-commenting some significant chunks of the ROM). By the time I retired it for a shiny new Amiga 1000 in 1986, I'd upgraded it to 256k of bank-switched RAM with a soldered-in hack board, added four floppy drives and various I/O boards, learned OS-9 (a UNIX-inspired multi-tasking, multi-user OS), and hacked in my own extensions to the ROM OS (including new commands and graphics modes for the BASIC interpreter).

It started out as a lot of trial and error but, on later reflection, ended up being a surprisingly thorough grounding in computer science from which to launch my career. That 6809 machine was also the last time I really felt like I was aware of everything happening in a computer from interrupts to registers to memory mapping down to the metal.


Yes, that was the beauty of the 8-bit era, and many people lost it without even knowing that they lost something very precious. That total control is a very nice feeling.


I'm not sure why "simple, understandable system design" would have to be synonymous with 8-bit computing. One of the most appealing things about new open hardware initiatives is how they bring this simplicity and surveyability into what's otherwise a very modern design and context.


8-bit can't afford much abstraction, so it's simple/understandable by necessity


Seems every time someone applies that to hardware with a wider compute path, other complexity creeps in.

It would be interesting to make a 32-bit Apple 2 style computer: include a ROM as a means to boot, and leave everything else simple, with some nice slots. Could be a great development / learning machine.
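
To sketch what I mean in C (every address below is invented for illustration), the whole "architecture manual" for such a machine could fit on a page:

    /* Hypothetical memory map for a simple 32-bit machine;
       all addresses are made up for illustration. */
    #define ROM_BASE   0x00000000u   /* boot ROM, mapped at reset */
    #define RAM_BASE   0x10000000u   /* main RAM */
    #define IO_BASE    0x20000000u   /* memory-mapped peripherals */
    #define SLOT_BASE  0x30000000u   /* expansion slot windows */
    #define SLOT_SIZE  0x00100000u   /* 1 MiB of address space per slot */

    /* Base address of expansion slot n, Apple 2 style. */
    static inline unsigned long slot_base(unsigned n) {
        return SLOT_BASE + (unsigned long)n * SLOT_SIZE;
    }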


I've "built" such machines in FPGA; PicoRV32 core, hand made display processor, a bit of RAM, and a couple UARTs. It was fun and not that hard for me, a newbie to FPGA.

One of the bigger challenges is integrating peripherals. I got bogged down trying to do SD card interfacing. There are off-the-shelf bits of IP from Xilinx etc. that you can use for this, but that sort of defeats the purpose of the exercise.

I think modern machines started their slide into mind-boggling complexity when bus speed and CPU speed outstripped RAM speed. So much complexity and unpredictability is in all the infrastructure built around caches.

Something like an Amiga or Atari ST was still not hard to understand all or most of, despite being 16/32-bit.


Those kinds of things can be done with the CPU, Apple 2 style: map some I/O to a few addresses and service the SD card with a small routine, as sketched below.
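
A rough sketch of that in C, bit-banging SPI to the card through one made-up I/O register (chip select, init, and the SD command protocol are left out; assume reads of the register return the current pin states):

    #include <stdint.h>

    /* One hypothetical memory-mapped I/O register with the SPI
       lines on its low bits. */
    #define SPI_PORT (*(volatile uint8_t *)0xC0F0u)
    #define SPI_CLK  0x1u   /* clock out to the card */
    #define SPI_MOSI 0x2u   /* data out to the card */
    #define SPI_MISO 0x4u   /* data in from the card (read-only) */

    /* Clock one byte out and one byte in, MSB first (SPI mode 0). */
    static uint8_t spi_xfer(uint8_t out) {
        uint8_t in = 0;
        for (int i = 7; i >= 0; i--) {
            uint8_t mosi = ((out >> i) & 1u) ? SPI_MOSI : 0u;
            SPI_PORT = mosi;             /* clock low, present data bit */
            SPI_PORT = mosi | SPI_CLK;   /* clock high, card samples */
            in = (uint8_t)((in << 1) | ((SPI_PORT & SPI_MISO) ? 1 : 0));
        }
        SPI_PORT = 0;                    /* leave the clock low */
        return in;
    }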


A new machine inspired by the C64, nearing production release.

https://www.commanderx16.com/forum/index.php?/about-faq/



Yes, but perhaps aimed more at retro computing.

I have a couple of those. The HYDRA was a lot of fun.


Because once the clock speed gets past a handful of MHz, maintaining good clock distribution and good signal integrity becomes more painful very, very rapidly.


Are there any good sources of documentation for working with Arm chips? Like, a simple single board (or breadboard) computer?


It's not clear what you're asking for here. The Raspberry Pi Pico and Raspberry Pi Zero are existing and very well-documented ARM-based single-board computers.


Fair enough. I suppose what I'm asking is whether it's possible to purchase a more modern CPU independently of an SoC and design my own single-board computer around it. Can you buy a naked Cortex-M0 in a PDIP package? Are there data sheets for it?

A search of the internet didn't turn up what I was looking for, but I'm very new at hardware work. Perhaps newer chips have such tight timing requirements that you can't work with them outside an SoC?


Nearly everything is going to be an SoC, because that's what commercial applications need to minimize part count and cost; there's negligible demand for a standalone processor.

If you're just looking to breadboard up a computer but don't want to go back to 8-bit processors, the Motorola MC68000 used in the original Apple Macintosh (or its 8-bit-bus variant, the MC68008) is a 32-bit processor in a DIP package running at a manageably low frequency, and can be found on eBay inexpensively.


> Can you buy a naked Cortex-M0 in a PDIP package?

No. The busses used to access peripherals and memory are not suitable for off-die use. (This goes for all ARM cores, not just Cortex-M0.)


You may like Project Oberon [1], designed by Niklaus Wirth [2], then. His guiding principle was to make a powerful but simple system that could be understood from top to bottom by a single person, from the RISC hardware to the OS to the compiler used to build it.

It's quite a bit above the Apple ][ in terms of power.

[1] http://www.projectoberon.com

[2] https://en.wikipedia.org/wiki/Niklaus_Wirth


And there is a ton of already-developed software ready to go. At the very least, all the internal code would be very mature at this point.


I don't have any documentation, but I would imagine that since these chips have been in existence for so long, their behaviour is extremely well understood, including most, if not all, of their weak points. The workarounds for those weak points should also be well known.


The nice thing about the 6502 is that it has been completely reverse-engineered down to the transistor level, so it's possible to explore exactly what's going on in the chip on each clock cycle, even though the original design documents have been lost:

http://visual6502.org/JSSim/index.html

(and shameless plug, my own 'remix' with better performance and more features: https://floooh.github.io/visual6502remix/)


I wonder if that makes finding replacements easier too, since you can comfortably find (or even make) new ones.


Blank black window for the remix. What should I be seeing?

[Edit: It's WebGL. I don't have WebGL in QubesOS :(.]


Though I'd think they would use a microcontroller with a 6502 core, integrating ROM/RAM/GPIO/peripherals into a single chip. Here is one such microcontroller:

https://pdf1.alldatasheet.com/datasheet-pdf/view/103795/ETC/...


I just read an article recently about how 6502-based chips are used inside of satellite receiver boxes.

I wonder if it's just a function of the time. I imagine anything designed new today would use an ARM-based microcontroller, but when many of these systems were originally designed, those were much less common and more expensive.


I'd expect that there are still a lot of new designs where an 8-bit microcontroller such as an AVR makes more sense than something ARM-based.


It's getting harder and harder to find places where this is true.

The ARM Cortex-M0/M0+ blows AVR out of the water, and is usually cheaper except against the very lowest-end AVR parts. It will generally use less power, too. And that's assuming your unit counts are so high that firmware developer time is effectively free.

Of course, it's getting impossible to find 5V VCC ARM parts, so that's something that would steer you towards AVR if your system really is a bunch simpler with a 5V micro.


This is not strictly true. Many AVR chips can handle more computation than their clock speed would suggest, thanks to some really nice assembly instructions for common DSP calculations.

I ported an AVR code base to a Cortex-M4 last year, and some of the inline asm didn't translate. I ended up having to use plain C instead. So my 120MHz M4 chip struggled to do what a 90MHz AVR did no problem.
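
Roughly, the plain-C replacement for that kind of inline-asm kernel looks like the sketch below (names and shapes invented); whether the compiler gets it back to the hand-scheduled version is hit or miss:

    #include <stdint.h>

    /* Portable C stand-in for a hand-tuned multiply-accumulate
       (FIR-style) kernel that used to be inline asm. */
    static int32_t fir_mac(const int16_t *samples,
                           const int16_t *coeffs, int taps) {
        int32_t acc = 0;
        for (int i = 0; i < taps; i++)
            acc += (int32_t)samples[i] * coeffs[i];  /* 16x16 -> 32 MAC */
        return acc;
    }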


Note that it seems you're talking about AVR32, then, and not the 8-bit AVRs we were discussing.

AVR32 was neat, but has lost all commercial relevance.


Oh, yeah, good call. I guess I would be surprised to see significant DSP-related asm in any 8-bit processor.


You can get an ATtiny to sleep at 6uA with a watchdog running, and at ~120nA with only an external interrupt. Can an M0 match that? Genuinely curious; I don't have much experience with ARM.
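
For reference, the watchdog-wakeup pattern is only a few lines with avr-libc; this is just a sketch (register names vary by part, and to actually hit those numbers you also want the ADC and brown-out detector off):

    #include <avr/interrupt.h>
    #include <avr/sleep.h>
    #include <avr/wdt.h>

    /* Empty handler: the watchdog interrupt only exists to wake us. */
    ISR(WDT_vect) { }

    static void sleep_until_watchdog(void) {
        wdt_enable(WDTO_1S);    /* ~1 s watchdog period */
        WDTCSR |= _BV(WDIE);    /* interrupt, not reset (register is
                                   WDTCR on some ATtiny parts) */
        set_sleep_mode(SLEEP_MODE_PWR_DOWN);
        sleep_enable();
        sei();
        sleep_cpu();            /* power-down until the WDT fires */
        sleep_disable();
    }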


AVR is actually rather expensive, relatively speaking; I believe it's only popular due to Arduino. Even the various PICs will be cheaper, and of course there are still a lot of (very fast) 8051 variants, as well as 4-bit MCUs at the ultra-low-cost level (<$0.01).


I remember reading that space probes, including the Mars rovers, use what amounts to a PowerPC G3, but in a radiation-hardened version.



