
C was created at a time when instructions were executed sequentially with no vectorization, memory was a flat space with no CPU caches, and there was no branch predictor that might or might not execute the correct program branch in advance. The list goes on, but the rest is beyond my scope.

C was designed for a now obsolete computer architecture model and over the years this old model has essentially become an abstraction that sits between C and the CPU. As such, C programmers aren’t really programming in their CPU’s domain anymore and C, by default, lacks the commands necessary to effectively utilize these new developments. It is left up to the compiler to translate C code from the old architecture abstraction into efficient machine code for our new machines.

For a more in-depth look into this topic, I recommend checking out (0).

(0) - https://queue.acm.org/detail.cfm?id=3212479



Thanks for the link. Excellent article. Those are all great points and I like having them summarized in one place, because I am indeed a little behind in my modern architecture theory.

However, it has always been acknowledged that C was by definition a sort of simplified computing model. For example, when I first learned C, the 8086 architecture was popular, but it was competing with many others and was already dramatically different from the PDP-11 virtual machine you describe. The 286, 386 and so on had funky indexing modes and address space weirdness, but so did just about every other processor of the time.

There is likely never to be a single unified architecture that everyone agrees on, and the developers of C understood this, certainly by the time the 1989 standard was hammered out. So compiler directives, pragmas, and maybe even language extensions were expected on a per-CPU basis, no?


That article is nonsensical. Seymour Cray and colleagues and Tomasulo and colleagues invented ILP, branch prediction, etc. in the 1960s, before C was thought of. The PDP-11 is much more similar to modern x86s than to the weird architectures of the 1970s.


C originated as an evolution of BCPL, with B as a stopgap, as a means to rewrite UNIX.

Thing is, BCPL's main goal was to be an interim solution for bootstrapping CPL, nothing more than that.

https://en.wikipedia.org/wiki/CPL_(programming_language)

Unfortunately, UNIX's success meant we got stuck with something that shouldn't have been more than a portable macro assembler.


> now obsolete computer architecture model

I think this is kind of dismissive. You seem to assume that everyone is programming modern x86 machines. What about embedded? My little PIC32/STM32/ATmega parts do not have caches or branch predictors, and they have a flat memory space.

Thank god there is the C standard, which even today, after more than 30 years, gives compilers a clear set of rules for emitting code for the thousands of architectures used in the dozens of embedded devices in our houses, offices, and industries. And not only that: it lets them squeeze the maximum performance out of these microcontrollers, for a language that is not plain assembler.


Ah, I didn’t mean to come off as dismissive. My bad. When I wrote my comment I only had consumer hardware in mind. However, I realize that C isn’t limited to modern consumer hardware either.


The Chisnall article is a tutorial in incorrect computer architecture. PDP-11s had caches by the 1970s and always had memory management. ILP was invented in the 1960s and has nothing to do with C. Branch predictors were invented in the 1960s too, and one of the first machines C was ported to was the IBM 370, which had super sophisticated ILP. Etc.


I was unaware. Thank you for the correction! Do you have any further reading on this topic?


I am sorry to say, I do not have a good short reference. There has to be one, though, I hope. The Hennessy and Patterson books are the standard intros.








