
See "Why Pascal Is Not My Favorite Programming Language" (https://www.cs.virginia.edu/~evans/cs655/readings/bwk-on-pas...). It's written by Brian Kernighan, but it's not a hit piece. It's also not a comparison to C.

Kernighan (and P. J. Plauger) had written a book called "Software Tools". It was supposed to give several reasonably-lengthy examples of software that did actually useful things. It was written in RATFOR, which is a pre-processor for FORTRAN. Some time later, they rewrote the book to use Pascal, calling it (predictably enough) "Software Tools In Pascal". After writing it, Kernighan wrote this paper, basically because he was thinking "That should have been way easier than writing the same stuff in RATFOR. Why was it so hard?"

I used Pascal for two years professionally, and I ran into many of the issues in the paper. Pascal was just clumsy to use. It was a good teaching language, but in many cases not good for professional programmers. (C, on the other hand, was written by people trying to write an operating system, and turned out to be a decent language for writing operating systems in.)

Note well: All of this is true of the original Pascal. Things like Turbo Pascal improved it and made it actually a usable language. But even that wasn't portable - there wasn't a Turbo Pascal for anything other than the IBM PC, so far as I recall. And every other "improved" version was different from Turbo Pascal, so there was no portability between extensions either.



> there wasn't a Turbo Pascal for anything other than the IBM PC, so far as I recall

There was a Z80 version of Turbo Pascal that ran on CP/M machines (incidentally, one thing that's striking about the first several years of BYTE is how many huge Cromemco ads there are) as well as the Apple II with a Z80 card. That, along with x86 support, covered a lot of ground.


Let's just ignore the C dialects outside UNIX like Small-C and RatC, or that we had to wait until 1990 for a proper standard, and that not even K&R C was a given outside UNIX.


At one point in the 1980s I counted 30 C compilers available for the IBM PC. Programming on the PC dominated programming in the 80s; hardly anyone had access to Unix machines. Probably 90% of C programming was done on the PC.

The 1980s C++ compilers on the PC also dominated the C++ compiler use. C++ on the PC vaulted the language from obscurity into the major language it is today.


It probably depends on what time in the 80s as well; TFA is from 1983.

A while back I chatted with someone who worked on both C and Pascal compilers around that time period and got the impression that the majority of their customers were people running on 68k based Unix workstations. May have just been their niche I suppose.

I didn't start programming until closer to 1990, and started with Mix software's C compiler on a 286, because that's what I could afford.


I also used Mix C. I think it only sold for about $20 (plus $20 for the debugger?). It also came with an electronic tutorial ($10 more?) called "Master C" that I found very useful.


I had Thomas Plum's "learning to program in C" which was excellent


The money in compilers in those days was on the PC.


Not in Europe it wasn't; over here it was all about QuickBasic, Turbo Pascal and TASM.

And if we go into the Amiga it was about Assembly and AMOS mostly.

On Apple, it was Object Pascal, Assembly, and HyperCard; MPW with C++ came into the picture later.

One thing I do agree on: by the time Windows and OS/2 were taking off, C++ on the PC was everywhere, and only masochists would insist on using C instead of C++ frameworks like OWL, MFC or CSet++.


Half of my C and C++ compiler sales were in Europe. The Mac did indeed lag far behind - Apple bet on the wrong horse (Pascal).


Europe has many countries; I can assure you that I only saw Zortech in magazines after it was acquired by Symantec and was shipping MFC alongside it.

Sadly I never saw it on sale anywhere, as the graphical debugging of C++ data structures was quite cool to read about.


Britain and Germany saw the most sales. For reasons inexplicable to me, France bought very few compilers.


France had Minitel, so didn't need computers.


I learned C on the Amiga, starting around 1989. That was Lattice C (later SAS C.) Eventually I moved on to Linux (SLS!) which, of course, had GCC.


Oh yes, SLS Linux, always installed from a big stack of floppies. Had almost forgotten! ("Soft Landing Systems".)

Then Slackware came out on CD!


The delay in publishing a "proper" standard was due to the incredible success/usefulness of the de facto K&R standard. But as you point out, that was hard to find outside of Unix. I suspect this was mostly due to the effort required to implement the full standard library and/or resource limitations on many systems.

For example, there was a Small-C compiler available for the Atari 800 in 1982:

http://www.atarimania.com/utility-atari-400-800-xl-xe-c-65_1...

"... based on the Small C compiler published in Dr. Dobb's Journal"

If you look in the beginning of the manual it has a list of what is and is not supported. They claim it is sufficient to compile C/65 itself but there are lots of things we take for granted missing.


My first contact with C was RatC, via "A book on C", with its implementation as appendix.

https://books.google.com/books/about/A_Book_on_C.html?id=e5p...

So this revisionism about how great C "portability" was is kind of ironic, when in reality C was full of dialects outside UNIX, just like the competition.


Nobody needed to use dialect extensions. All C compilers supported everything you needed, and all with the same syntax.

Except "near" and "far" pointers, which were an absolute plague.


Since when has inline assembly stopped being a C language extension?



