There is something nostalgic about these old Google documents. It reminds me of when search ads were on the right side.
Google is famous for only supporting a few general languages. Initially it was just C++, Python, Java, and JavaScript (browser only!), although a few more eventually trickled in. The language deviants kept writing "official" style guides and adding support to random infrastructure and libraries, hoping that one day the next major project might get written in OCaml. New Googlers often had a confusing moment after learning the restrictive language policy when they compiled their first program and blaze would print that it was generating protocol buffer bindings for Haskell (this would eventually be suppressed).
Haskell is another oddball. It was for a hardware project that used the to-be-auctioned analog TV RF spectrum for a new WAN standard.
It was canceled when it became clear the spectrum wasn't going to be available. The Haskell rules then stuck around for another ~7 years before I deleted them as part of the cleanup for the Bazel release. They then promptly reemerged as one of the first language rules to be expressed in Bazel's newfangled Starlark language (then still called Skylark).
> Google is famous for only supporting a few general languages. Initially it was just C++, Python, Java, and JavaScript
I think most of the big companies are like this. It tracks my experience at least.
At the scale that these companies operate at, it's a good thing. Even if you restrict language choice, you tend to end up with half a dozen or more languages in common use at your company, each of which you have to support with developer environments, CI/CD configurations, security & supply chain code vetting, etc.
One of the largest Borg jobs? Really? This goes hard against my expectations, both based on my understanding of the software involved in flight search, and in terms of my knowledge of things that run on borg and how big they get. Feel free to reach out internally, I'd love to chat about it!
Remarkable to me that Google would spend so much on software engineers and then hobble them with blub languages. If they decided to push a niche but superior language then people would learn it (as they did with Go).
Google's main line of business back in the day involved crawling and indexing as much of the entire web as possible. When you're operating at that kind of scale, back in the 2000s-early 2010s your choices were C++, and pretty much nothing else. Garbage-collected languages introduced too much overhead. Keeping the number of languages small, and keeping them purpose-focused (C++ for the heavy stuff, Java/Python for web backends, JavaScript for UIs) helped unify coding standards and conventions within and across teams, reining in the chaos that would have ensued had Google allowed developers or teams to choose languages willy-nilly.
These sorts of language restrictions are not a knock against the skill of Google engineers. A good programmer can be good in any language. Even C++. Even Java. Think of the midwit meme, with the simpleton at the left going "just write everything in C++ or Java", the midwit in the middle going "NOOOOOO! Lambda calculus/monads/Hindley-Milner type systems!" and the genius/Jedi monk at the right going "just write everything in C++ or Java".
The appeal of Lisp when writing production code is to lazy, mediocre programmers who think Lisp's sexy features will extend the reach of what they can create. Which is true, to a certain extent, but a) not as much as they may think; and b) when you consider that Lisp programs are bloated and slow compared to the corresponding C++ programs, the juice just isn't worth the squeeze.
Lisp peaked ~60 years ago. There are very sound technical reasons why it is obscure today.
I think this is the perfect opinion to have for a middle manager at a large FAANG with seemingly infinite developer supply and little existential incentive to innovate.
In practice, C++ and Python codebases have insane amounts of churn. They're constantly moving targets and require a lot of overhead to keep in shape, from just about every perspective: performance, security, maintainability, implementation compatibility, etc. It's tolerated though because it's standard practice, and we developers get a paycheck anyway.
Languages like OCaml, Common Lisp, Haskell, and other niche but obviously practical languages continue to be indispensable to certain cohorts of companies and developers. While I do agree that Lisp specifically peaked in popularity decades ago, it still exists in the commercial domain with resounding staying power, even in environments where it could in theory be replaced by C++ or Python.
It's also abundantly clear there's a hunger for new (or at least replacement) languages for the stalwarts. Rust, Julia, Mojo, Swift, and OCaml are a few such languages attracting serious commercial investment and development, often with the specific purpose of replacing Python or C++.
As a personal anecdote, I find professional Common Lisp programmers generally to be quite sensitive to matters of bloat. Some go so far as to chisel away needless cons cells (or allocation in general) and to stay constantly in touch with the compiled assembly—something frequently ignored by professional C++ programmers of comparable experience.
I would love to hear the "very sound technical reasons" explaining Common Lisp's obscurity. Given the vast and diverse biome of programming languages in use today, including and especially in professional settings, I'm quite interested in understanding what technical reasons uniquely relegate Lisp to obscurity.
My take (not OP) is that it's not so much the technical features of Lisp itself, but the fact that other languages evolved competing features. Over the years, they have slowly chipped away at Lisp's advantages. 50 years ago, Lisp was uniquely powerful with no rival. Today, Lisp is yet-another powerful language to choose from.
I disagree: Common Lisp retains many powerful features, and it has them all at once, where another language may have caught up on just one. The interactive debugger is unmatched, image-based development barely exists elsewhere, the condition system… Example: fix a bug without quitting the debugger, re-compile the fixed function with one keystroke (instant feedback), go back to the debugger, choose a stack frame to resume execution from, and watch execution proceed. You didn't re-run everything from zero.
SBCL gives you pretty good type inference at a keystroke (not complete, but useful; for a Haskell on top of CL, see Coalton), it's fast, there are multiple implementations, it's stable…
> As a personal anecdote, I find professional Common Lisp programmers generally to be quite sensitive to matters of bloat. Some go so far as to chisel away needless cons cells (or allocation in general) and to stay constantly in touch with the compiled assembly—something frequently ignored by professional C++ programmers of comparable experience.
C++ and Rust programmers usually don't have to go down into the assembly. The language itself provides fine-grained control over when and how allocations take place, and how memory is laid out. It's a doddle to establish efficient allocation patterns and lay out memory to take advantage of locality of reference so you don't blow up your data cache. Lisp, not so much; in addition to the pointer-heavy semantics of Lisp objects you have the inherent nondeterminism of a GC. So not surprising that when a Lisp program stutters or hangs, the programmer has to go down to assembly level to determine the memory usage pattern that's causing the slowdown. With C++ or Rust, the memory usage patterns are (usually) obvious from the code itself.
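To make the layout point concrete, here's a minimal C++ sketch (the types and names are illustrative, not from any real codebase). The hot data lives in one contiguous buffer, so the access pattern is obvious from the code and friendly to the data cache:

```cpp
#include <cstddef>
#include <vector>

// Array-of-structs in a single contiguous buffer: iterating walks
// memory sequentially, so the hardware prefetcher keeps the data
// cache warm. The only heap allocation is the vector's backing buffer.
struct Point {
    double x;
    double y;
};

double sum_x(const std::vector<Point>& pts) {
    double total = 0.0;
    for (const Point& p : pts)  // fixed stride of sizeof(Point) bytes
        total += p.x;
    return total;
}
```

The equivalent idiomatic Lisp list of conses scatters its nodes across the heap, so each step of the traversal can be a cache miss, and the GC is free to move things around underneath you.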
> I would love to hear the "very sound technical reasons" explaining Common Lisp's obscurity.
* Lisp is dynamically typed. Yes, I know Common Lisp allows for type annotations, but the type system is just barely enough to enable speed hacks in the compiler; it is neither sound nor robust when compared to, say, fully parametric type systems (now commonplace in blub languages like C# and TypeScript). The lesson of the past couple decades or so is that strong, static typing is an unmitigated win, especially when building large systems; to choose dynamic typing in 2024 is to forgo early checking of all sorts of bugs.
* GC. You will incur a performance and memory consumption penalty using a GC'd language, and in this day and age, with RAII and ARC, those are penalties you need not incur in order to safely, conveniently use memory within deterministic space and time constraints. With languages like Rust we can get the same kinds of advantages with static lifetimes that we do with static types. There is no need to defer object lifetime management to runtime; and to do so just wastes electricity and contributes to climate change.
* The syntax. Programmers are not just annoyed by all the parens. Today, they're used to being able to get a list of all the methods valid on a given object by typing '.' after the object's name. You need a certain kind of syntax to do that, plus static typing, and Lisp eschews both.
* Lispers wax nostalgic for the Lisp machines, which were pretty groundbreaking in their day... but as the sibling to my comment points out, the tooling for other languages has since caught up with, if not surpassed, the best Lisp development environments. Microsoft has had edit-and-continue in its IDEs for decades; I believe Eclipse got something similar later. Modern refactoring tools. Project Lombok. There has been a lot of research done into how to improve the developer experience in practical languages. Meanwhile, Lispers are still recommending frickin' Emacs to beginners.
* Plus, the library support for practical languages like Java, JavaScript, Python, or C# -- or even C++ -- is just so much better than what's available for Common Lisp.
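The RAII point in the second bullet can be sketched in a few lines of C++ (illustrative types, not from any particular codebase). Ownership is encoded in the types, and deallocation happens at a statically known point rather than whenever a collector gets around to it:

```cpp
#include <cstddef>
#include <memory>
#include <string>
#include <vector>

// RAII: each object frees its resources in its destructor, which runs
// at a statically known point (scope exit), not whenever a GC decides.
struct Node {
    std::string name;
    std::vector<int> data;  // owns its buffer; freed with the Node
};

// The returned unique_ptr is the sole owner; when the caller's handle
// goes out of scope, the Node and everything it owns are freed, with
// no runtime lifetime tracking involved.
std::unique_ptr<Node> make_node(std::string name, std::size_t n) {
    auto node = std::make_unique<Node>();
    node->name = std::move(name);
    node->data.assign(n, 0);
    return node;  // ownership transfers by move, no copy
}
```

Nothing here pauses the world: the lifetime of every allocation is visible in the source.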
> You will incur a performance and memory consumption penalty using a GC'd language, and in this day and age, with RAII and ARC, those are penalties you need not incur in order to safely, conveniently use memory within deterministic space and time constraints.
A distinguishing characteristic of languages like Lisp or Elixir is the metaprogramming. C++ and Java have nothing like their macro systems, and when an application grows and turns complex it's a very nice tool for keeping developer-facing code neat and readable. Usually it also keeps the code more robust under change: there are fewer lines where typos or logic bugs can creep in when you write less boilerplate and use well-designed code generation instead.
Then why are virtually all browser engines, high-performance games, and video codecs written in C++? Why is every massive-scale web application written in C++? The closest thing written in Lisp is QPX, and even then they had to write a lot of C++ because Lisp's I/O facilities just weren't up to the task of moving the huge amounts of data they had to move.
> C++ and Java have nothing like their macro systems
C++ has template metaprogramming, which covers like 90% of the cases where you'd actually need Lisp macros, and outputs code that's statically typechecked with deterministic memory and runtime constraints.
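For instance (a toy sketch, and not a claim of full parity with defmacro): a template is expanded at compile time into a separately typechecked function per instantiation, and misuse is rejected before the program ever runs.

```cpp
#include <type_traits>

// The compiler stamps out one fully typechecked function per T at
// compile time; static_assert rejects bad instantiations with a
// readable message before runtime.
template <typename T>
constexpr T square(T x) {
    static_assert(std::is_arithmetic_v<T>, "square requires a number");
    return x * x;
}

// Both of these are evaluated entirely at compile time:
static_assert(square(7) == 49);
static_assert(square(2.5) == 6.25);
```

Instantiating `square` with a non-numeric type fails to compile with that message, whereas an untyped macro expansion would typically surface the error later and less legibly.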
> Then why are virtually all browser engines, high-performance games, and video codecs written in C++?
Path dependence. There are only two lineages of widely used browsers today: Chromium/Webkit (coming from Konqueror, initial release 2000) and Firefox/Gecko (coming from Netscape, initial release 1994). Nobody's rewriting them.
The same goes for video game engines and other high-performance software.
The C++ bits in QPX were, as far as I know, relatively tiny pieces for setting up memory-mapped files, done that way because there wasn't a good native interface in CMUCL.
The contents of that data file were generated by a Lisp batch job, too.
Kinda. Language-specific expertise should not be downplayed.
If you've spent a decade honing your craft writing C and assembly for exotic embedded platforms, you'll be a valuable asset for certain projects, but your experience won't translate to Scala or Haskell.
I think I meant more like "every language can have good programmers writing top quality code in it, even (especially) the blubs", not that a given good programmer will be able to immediately translate their expertise to a vastly different language or environment.
Consider an expert in C programming for embedded platforms, and an expert in Ada (or Forth) programming for embedded platforms. The surrounding expertise is similar, but the language-specific expertise shouldn't be dismissed.