GCC honestly has only itself to blame for its increasing obsolescence. LLVM only has the attention it has because it can be used as a building block in other compilers. GCC could've made a tool and library which accepts IR, performs optimizations and emits machine code, but the project avoided that for ideological reasons, and as a result created a void in the ecosystem for a project like LLVM.
I'd add code quality as a reason. I find it much easier to understand and modify code in LLVM compared to GCC. Both have a fairly steep learning curve and not too much documentation, but often I (personally) find LLVM's architecture to be more thought out and easier to understand. GCC's age shows in the code base, and it feels like many concepts and optimizations are just bolted on without the architectural changes required for proper integration.
Embedded compiler vendors and UNIXes want a possibly slightly patched C or C++ compiler, maybe with an extra back-end bolted on. I'm talking about use-cases like Rust and Zig and Swift, projects which want a solid optimizing back-end but their own front-end and tooling.
And they do! You can choose not to contribute your changes back to permissively-licensed software, but in actual practice most people do contribute them. It's not like the Rust compiler is proprietary software with their own closed-source fork of LLVM...
> You can choose not to contribute your changes back to permissively-licensed software, but in actual practice most people do contribute them.
They contribute some things, sure. But they also don't contribute some things. It is hard to know how much, because it's kept secret from all of us, even their own customers.
> "It's not like the Rust compiler is proprietary software with their own closed-source fork of LLVM..."
Rust, no, but there are a lot of semi-incompatible proprietary forks out there.
I haven't advocated for re-licensing GCC to be permissively licensed. And patching GCC is necessarily going to be much easier for vendors than to build a new C front-end which generates GCC IR. So I'm not sure what difference you think what I'm proposing would make with regard to your concerns.
gccrs is a Rust implementation for GCC. Just because Rust developers don't want their users to be fully free doesn't mean there are any problems with gcc. And clang is developed by Apple which is a huge warning sign by itself.
LLVM/Clang is evolving more quickly than GCC and is a much richer base for innovation. LLVM spawned Rust, Swift, and Zig. The most recent GCC languages are SPARK and COBOL.
One of the reasons that LLVM has been able to evolve so quickly is because of all the corporate contribution it gets.
GCC users want Clang/LLVM users to know how dumb they are for taking advantage of all the voluntary corporate investment in Clang/LLVM because, if you just used GCC instead, corporate contributions would be involuntary.
The GPL teaches us that we are not really free unless we have taken choice away from the developers and contributors who provide the code we use. This is the “fifth freedom”.
The “four freedoms” that the Free Software Foundation talks about are all provided by MIT and BSD. Those only represent “partial freedom”.
Only the GPL makes you “fully free” by providing the “fifth freedom”: freedom to claim ownership over code other people will write in the future.
Sure, the “other people” are less free. But that is the price that needs to be paid for our freedom. Proper freedom (“full freedom”) is always rooted in the subjugation of others.
Some slaves were treated well, fed nicely, and overall had a rather good life. Yet they remained slaves, so their master could take all of that away from them at any moment.
But hey, if you try hard to be nice to your master and do not demand anything, for sure they will always treat you well!
And gccrs is not really very widely used and is not the "official" Rust compiler. The Rust project chose to base their compiler on LLVM, for good technical reasons. That's bad news for the GCC project.
I thought that most FOSS projects took multiple compiler implementations as a sign of a healthy language environment, without much prestige associated with being the "premier" compiler and instead having more of an it-takes-a-village attitude. Granted, I'm mostly extrapolating from Go and Python here - is it a sharper divide in the Rust community?
No, and I think the gccrs project is great and a sign that the language is maturing. I'm just saying that LLVM was chosen instead of GCC to build the official Rust compiler for very good technical reasons, and that ought to worry GCC.
Rust developers don't "not want their users to be fully free", they disagree with you on which license is best. Don't deliberately phrase people's motivations in an uncharitable way, it's obnoxious as hell.
Apple is not a UNIX vendor. They checked off enough of a compliance list to pass in the legal and marketing sense, but in any real practical sense it's not usable as a UNIX system. It's not designed to serve anything. Most of it is locked down and proprietary. No headless operation, no native package manager, root is neutered... and so on. It's not UNIX.
Are you from 2001? Sorry, but Apple is non-existent in the server market. RH and Canonical make tons of money thanks to being THE platform of the ubiquitous internet. Support is not free.
Are you calling Linux a UNIX? I mean GNU/Linux is certainly a UNIX-like system, but aren't "UNIXes" typically used to refer to operating systems which stem from AT&T's UNIX, such as the BSDs and Solaris and HP-UX and the like? GNU's Not UNIX after all
GNU promotes Unix, but also promotes Emacs on top. NT can run Win32 on top of it, but there's far more than Win32 with NT systems. Just get ReactOS and open the NT object explorer under explorer.exe.
Win32 is an API, while UNIX, NT and GNU are all operating systems made by AT&T, Microsoft and the GNU Project respectively. I agree that the GNU operating system isn't the UNIX operating system, and I agree that the NT operating system isn't the Win32 API, but I have no idea what comparison you're trying to draw.
It almost sounds like you think UNIX is an API like Win32, and that GNU is an operating system which "implements UNIX" like NT is an operating system which "implements Win32"? Are you confusing UNIX with POSIX?
It's not like GCC or the embedded toolchains are a shining beacon of ISO compliance... and if you mean video game consoles, are any of them using GCC today? Sony and Nintendo are both LLVM and Microsoft is Microsoft
I would not say that GCC is more focused on C. Also, in GCC a lot more effort goes into C++ than into C. C is generally neglected, which is sad given its importance.
GCC's support for C23 is essentially complete. Clang is mostly catching up, but the features I need that are still missing are storage-class specifiers in compound literals and tag compatibility. It is also sad that Clang does not implement strict aliasing correctly (it applies C++'s rules in C too).
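For concreteness, the two missing features look roughly like this (my own sketch from memory; the paper numbers and exact standard wording should be double-checked):

    /* C23 tag compatibility (N3037): an identical redeclaration of a tag
       in the same scope now names a compatible type. */
    struct point { int x, y; };
    struct point { int x, y; }; /* OK in C23, a redefinition error in C17 */

    /* C23 storage-class specifiers in compound literals (N3038): */
    const int *lookup(void) {
        /* "static" gives the literal static storage duration, so it is
           safe to return a pointer to it from the function. */
        return (static const int[]){1, 2, 3};
    }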
With LLVM they don't need to fork it in the first place. But still, it doesn't matter because ISO compliance is a frontend problem.
The one vendor who forks LLVM and doesn't contribute their biggest patches back is Apple, and if you want bleeding edge or compliance you're not using Apple Clang at all.
If you say "isn't it great vendor toolchains have to contribute back to upstream?" I'm going to say "no, it sucks that vendor toolchains have to exist"
Forcing vendors to contribute back upstream is an attempt at making vendor toolchains not exist.
If a company makes a new MCU with some exciting new instruction set, they need to produce a compiler which supports that instruction set and make it available to their customers.
With LLVM as the base, the vendor could make their toolchain proprietary, making it impossible to integrate it back into LLVM, which means the vendor toolchain will exist until the ISA gets widespread enough for volunteers to invest the time required to make a separate production-quality LLVM back-end from scratch.
With GCC as the base, the vendor must at least make their GCC fork available to their customers under the GPL. This, in theory, allows the GCC community to "just" integrate the back-end developed by the vendor into GCC rather than starting from scratch.
Now I don't know how effective this is, or how much it happens in practice that the GCC project integrates back-ends from vendor toolchains. But in principle, it's great that vendors have to make their toolchains FOSS because it reduces the need for vendor toolchains.
gcc requires that contributors assign copyright to the FSF (or at least "certify the Developer Certificate of Origin")[0], so a fork isn't gonna get upstreamed without the approval of the fork's authors. That limits the benefit of gcc forks to the individuals keeping the fork alive.
Dude you can't just quote a fraction of a sentence and then argue against that fragment. Read the whole sentence. The part about how "the vendor could make their toolchain proprietary" is kinda important.
So are you aware how many contributions have actually been upstreamed by Sony, Nintendo, IBM, ARM, Green Hills, Codegear, TI, Microchip, Codeplay, NVidia, AMD, Intel, HP, ...?
The problem is that "weird CPUs and embedded systems" are not mainstream platforms, nor do they have a lot of money behind them (particularly for open source development). Hence, there is little motivation and/or resources for anyone to develop a new backend for LLVM when a mature GCC backend already exists. Moreover, the LLVM developers are wary of accepting new backends for niche platforms when there is no guarantee that they will be maintained in the future.
Exactly - often the GCC branch is barely maintained by whoever is selling the CPU/chip, and it is patches against a specific version that you must use, or it goes haywire.
At least with GCC they can sometimes merge the patches.
Hm, is it so much easier to write an LLVM back-end than a GCC back-end? I haven't looked into GCC's code gen infrastructure that much, but I looked into making a new back-end to LLVM some time ago and it seemed extremely complex and entirely undocumented. All guidance I found on it basically amounted to, "copy an existing back-end and bang it against a rock until you've made it generate the machine code you need". And it certainly wasn't a case of having clean and simple enough code to make documentation unnecessary. It would honestly surprise me if GCC is more difficult to add a back-end to.
I don't have much experience with writing a custom gcc backend, but my experience with LLVM is that its model ends up being a somewhat poor fit for smaller or weirder architectures. For example, LLVM has this annoying tendency to really aggressively promote the size of everything--if you have a 32-bit core with 64-bit memory indexing, LLVM tends to make anything that eventually becomes a memory index a 64-bit computation, even if it would be more efficient to keep everything as 32-bit.
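A small illustration of what I mean, assuming a target with 64-bit pointers but 32-bit registers (a sketch, since the exact behaviour depends on the target's data layout):

    #include <stdint.h>

    /* LLVM's induction-variable widening and GEP canonicalization tend to
       promote the 32-bit index below to pointer width (64 bits), which is
       a pessimization on a core whose registers are only 32 bits wide. */
    uint64_t sum(const uint32_t *a, uint32_t n) {
        uint64_t s = 0;
        for (uint32_t i = 0; i < n; i++)
            s += a[i]; /* the index feeds a 64-bit address computation */
        return s;
    }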
GCC is still needed for Linux distributions using glibc or any other library with many "GCCisms". Also, I'm not sure whether or not Clang is ABI compatible enough for enterprise customers with some rather extreme backwards compatibility requirements. Still, I can imagine a future where glibc can be built with Clang, possibly even one where llvm-libc is "good enough" for many users.
I do remember reading about LTO not working properly, you're either unable to link the kernel with LTO, or get a buggy binary which crashes at runtime. Doesn't look like much effort has been put into solving it, maybe it's just too large a task.
libgccjit is, despite its name, just another front-end for GIMPLE. The JIT part is realized by compiling the generated object file into a shared library and loading it with dlopen.
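For anyone who hasn't seen it, this is roughly what the API looks like; it's essentially the "square" example from the libgccjit tutorial, reproduced from memory, so check the docs for exact signatures:

    #include <libgccjit.h>
    #include <stdio.h>

    int main(void) {
        /* Build a context describing the code to compile. */
        gcc_jit_context *ctxt = gcc_jit_context_acquire();
        gcc_jit_type *int_type =
            gcc_jit_context_get_type(ctxt, GCC_JIT_TYPE_INT);

        /* Equivalent of: int square(int i) { return i * i; } */
        gcc_jit_param *param_i =
            gcc_jit_context_new_param(ctxt, NULL, int_type, "i");
        gcc_jit_function *fn = gcc_jit_context_new_function(
            ctxt, NULL, GCC_JIT_FUNCTION_EXPORTED, int_type,
            "square", 1, &param_i, 0);
        gcc_jit_block *block = gcc_jit_function_new_block(fn, NULL);
        gcc_jit_block_end_with_return(
            block, NULL,
            gcc_jit_context_new_binary_op(
                ctxt, NULL, GCC_JIT_BINARY_OP_MULT, int_type,
                gcc_jit_param_as_rvalue(param_i),
                gcc_jit_param_as_rvalue(param_i)));

        /* "JIT" compile: behind the scenes this goes through GIMPLE, an
           object file, and a dlopen'd shared library. */
        gcc_jit_result *result = gcc_jit_context_compile(ctxt);
        int (*square)(int) =
            (int (*)(int))gcc_jit_result_get_code(result, "square");
        printf("%d\n", square(5)); /* prints 25 */

        gcc_jit_result_release(result);
        gcc_jit_context_release(ctxt);
        return 0;
    }

(Link with -lgccjit.)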
One big problem with libgccjit, besides its fairly bad compile-time performance, is that it's GPL-licensed and thereby makes the entire application GPL, which makes it impossible to use not just in proprietary settings but also in cases where incompatible licenses are involved.
Yet another reason why I'm not a fan of Richard Stallman.
Most of the decisions he made over the past 25 years have been self-defeating and led directly to the decline of his own movement's influence. It's not that "the GCC project" avoided that for ideological reasons: Stallman personally vetoed it for years, and his objection led to several people quitting the project for LLVM, with a couple saying as much directly to him.
I think that's an unreasonable lens for viewing his work. Of course he values purity over practicality. That's his entire platform. His decision making process always prioritizes supporting Free Software over proprietary efforts, pragmatism be damned.
Expecting Stallman to make life easier for commercial vendors is like expecting PETA to recommend a good foie gras farm. That's not what they do.
Alternatively, if you join a Stallman-led Free Software project and hope he'll accept your ideas for making life easier for proprietary vendors, you're gonna have a bad time. I mean, the GNU Emacs FAQ for MS Windows (https://www.gnu.org/software/emacs/manual/html_mono/efaq-w32...) says:
> It is not our goal to “help Windows users” by making text editing on Windows more convenient. We aim to replace proprietary software, not to enhance it. So why support GNU Emacs on Windows?
> We hope that the experience of using GNU Emacs on Windows will give programmers a taste of freedom, and that this will later inspire them to move to a free operating system such as GNU/Linux. That is the main valid reason to support free applications on nonfree operating systems.
RMS has been exceedingly clear about his views for decades. At this point it's hard to be surprised that he’ll make a pro-Free Software decision every time, without fail. That doesn't mean you have to agree with his decisions, of course! But to be shocked or disappointed by them is a sign of not understanding his platform.
But they're only bad/counterproductive when assessed against their own goals. They're a lot more reasonable if you assess them against Stallman's, where you could say "yeah, these actions kept GCC from becoming a handy tool for proprietary interests who don't want to share their work back with us".
On the subject of software freedom, I hate how accurately prophetic he is.
RMS: Here’s how they'll get ya!
Me: Nice, but that'd never happen.
Vendor: Here’s how we got ya!
Me: Dammit.
Seriously, he must have a working crystal ball.
Now, my agreement with him starts and ends on that subject. He says plenty of other things I wholly disagree with. But his warnings about proprietary software lock-in? Every. Single. Time.
I mean, I think he is right. If everyone stopped supporting Windows, it would have long died out in favor of easy-to-install Linux distributions... probably.
As an old grey dog around the prairie: had Microsoft actually been serious about the POSIX subsystem on Windows NT/2000, instead of treating it as marketing material and a low-level effort, GNU/Linux adoption would never have taken off, at least not at a level that would have mattered.
With OS X on one side, and a serious POSIX subsystem on the Windows NT/2000 side, everyone would be doing their UNIX-like workflows without once thinking to try out GNU/Linux.
At my university we only cared about Linux in its early days, the Slackware 2.0 days, because naturally we couldn't have DG/UX at home, and NT's POSIX support was really unusable beyond toy examples.
> His decision making process always prioritizes supporting Free Software over proprietary efforts, pragmatism be damned.
No, this is giving him too much credit. His stance on gcc wasn't just purity over pragmatism, it was antithetical to Free Software. The entire point of Free Software is to let users modify the software to make it more useful to them, there is no point to Free Software if that freedom doesn't exist - I might as well use proprietary software then, it makes no difference.
Stallman fought tooth and nail to make gcc harder for the end user to modify; he directly opposed letting users make their tools better for them. He claims it was for the greater good, but in practice he was undermining the whole reason for free software to exist. And for what? It was all for nothing anyway, proprietary software hasn't relied on the compiler as the lynchpin of its strategy for decades.
Don't forget that the LLVM folks actually went to GNU and offered it to them, and they failed to pay attention and respond. (It's not even that they responded negatively; they just dropped it on the floor completely.) There's an alternate history where LLVM was a GNU project.
With the benefit of hindsight, I'm glad that that didn't happen, even though I have mixed feelings about LLVM being permissively licensed.
Gcc is the flagship of the GNU Project - allowing an end run around the spirit of the GPL in gcc was never going to happen. The project paid more attention than you give them credit for, because allowing closed-source plugins and improvements that use gcc as a frontend is anathema to Free Software.
There's an impedance mismatch between people who think gcc should have maximized user utility and the actual GNU philosophy. The actions of the gcc project make a lot of sense if you consider that the FSF/GNU are monomaniacal about maximizing user freedom, not popularity, momentum, or any other ego-stroking metric.
Whether you care about number of users or not, there's value in considering whether what you're doing is actually advancing the cause of Free Software or not.
GCC today has a very interesting license term, the GCC Runtime Library Exception, that makes the use of runtime libraries like libgcc free if and only if you use an entirely Free Software toolchain to compile your code; otherwise, you're subject to the terms of the GPL on libgcc and similar. That is a sensible pragmatic term, and if they'd come up with that term many years ago, they could have shipped libgccjit and other ways to plug into GCC years ago, and the programming language renaissance that arose due to LLVM might have been built atop GCC instead.
That would have been a net win for user freedoms. Instead, because they were so afraid of someone using an intermediate representation to work around GCC's license, and didn't do anything to solve that problem, LLVM is now the primary infrastructure people build new languages around, and GCC lost a huge amount of its relevance, and people now have less software freedom as a result.
> If people are seriously in favor of LLVM being a long-term part of GCC, I personally believe that the LLVM community would agree to assign the copyright of LLVM itself to the FSF and we can work through these details.
Without GNU, GPL and Richard Stallman, FOSS would not exist in the first place. The GPL forced companies to make FOSS a thing, whether they liked it or not.
I disagree. They got software that’s open under a different license that otherwise would have just been purely proprietary, or not created at all (due to the compounding effects of open source)
> Without GNU, GPL and Richard Stallman, FOSS would not exist in the first place. The GPL forced companies to make FOSS a thing, whether they liked it or not.
I was responding to this, which is a claim about much more than GCC (although GCC was one of the first wins of the GNU project).
There were various companies who wanted to add on backends and other bits to GCC, but wouldn’t due to the license. That’s one of the reasons LLVM is so popular.
When VSCode et al. begin shipping DRM and who knows what in their extensions, then we'll see what happens with these half-shareware, semi-libre projects.
Especially when proprietary dependencies kill thousands of projects at once.
As with any other achievement of civilization, younger generations will at some point find out why previous ones fought for something, and how much it sucks to lose it.
But when this realization comes, it will be too late.
I doubt that. People often say "without <person that started thing> we wouldn't have <thing>!" but it's nonsense. Someone else would have done it just slightly later.
And Free Software does not benefit from a morass of mutually-incompatible copyleft licenses that may as well have been proprietary since you can't use them together.
None of the permissive licenses have this problem.
The number of significant software projects which are licensed as GPLv2-only, and which are therefore incompatible with GPLv3, can probably be counted on one hand. Normal GPLv2 projects are licensed – as instructed by the text in the license itself – as “GPLv2 or later”. Which means that any normal GPLv2-licensed software can be relicensed to GPLv3, and can therefore be combined with any other GPLv3-licensed program without any issue whatsoever.
For GPL v2 only, let's start the list with Linux and Git...
The "or later" has been used in creative ways, like relicencing all the Wikipedia content, or the Affero to AGPL transition. Nothing shady, but unexpected.
Do you trust RMS to avoid doing shady things in the later GPL licence? I do, but he is no longer in the FSF.
Do you trust the current members of the FSF to avoid doing shady things in the later GPL licence? I don't know them.
Do you trust the future members of the FSF to avoid doing shady things in the later GPL licence???
Section 14 of the GPL says "The Free Software Foundation may publish revised and/or new versions of the GNU General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns." Given that the preamble to the GPL explicitly says "the GNU General Public License is intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users", I don't think that a judge would find that a hypothetical GPLv4 that was basically MIT or something is "similar in spirit" to the present version.
If you're worried about the other direction (i.e. a hypothetical GPLv4 that had some bizarre restriction like "all users must donate to the FSF"), the "or any later version" means that as long as you don't decide to update the license you use yourself, people can continue to use it under the GPLv2 or v3 indefinitely.