Good idea! Go has always impressed me with how every single new release works perfectly with zero effort on my part. There is no reason I can think of that someone would want to be on an older version of Go.
I understand the objections here. I upgrade software a lot. Upgrade Emacs, your .emacs fails to load. Upgrade Linux, your system fails to boot. That's because they make backwards incompatible changes and release if the unit tests pass. I don't know what Go is doing, but this simply never happens. Every 6 months, you get a performance improvement and new features for free. For free! It's truly amazing. (While I'm here, I'll also point out that PostgreSQL is similar. Have you ever upgraded and had your app break? Absolutely not. Every version is a free performance and feature win.)
At the risk of being downvoted, I have to say that if you don't believe this, you've been Stockholm Syndrome'd into your programming language of choice. I work primarily in Go, but occasionally have to do frontend stuff. The pattern I run into is that if I use the version of Node in Debian, the code doesn't compile. If I go to nodejs.org and download the latest version, it doesn't compile. Only if I install 20,000 lines of bash that inspects some files and switches 300 environment variables to put the right version of node.js in my path can I even begin to ... run a script that concatenates 300,000 lines of code and translates it to a version of Javascript that Chrome and Firefox both have? Guys, it's crazy. I love a good user interface, but if I was trapped on a desert island with node.js I would instantly kill myself. No questions asked. Go isn't like that. If your users can handle a CLI, you are one "tar xzvf" away from sanity. It's great.
I digressed, but this is a great choice. The Go team always seems to know how to improve the lives of end users. I appreciate that.
On the contrary, that you do believe this means that you don't have long enough experience in golang. The points where suddenly a seemingly innocent version upgrade becomes a massive real-world problem usually happen years or decades into a tool/language/whatever. golang hasn't existed nearly long enough yet.
"simply never happens" is incorrect. "hasn't happened to me yet" is correct.
Or, put another way: Come back to us when you are on version 3.x, years or decades from now, and then a statement that this never happened to you will actually be significant. (-:
Go is over a decade old. I’ve been using it for 8 or 9 years now and entirely concur with GP. As an example, I have routinely deployed to prod for lower stakes stuff using CI Go versions that didn’t match development. Aside from needing new features this has never caused an issue. I’ve had more issues arise from OS updates than Go toolchain updates.
There's more than that. What I really want is to be able to update the compiler without also updating the standard library since the compiler has been fairly solid, while the http, tls, and archive stdlib packages have been a constant source of breaking changes and bugs with each update.
I'd rather be able to update my compiler without also having to cross my fingers that they didn't decide to yet again delete support for some part of TLS I'm using that those on high at google have deemed "insecure", even if it's in a test or over a local network.
Beginning with Go 1.21, the version in go.mod will control behavior in the standard library, so that you can upgrade the compiler/stdlib and not have your programs break: https://tip.golang.org/doc/godebug
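For illustration, a minimal sketch of what that looks like (module path hypothetical): a go.mod that declares an older go version keeps the older standard-library defaults even under a newer toolchain.

    module example.com/legacyapp

    // Built with a Go 1.21+ toolchain, this module still gets the
    // GODEBUG-controlled standard-library behaviors of Go 1.20.
    go 1.20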
Past results are not the whole story. A project "changing toolchains on its own" is a new signal, and it does point to a huge potential future problem.
It's like someone saying "Mike? He never hurt nobody! He's a total sweetheart!" when in the past weeks we've seen Mike start stockpiling guns and getting a "KILL EM ALL!" tattoo on his forehead.
OK, perhaps not as dramatic, but changing toolchains behind the users' backs is a red flag for future trouble, even if past behavior was good.
> Come back to us when you are on version 3.x, years
Someone who has long enough experience in Go would know that the Go compatibility promise[1] only applies to Go 1. Go 2 and later are already allowed to break compatibility. I have code I wrote on something like 1.13, if not earlier, and it still works fine with 1.20. Sure, there are rare exceptions, but each release documents them.
My question to you, then, is: what exactly did you see in your years of Go experience that broke on a version upgrade?
> There is no reason I can think of that someone would want to be on an older version of Go.
New Go toolchains regularly drop support for running the generated binaries on older versions of Windows and macOS. I definitely want to remain in control of this.
> That's because they make backwards incompatible changes and release if the unit tests pass.
> At the risk of being downvoted, I have to say that if you don't believe this, you've been Stockholm Syndrome'd into your programming language of choice.
> but if I was trapped on a desert island with node.js I would instantly kill myself.
Too many bold claims to engage with.
It’s tempting to shoot fish in a barrel but you have to try a little harder. ;)
In general people often understand better and prefer the tool they use often to the tool they occasionally have to use but never thought it worthwhile to study in depth.
> There is no reason I can think of that someone would want to be on an older version of Go.
Go 1.20 is the last release that will run on any release of Windows 7, 8, Server 2008 and Server 2012, or on macOS 10.13 High Sierra and 10.14 Mojave.
In addition to these OSes there are also platforms like Google App Engine. Google is notoriously slow at updating the Go runtime. Until January 17, 2023, the latest Go runtime generally available on Google App Engine was Go 1.16, while in the broader ecosystem Go 1.20 was already scheduled to release two weeks later. That's two whole years behind. It sucks, but it is the reality of how Google operates things. They still haven't added support beyond Go 1.15 to the local development server.
The comment about Node on a desert island made me actually LOL. Thanks for that!
I recently had to update 2 Node (Next.js, static sites) projects from v16 to v18 and it was brutal. Granted, it was a big jump, but it still took over a week to get done. So many dependencies just broke and had to be updated or patched.
Coming from the PHP world, where real effort is also put into maintaining backwards compatibility, the Node experience is really painful. I had sites on version 5.6 that went to version 8 just fine...
I'm not sure, but one of the challenges we had was the changes to NPM package resolution. We had to use the "legacy peer deps" flag (spelled out below) to get certain things to work without upgrading tons of packages.
These were both LTS releases, so you would hope that it would be easy to upgrade, or at least that there would be a reasonable upgrade path.
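For reference, the "legacy peer deps" flag mentioned above is spelled like this on the npm CLI:

    # Fall back to npm 6-style peer dependency resolution,
    # ignoring peer dependency conflicts:
    npm install --legacy-peer-deps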
I run my startup on Go and React. The ease of the two upgrade processes couldn't be further apart.
We have our entire package.json locked down to exact patch versions because we've had issues where a random patch upgrade kills the app, and we run our own npm server because we can't risk dependency attacks in the js ecosystem. Just absolute pain.
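A sketch of what that pinning looks like (package names and versions hypothetical); with no ^ or ~ range prefix, npm installs exactly the listed version:

    {
      "dependencies": {
        "react": "18.2.0",
        "next": "13.4.12"
      }
    }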
Ten years of Go, I can count on one hand the times a dependency upgrade broke the build.
> we run our own npm server because we can't risk dependency attacks in the js ecosystem
What does this mean? Your deps get locked down with a sha1(?) checksum automatically after you install your packages (unless you go out of your way to delete the lock file). Must be a valuable startup you have for someone to attack your build with a hash collision..
Then don't delete your lock-file and the vector doesn't exist?
Rolling your own package management solution because understanding the existing tools is too hard.. That is peak "javascript sucks and I'm going to comment about it" energy lol.
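For what it's worth, modern npm lockfiles record sha512 integrity hashes rather than sha1; a package-lock.json entry looks roughly like this (package chosen arbitrarily, hash elided):

    "node_modules/left-pad": {
      "version": "1.3.0",
      "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
      "integrity": "sha512-..."
    }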
I think this is more of frontend vs backend thing. I have a similar experience to you with our ~5kloc React frontend, while dependency upgrades for our 50kloc backend have been mostly painless.
We have a service that uses the Go HTTP proxy machinery to implement a reverse proxy.
Upgrading to Go 1.16 introduced an unexplained performance drop. We suspect it is due to how it flushes when the content length is not known (typical for streaming). In earlier versions of Go, the call to flush was less frequent and in other versions, flush was called on every write. As a team, we sunk a lot of time trying to debug this and finally changed some of the application architecture/logic to circumvent these overheads.
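If the service is built on net/http/httputil (an assumption; the parent doesn't say), the knob involved is ReverseProxy.FlushInterval; a minimal sketch, with a hypothetical backend address. Note that per the Go 1.16 release notes, responses recognized as streaming, including those with unknown content length, are flushed immediately regardless of this setting, which matches the behavior change described above.

    package main

    import (
        "log"
        "net/http"
        "net/http/httputil"
        "net/url"
        "time"
    )

    func main() {
        backend, err := url.Parse("http://localhost:9000") // hypothetical upstream
        if err != nil {
            log.Fatal(err)
        }
        proxy := httputil.NewSingleHostReverseProxy(backend)
        // FlushInterval: how often buffered response data is flushed to
        // the client. A negative value flushes after every write; zero
        // disables periodic flushing. Since Go 1.16, streaming responses
        // with unknown content length are flushed immediately regardless.
        proxy.FlushInterval = 100 * time.Millisecond
        log.Fatal(http.ListenAndServe(":8080", proxy))
    }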
And this comes at a cost. Having tinkered with Go internals and the compiler, I wouldn't say it's a mess, but there is real confusion; a lot of things (well, maybe some) have broken with the Go way. If the Go team hadn't decided to treat the runtime as a package, eight years would have been enough for Go to make breaking changes.
Some of it is differences in usecase. If it happens on a dev's laptop, that's... okay, it still rubs me the wrong way, but I get it. If it happens that now every CI job reinstalls the compiler, I'm going to be annoyed.
One workplace I had used GCP. There was a Postgres upgrade and one of our services had a major performance degradation, some people worked for days to mitigate the problem.
Sadly I don’t have the version number or any details as I wasn’t working on the issue. It was sometime in mid 2022.
> (While I'm here, I'll also point out that PostgreSQL is similar. Have you ever upgraded and had your app break? Absolutely not. Every version is a free performance and feature win.)
I'm all in on Go and enjoy how the toolchain simplifies things so you can just think about writing code. But I am not a fan of this. I don't want my compiler to execute remote code uninstructed.
I don't have extensive experience with Go, but for the bit that I was maintaining a production Go service, I had a Go version update break things at least once.
The default smells of the same ignorance as the proxy and GOPRIVATE introduction, where after introducing the proxy they forced its use, so private repositories for dependencies no longer worked without fucking with env vars.
It sounds to me like this is downloading the toolchain as though it were any other Go package and compiling it. I'm not sure it's downloading binaries, as the article suggests.
From the Go Toolchains documentation:
"When using GOTOOLCHAIN=auto or GOTOOLCHAIN=<name>+auto, the Go command downloads newer toolchains as needed. These toolchains are packaged as special modules with module path golang.org/toolchain and version v0.0.1-goVERSION.GOOS-GOARCH. Toolchains are downloaded like any other module, meaning that toolchain downloads can be proxied by setting GOPROXY and have their checksums checked by the Go checksum database."
Curious what your concerns are? As of Go 1.21 the toolchain will be fully reproducible [https://github.com/golang/go/issues/57120] and since the binaries will be distributed through the module system it'll be possible to verify that everyone is getting the same binaries. So you can be pretty confident you'll end up with the same bytes as if you downloaded the source and compiled yourself.
In principle, I like the idea of it. Whether or not it's a binary or source to be compiled locally is irrelevant, I suppose.
My concern is that it's not clear to me how this will look on my computer. Currently, I download the source tarball and build with the latest Go toolchain I have previously installed. I then change the path to point to the newly built toolchain.
So, in the future, what is it I'm downloading? How does the toolchain get updated exactly? Where does the new executable reside? What happens to the toolchain that I explicitly built and installed? How does Go 1.21 change my path?
As I say, I like the idea in principle but there are details that I think are important, that don't seem to have been explained anywhere.
The Arch Linux maintainers, on the other hand, already tried and gave up on isolating Go and Rust builds from the network, from what I’ve read. (That was for package dependencies and not the toolchain itself.)
> Not everyone is going to be happy with the idea of the 'go' command downloading and running binaries from the Internet. If this is the case for you, you need to set GOTOOLCHAIN to either 'path', if you're happy to search $PATH for the right toolchain, or 'local', if you want to turn off this behavior entirely and have 'go' just fail.
It's customizable, but it downloads by default. Anyone who cares about controlling what they download from the Internet would prefer opt-in, not opt-out.
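For what it's worth, the opt-out boils down to one environment variable; going by the documented modes (pick one):

    export GOTOOLCHAIN=auto    # default: download and switch toolchains as needed
    export GOTOOLCHAIN=path    # only use toolchains found in $PATH; never download
    export GOTOOLCHAIN=local   # always use the bundled toolchain; fail if it's too old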
The following quotes seem to indicate that “auto” is not the default. At least, not if you haven’t run some command to set it as a default (and I’m not seeing one).
> If the $GOROOT/go.env file is missing or does not set a default, the go command assumes GOTOOLCHAIN=local.
> When GOTOOLCHAIN is set to local, the go command always runs the bundled Go toolchain.
In the intro, however, it does say:
> The default GOTOOLCHAIN setting is auto, which enables the toolchain switching described earlier.
I guess I’m also confused. :/
Note that go.work is only created by the `go work init` command when you intend to use multiple modules in one directory. I’m guessing that doing this is relatively unusual. This could be how a “default” gets set for newer versions (unconfirmed conjecture on my part).
I don’t see any command that creates a `go.env` and therefore might set GOTOOLCHAIN in that file.
The only other thing I can think of is that `go mod init` could set this to “auto” in the `go.mod` of a new project. But, go.env is probably missing in this case.
Seems crazy to me. I want to see an error if the version in my PATH is too old, so I can address it. I don't want my toolchain to do -anything- implicitly, including searching PATH. If my PATH is wrong, I'll fix it.
And downloading is even weirder. What happens if the download fails? Am I presented a random network error, or a Go too old error? Who decides which is even more correct?
Reminds me of the GOPRIVATE bullshit, where since some Go version private repositories outright didn't work without setting env variables, because the Go team decided not to add a fallback when something is not available via their shitty proxy.
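For anyone bitten by this today, the escape hatch is the GOPRIVATE pattern list, which bypasses both the proxy and the checksum database for matching module paths (patterns below are hypothetical):

    export GOPRIVATE=*.corp.example.com,github.com/myorg/*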
Seeing a lot of discourse here talking about how Go releases are so stable and don't break shit. As someone who also loves Go, I mostly agree with these takes, but let's not forget how much of a mess dependency management used to be in the bad old days. I remember Glide, Dep, Godep, etc. I think, given the Go team's attitude towards not breaking backwards compatibility, we will probably see fewer breaking upgrades than we see in frontend-land, for example, but things were not always this nice and stable.
I really don't understand. Maybe I'm stupid. But I don't see the issue.
I work with go daily, but I'm pretty sure our binary is pegged to a specific go version, using go.mod
I don't see in what scenario I would add a dependency and it would unexpectedly upgrade my go toolchain? Isn't my toolchain version pegged by a go.mod file?
This article may raise a valid concern, but it doesn't propose a clear alternative or solution.
I'm not a go truther but there seems to be a lot of pessimism around it (error handling, generics, and now toolchain) and not enough optimism.
The go version line in go.mod doesn't peg the version of the actual compiler, it only pegs the version of the language. It's analogous to flags like --std=c++20 in C++ compilers. If you have code that is, for example, valid go 1.17 and also valid go 1.18, but it behaves differently somehow between the two for reasons that are considered implementation details instead of language changes (for instance, depending on behavior changes in standard libraries), the behavior you get will depend on the version of the go tool that you happen to be running when you build it, not by what's defined in go.mod. That's sort of what's changing in go 1.21 - there's now a "toolchain" directive to specify the precise compiler version that should be used, which is slightly independent of the version of the language semantics that you want to apply.
But of course, it's still not that clear - because the toolchain directive is only respected in the "main" module, and it's also still only a minimum. That means you can't actually guarantee that your module will always be built with exactly that toolchain version - if it's included in someone else's module then it will be built with whatever they happen to be running, and it can always be built with a later version no matter what.
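For concreteness, a sketch of how the two directives sit together in a go.mod (module path hypothetical):

    module example.com/mymodule

    // Language version: the minimum Go version whose semantics this
    // module requires.
    go 1.21.0

    // Compiler release to prefer when this is the main module; ignored
    // when the module is built as a dependency, and still only a minimum.
    toolchain go1.21.3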
I think the part they take issue with that might surprise you is if you decide to start using a dependency, and that dependency declares in its go.mod file (not yours) that it needs a newer version than yours, and that triggers a download and toolchain swap.
I think it's... Iffy. I'd prefer it to fail and for it to be up to me to upgrade, personally.
I think that's only what happens for binary (tool) dependencies that are built on their own, though. For library dependencies that you're only building against, their "toolchain" directive would be ignored, because they're not the main module (https://go.dev/doc/toolchain).
Idk, I just looked at the go.mod docs (I assume go.mod is the current standard but I really can't say for certain; the go ecosystem is more of a cluster than python, imo): https://go.dev/ref/mod#go-mod-file-go
> The go directive sets the minimum version of Go required to use this module. Before Go 1.21, the directive was advisory only; now it is a mandatory requirement: Go toolchains refuse to use modules declaring newer Go versions.
Seems like a non-issue. I'm still unclear what TFA is saying
IIRC the go version in go.mod is a minimum go toolchain version. In the future if you pull in a dep with a higher minimum toolchain version, it would be unbuildable without upgrading your toolchain. I expect there is an env variable to disable this, though.
I don’t see downloading a new copy of go which matches a dependency’s version requirement as being fundamentally different from downloading that dependency’s other dependencies. It’s surprising due to being a new thing, but in five years I suspect people will consider it completely obvious and normal.
> I don’t see downloading a new copy of go which matches a dependency’s version requirement as being fundamentally different
Well, you'd be wrong then. If you went into a car dealership and asked for a new rearview mirror, and they gave you (and charged you for) a new car, you'd be rightly surprised.
Or better yet, if you ask apt-get to update a package, you damn sure don't expect it to update itself.
Nothing else works like this, and shouldn't. It's just not what most people need or expect, or even want.
`go build` is a very high-level command. I'd compare it to going to a car dealership and asking them to 'make the car usable'. You think only a small part is broken but the dealership determines that it's incompatible with the current law and you need to get a new one.
If you want to build go project in a hermetically sealed environment (where it doesn't fetch anything unknown or unexpected from the internet), use for example Nix. When building software with Nix, the sandbox doesn't have access to the internet and all project dependencies must be specified via the Nix language.
... except that Visual Studio Installer, the pkg tool on FreeBSD, and others do do various forms of self-update. Visual Studio Installer does exactly what you describe, updating itself before updating any version of Visual Studio. FreeBSD pkg has an update from the static to the packaged version of itself.
So several things in the world at present work like this; whatever one might say about their being expected, needed, or wanted.
I think the difference is that the go compiler is not expected to be at the same layer of the system as go modules. Having apt update apt is expected, since apt is just another .deb, having pkg update pkg is expected because pkg is just another package, but this is like running pkg and it deciding that your system is too old so it runs freebsd-update for you.
> Or better yet, if you ask apt-get to update a package, you damn sure don't expect it to update itself.
If a package requires a newer version of apt and said version is available then I would absolutely expect apt to do so. apt is after all just another package.
> Or better yet, if you ask apt-get to update a package, you damn sure don't expect it to update itself.
This might be true for apt, because that’s how it always worked, but is this true of modern package managers? Like if I do `brew install X` I expect the latest version of X, which might require updating brew. I definitely don’t want the latest version of X my local brew install happens to know about. As an example, doesn’t pretty much every Dockerfile installing dependencies from apt call apt-get update -y?
Go 1.21 will (probably) download newer toolchains on demand by default
Will it fail gracefully and quickly when firewall rules do not permit Go to make outbound connections? If I give it a tcp-reset will it stop trying? Can I set an environment variable that says to never try in the first place?
Why would I want this? Simple. I want to know that a set of applications and libraries at specific versions have gone through Dev -> Security Audit / Code Review -> QA -> Load Testing -> Staging -> Production. Bug fixed? No problem, rinse and repeat the QA process.
I'm... really not sure I agree with this, from a philosophical point of view. It feels like this is making "eh, we'll just upgrade our Go version next quarter" too easy; ultimately some responsibility toward updating your application's Go version to work with what new dependencies require should fall on Us, the application developers. Sure, we're bad at it. Everyone's lived through running years-old versions of some toolchain. But I think this just makes the problem worse, not better.
It's compounded by the problem that, when you're setting up a new library, the `go` directive in the mod file defaults to your current toolchain, most likely a very recent one. It would take a not-insignificant effort on the library author's part to change that to assert the true minimum version of Go required, based on libraries and language features and such. That's an effort most devs won't take on.
I'd also guess that many developers, up to this point if not indefinitely (because education is hard), interpreted that `go` directive to mean more "the version of go this was built with" than "the version of go minimally required". There are really major libraries (kubernetes/client-go [1]) which assert a minimum go version of 1.20, the latest version (see, for comparison, the aws-sdk, which specifies a more reasonable go1.11 [2]). I haven't, you know, fully audited these libraries, but 1.20 wasn't exactly a major release with huge language and library changes; do they really need 1.20? If devs haven't traditionally operated in a world where keeping this value super-current results in actually significant downstream costs in network bandwidth (go1.20 is 100MB!) and CI runtime, do we have confidence that the community will adapt? There are millions of Go packages out there.
Or, will a future version of Go patch a security update, not backport it more than one version or so, and libraries have to specify the newest `go` directive version, because manifest security scanning and policy and whatever? Like, yeah, I get the rosy worldview of "your minimum version encodes required language and library features", but it's not obvious to me that this is how this field is, or even will be, used.
Just a LOT of tertiary costs to this change which I hope the team has thought through.
Is it possible to fully embed the "go sdk" into your own golang program?
The motivation is shipping my single binary program which would include go itself. Then the program could proxy the "go" command to build and run code without needing "golang" installed or downloaded if airgapped.
Go downloading a toolchain is no different from rustup downloading the version of rust specified in a toolchain toml file. At least with Go it will presumably only download officially released versions, and not gotip/nightly.
The rustup toolchain toml file is under your control; dependencies have no effect on it.
This Go feature is like the rust-version key in Cargo.toml, where if a dependency specifies in its own Cargo.toml file that it needs a rust version higher than what you have, you will get an error and your code won't compile (there is no option to auto-upgrade the toolchain in rust).
This is incorrect afaict. The Go docs specify that the toolchain directive is only considered in the "main" module. If your dependencies set it, it has no effect (unless they're binary tool dependencies that you go install).
According to the docs[0] you can't declare a go version in the main module or workspace that is lower than the go version of one of your dependencies.
> A module’s go line must declare a version greater than or equal to the go version declared by each of the modules listed in require statements. A workspace’s go line must declare a version greater than or equal to the go version declared by each of the modules listed in use statements.

> For example, if module M requires a dependency D with a go.mod that declares go 1.22.0, then M’s go.mod cannot say go 1.21.3.
so the "go" directive is like "rust-version" in Cargo.toml and the "toolchain" directive is like the rust version in a rust-toolchain file.
It's totally different.
In go "go build" will fetch a new toolchain without your asking it to.
For that matter it seems every go subcommand will do this (even --help [1]).
Not having full control over the compiler is generally bad if your product needs to be supported for a long time or has to undergo some type of certification.
Rustup has slowed down Rust adoption in certain industries. Yes, it is technically possible to download a fixed version and stick to it, but it will be challenging when the rest of the ecosystem is on nightly...
> Not having full control over the compiler is generally bad if your product needs to be supported for a long time or has to undergo some type of certification.
rustup, and this mechanism in Go, don't take away your control over the compiler. You can always choose to build your software with whichever version you like.
> Rustup has slowed down Rust adoption in certain industries. Yes, it is technically possible to download a fixed version and stick to it, but it will be challenging when the rest of the ecosystem is on nightly...
The vast majority of the Rust ecosystem is on stable, not nightly. Crates that require nightly tend to get limited adoption.
> rustup, and this mechanism in Go, don't take away your control over the compiler. You can always choose to build your software with whichever version you like.
It certainly overrules your choice by default. Before, you could choose whatever version you wanted and it would work; now, you have to run the version you want, and also set an option to tell it to not just ignore the version you installed.
It does take away your decision by default. This is opt-out, not opt-in.
You don't get to choose when it happens or to what version.
This is a totally different beast than rustup.