Indie game Haunts unable to compile its Go code (thedailywtf.com)
178 points by pavel_lishin on May 14, 2013 | 193 comments



How is this Go's fault? Just because you can pull code automatically from a remote repository doesn't mean you always should. It seems to me that keeping the entirety of the code necessary to compile a project in a local build directory is a very good idea if you don't want to compile against moving targets.


Maintain your own forks of the libraries you need. Commit your changes there, and also submit upstream. Pull upstream changes back to your fork when you are ready. You have this problem with any language or tool that uses github directly as the registry.


Maintain your own forks of the libraries you need.

Exactly this. This is just one reason I really like to "git clone --mirror" onto my server, then clone from that, and work from there.
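
For anyone who hasn't tried it, the workflow looks roughly like this (the server path and upstream URL are made up):

    # on the server: take a full mirror of upstream
    git clone --mirror https://github.com/example/somelib.git /srv/git/somelib.git
    # on each workstation: clone from your own server, not upstream
    git clone myserver:/srv/git/somelib.git
    # later, refresh the mirror from upstream only when you choose to
    cd /srv/git/somelib.git && git remote update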


What's the benefit of that, as opposed to cloning upstream and working on that clone? How does the intermediate mirror help?


I've run into the problem (with some Emacs add-ons in particular) of changing something on one machine, only to miss that change on another. So, if I've got my own "central" repo, I can push changes to that and pull them down to other machines.

Since I host the repos on a server I already have online, it is also trivially easy to make them publicly "pullable" for others if need be.


It sounds like it's even worse than that - the developer was maintaining a fork of the upstream projects, but didn't put it in version control. That is, he/she used Go's github import mechanism to get a copy of a project onto their own machine, and edited the locally cached copy. So of course, when other people tried to build, the libraries didn't match up.

The real issue here seems to be that this was a half-finished project, at a very early stage of development, and the developer probably never intended their local scratch copy to be used in production [http://www.joystiq.com/2012/10/24/haunts-anatomy-of-a-kickst...]


You are correct. Go's remote imports are dangerous for long-term project maintenance, but the feature is still useful for quick, throwaway projects.

Go badly needs two things:

1. A best practice that dictates that you never import from remote repositories in production, long-term code; the feature is fine for one-offs and experimentation, but the article summarizes only one way this style of work can lead you into a maintenance world of pain. What happens if the repo you're importing from Github is deleted? What do you do for fresh clones? You're going to end up changing the URL anyway. I feel the Go community has kind of glossed over this (and I like Go).

2. An equivalent of CPAN or PyPI, which you could then import from in concert with a tool to manage those dependencies, a la:

    import (
        "cgan/video/graphics/opengl"
    )
This model works for CPAN, PyPI, and so on for a reason, and that reason is avoiding several of the dependency/merge hells that remote repos can create. CPAN gives Perl a lot, such as distributed testing in a variety of environments. I personally think such a thing is necessary for long-term maintenance of any software project that utilizes third-party libraries. This is one of Google's oversights in Go, because they have an (obviously) different take on third-party code. Here's a good case:

Developer A checks out the code clean. Five minutes later, developer B checks out the code clean. In both cases, your "go get" bootstrap script fetches two different commits, because in that five minutes, upstream committed a bug. Developer B cannot build or, worse, can build but has several tests fail for unknown reasons or, even worse, the program no longer functions properly. Developer A has none of those problems. In a world with a CPAN-like, developer B can see that he has 0.9.1 and developer A has 0.9.0, and developer B can commit "foo/bar: =0.9.0" to the project's dependency file so that everybody else doesn't suffer the same fate. In the current world, you're either massaging your local fork to keep the new commit out, or taking some other troublesome, non-scalable approach.

Building large software projects against a repository never works. You need tested, versioned, cut releases to build against, not master HEAD. It only takes one bad upstream commit to entirely torpedo your build, and you've now completely removed the ability to qualify a new library version against the rest of your code base. Other people are suggesting "well, maintain your own forks," so you're basically moving merge hell from one place to another. I, personally, have better things to do with my time; I've seen (Java) apps with dozens of dependencies before, and keeping dozens of repositories remotely stable for a team of people will rapidly turn into multiple full-time jobs. Do you want to hire two maintenance monkeys[0] to constantly keep your build green by massaging upstream repositories, or do you want to hire two feature developers? Exactly.

I've started writing a CPAN-like for Go a couple times but I'm always held back by these threads:

https://groups.google.com/forum/?fromgroups=#!topic/golang-n...

https://groups.google.com/forum/?fromgroups=#!topic/golang-n...

The second one highlights how difficult Go is to package as a language -- my personal opinion is to treat Go just like C and distribute binary libraries in libpackage, then the source in libpackage-src. If one message in that thread is true and binaries built with one compiler version refuse to work with another, I'm troubled about Go long-term.

[0]: I'm not calling all build engineers maintenance monkeys. I'm saying the hypothetical job we just created is a monkey job. I love you, build engineers, you keep me green.


You're right that syncing directly with the net is a problem. However, the problem is basically developer education. Go doesn't work like other languages and the consequences aren't well-documented. If you do it right, the process works fine; it's basically how Google works internally.

The basic idea is to commit everything under $GOPATH/src so that you never need to touch the net to reproduce your build. After running "go install" you should check in the third-party code locally, just like any other commit.

Then updating a new third-party library to sync with their trunk is like any other commit: run "go install", test your app, and then commit locally if it works. If it doesn't work, don't commit that version; either wait for them to fix it, sync to the last known good version, or patch it yourself.
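
A minimal sketch of that workflow, assuming your project root doubles as your GOPATH (the repo name is made up):

    export GOPATH=$HOME/myproj
    go get github.com/example/somelib        # lands in $GOPATH/src/github.com/example/somelib
    # drop the nested .git so your own VCS tracks the files themselves
    rm -rf $GOPATH/src/github.com/example/somelib/.git
    cd $GOPATH && git add src && git commit -m "vendor somelib at today's upstream HEAD"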

If you aren't committing your dependencies then you're doing it wrong.


> If you aren't committing your dependencies then you're doing it wrong.

Disagree strongly (and hate absolutes like "doing it wrong"). Their metadata, yes, by all means, commit that. There is absolutely no reason, however, to have the source code for a dependency in my tree, Go or not.

Give me a binary I can link against, or at least the temporary source in a .gitignored location, and let's call it a day. When I want to bump to a new version my commit should be a one-line version bump in a metadata file, not the entirety of the upstream changes as a commit. I've seen a sub-1MLoC project take 10 minutes just to clone. Internally! You're telling me you want to add all the LoC and flattened change history of your dependencies in your repo? Egads, no thanks! Where do you draw that line? Do you commit glibc?

There's just no reason to store that history unless you are in the business of actively debugging your dependencies and fixing the problems yourself, rather than identifying the issue and rolling back to a previous version after reporting the problem upstream. I guess it's paying your engineers to fix libopenal versus paying your engineers to work on your product; one's a broken shop, the other isn't. Some people will feel it's one, some the other.


Thank you.

It makes me weep when I hear gophers telling me to vendor all my dependencies. That is insane. Completely insane.


Actually what is insane is to have your production code do anything else. "pip install foo" and similar schemes open your code up to the following problems:

- incompatibilities that were introduced in the version 1.2.1 while you've only tested your code with 1.2

- the host is down so you can't compile your own code because your dependency is not available

- the host is hacked and "foo" was replaced with "malicious foo"

- exponential increase of testing (you really should test with all versions of the dependencies you use)

Ultimately, I don't understand the doom and gloom point of view. C, C++, Java, C# etc. programmers have been pulling dependencies into their repos for ages. In my SumatraPDF I have 12 dependencies. I certainly prefer to manually update them from time to time than to have builds that work on my machine but fail for other people, or any of the many other problems that result from blindly pulling third-party code.


None of the things you listed are problems. The other comment demonstrates solutions to all of them, and I do not understand your fourth bullet point in context at all.

> C, C++, Java, C# etc. programmers have been pulling dependencies in their repos for ages.

I'm not making this up: in my career, I have never worked on a project where this is the case, and I've worked for shops that write in three of those languages.

> I certainly prefer to manually update them from time to time than to have builds that work on my machine but fail for other people

That's your choice, and it's a little bit different because I'm assuming "other people" are end users -- those that want to recompile SumatraPDF from source for some bizarre reason -- not developers. Fixing a broken build is a skill that every developer should have, but not an end user. Once I learned how to write software, I never came across a situation as an end user compiling an open-source Unix package that I could not solve myself.

The opinion I'm sharing here is related to developing on a team, not distributing source for end-user consumption. It sounds like you don't develop SumatraPDF with many other people, either. Nothing like merge failures on a huge dependency that I didn't write to ruin a Monday.

Also, wait, SumatraPDF is built with dependencies in the codebase? What if a zero-day is discovered in one of your dependencies while you're on vacation for a month; what do distribution maintainers do? Sigh? Patch in the distribution and get to suffer through a merge when you return?


> C, C++, Java, C# etc. programmers have been pulling dependencies in their repos for ages.

The first time I worked on a C# project started in an age when NuGet was not widespread, I saw with dismay a "lib" directory with vendored DLLs. It does happen.


Why dismay?


Many reasons:

- Binary artifacts under version control are a no-no for me, unless we're talking assets. Third-party libraries are not assets.

- Where do these DLLs come from? How do I know it's not some patched version built by a developer on his machine? I have no guarantee the library can be upgraded to fix a security issue.

- Will the DLLs work on another architecture?

- What DLLs does my application need, and which ones are transitive dependencies?

That's many questions I shouldn't have to ask, because that's what a good package management system solves for you.


Spot on. We had this problem taking on some legacy code during a round of layoffs. They had checked in /their own/ DLLs from subprojects. It turned out that one DLL had unknown modifications not in the source code, and another had no source at all.

Another problem was that by building the way they had, they'd hidden the slowness and complexity of the build - including the same code from different branches via a web of dependencies, and with masses of unused code. They never felt this pain, so had no incentive to keep the code lean.


If you don't trust your developers, then you've got bigger problems.


Sure. But at the same time, if you make it a policy to forbid nailguns at the workplace, you have fewer people shooting themselves in the foot while you're not looking.


Great unless you're a home construction business.

Anyway, this analogy isn't helping anyone. You think libs in source control is a problem because some people might not do it properly. I'm contending that there's nothing wrong with libs in source control--there's something wrong with letting people who might not do it properly near your source control.


There are clear benefits from having a package manager (if anything, pesky things like version numbers, direct dependencies, etc are self-documented). In addition, it does prevent people from taking shortcuts, and even good people take shortcuts when the deadline is short enough.


But if you didn't write the dependency (and thus presumably don't commit to it), why would there be a merge conflict?

As for upstream maintainers rebuilding your package, I don't see how having to update a submodule is vastly different from updating the relevant versions in a file. Both seem like they'd take mere seconds.

It's not like you're writing gotos straight into library code, it's merely a bookkeeping change. You're just importing everything into your repo instead of referring to it by ambiguous version numbers. In the end, the code written and the binary produced should be identical.


- you can freeze the version: foo==1.2.1

- you don't need the internet to install dependencies. There are many options e.g., a locally cached tarball will do (no need to download the same file multiple times). Note: your source tree is not the place to put it (the same source can be built, tested, staged, deployed using different dependencies versions e.g., to support different distributions where different versions are available by default)

- if your build infrastructure is compromised; you have bigger problems than just worrying about dependencies

- you don't need to pull dependencies, and dependencies of dependencies, etc. into your source tree to keep the size of the test matrix in check, even if you decide to support only a single version for each of your dependencies.

As usual, different requirements may lead to different trade-offs. There could be circumstances where vendoring dependencies is a valid choice, but not due to the reasons you provided.


> - incompatibilities that were introduced in the version 1.2.1 while you've only tested your code with 1.2

Which is why you pin dependencies to specific versions.

> - the host is down so you can't compile your own code because your dependency is not available

Which is why you have (caching) proxy and use mirrors.

> - the host is hacked and "foo" was replaced with "malicious foo"

Which is why you use GPG signing (sadly, Python is lacking in this respect).

> - exponential increase of testing (you really should test with all version of your dependencies you use)

Which is why you only use a specific version.
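
Concretely, pinning plus a mirror might look like this in Python land (the package names and internal index URL are hypothetical):

    # requirements.txt: pin the exact versions you tested against
    foo==1.2.1
    bar==0.9.0
    # then install from an internal caching mirror instead of PyPI directly:
    # pip install -r requirements.txt --index-url http://pypi.internal.example/simple/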


No really! Go isn't like other languages. You have to think differently. Please try it!

There's no Go equivalent of, say, Maven binary dependencies for Java. If you don't check in the source code for your dependencies, your team members won't get the same version as you have and builds won't be reproducible. You won't be able to go back to a previous version of your app and rebuild it, because the trunk of your dependencies will have changed. By checking in the source code you avoid a whole lot of hurt.

Checking in source is okay because Go source files are small and the compiler is fast. There isn't a huge third-party ecosystem for Go yet.


> There's no Go equivalent of, say, Maven binary dependencies for Java.

I'm saying there should be, but not necessarily the same thing. That's my entire point.

I'm also not a fan of the "you have a dissimilar opinion to mine, so obviously you've never used Go properly" attitude in this thread. One way to read your last comment is that I've never used Go at all, though I'm giving you the benefit of the doubt and assuming you meant used Go properly. Either way, I don't get the condescension of assuming I'm unaware of everything you're explaining to me simply because I hold an opinion different from yours. Especially since half of your comment is repeating things to me that I said earlier.


Maybe it sounds like condescension. I was in the same place at the beginning. No exceptions? No generics? Heresy. How dare you Go people ignore my many years of experience? I wrote a few rants to the mailing list, which were basically ignored.

The reason I assume you haven't used Go much is that your examples of problems with checking stuff in aren't examples of problems happening in Go. It's an analogy with other languages and other environments. Such arguments don't seem to get very far.

Maybe it won't scale and something will have to give. I expect the Go maintainers will find their own solution when it happens, and it won't look like Maven or traditional shared libraries. (If anything, it might be by replacing Git/hg with something that scales better.)


Except this does not work for commercial software, which is expected to give you only packages.


Right, "go install" is for open source projects.


> There is absolutely no reason, however, to have the source code for a dependency in my tree, Go or not.

Disk space is cheap; much cheaper than time needed to fix something if it goes to hell and some remote repository isn't available.


Interestingly, the only ecosystem which suffers from such issues is Go.


You already have that in Javaland. It's called maven, and it allows you to change one number in one file to upgrade the dependency version. Clojure also has that with Leiningen.


I'm not entirely familiar with how go dependencies work, but does using git submodules for the dependencies with go make sense?


Almost. `go get` will not resolve your sub-moduled dependencies. It'll work well enough for developing your library, but will break when people try to consume your library (at least, without git cloning it)


Alternatively, a build system that expects me to throw away the change history of my dependencies is completely cracked.

Fortunately there are alternatives (go getting a personal fork, etc.) that aren't entirely anathema to maintainable long-term development.


Hopefully the project maintainer will keep a go1 branch in their repo that only contains 'released' versions of a consistent API.

And in cases where that's not obviously the case, my habit is just to fork the repo and "go get" my own fork. I can then update it as needed.

Not saying these are optimal solutions, but they are solutions. I do personally wish I could "go get" a specific commit or tag.


I agree 100% with your first point. It should be spelled out.

Your second point I am less sold on. Transitive dependencies (pkgA imports pkgB@v1 but my code needs pkgB@v2, which is incompatible with v1) are the stuff of nightmares in large systems development, which is what Go is designed for... the lack of versioned imports wasn't an oversight, it is a feature.

Centralized repos are centralized points of failure, and only as good as they are well managed. NPM versus CPAN if you will. Any serious project will localize dependencies, even if they are in CPAN, you never know when CPAN will be down or other unforeseen things might happen.


Instead what we have is that pkgA needs pkgB@then (which happens to be when the author of pkgA last cached pkgB) but my code needs pkgB@now. That's worse in pretty much every way, mostly because there are no identifiers anywhere to clearly work around or even detect the problem. I'm all for "your build can only use a single version of pkgB" (linking two versions of pkgB into the same binary is insane) but I need to say what version that is, not leave it nondeterministic and dependent on uncontrolled, unrecorded state of the machine running the build.


No, you just mirror CPAN. This is already done in lots of shops I know of for PyPI. IME, I've only ever had PyPI down on me once, and there are mirrors (that are usually up) if that is ever the case[0]. I think localizing dependencies as you say is a waste of time.

[0]: http://jacobian.org/writing/when-pypi-goes-down/


Sure, so now instead of praying that the main host won't get hacked you now should pray that none of the N mirrors will get hacked.

And if you understand the basics of probability, that's not a good scenario.


I do understand the basics of probability. The likelihood of your serving infrastructure or application being compromised is an order of magnitude higher than the most popular repositories in software development. I'm not saying it doesn't happen, but I also don't walk around worried about having an asteroid land on me simply because I understand probability. If it happens, it happens, and we deal accordingly, but using a much more difficult software engineering process because of (arguably) paranoia is silly.


And that the package(s) you're trojaning aren't signed[1] (I'm not immediately sure if new releases are automagically signed/digested when uploaded via PAUSE, or what fraction of current packages are signed)

[1] http://search.cpan.org/~dagolden/CPAN-1.94_65/lib/CPAN.pm#Cr...


or at least make sure a remote Github reference is for a specific checkout!


Exactly! But trusting a remote repo to contain an essential part required to build the project is just extremely short-sighted, no matter what language is being used.


I think it is okay, as long as you have a backup plan. Can be advantageous to keep the code required to build the project small and have people pull dependencies as required.


IIRC, Go does not support importing anything but HEAD (yet, at least).


Go supports some versioning. It will checkout from:

1. A tag named for the version of go used, or

2. A branch named for the version of go used, or

3. The master branch

But no, you can't spec a version to import explicitly.


Transitive dependencies (pkgA imports pkgB@v1 but my code needs pkgB@v2, which is incompatible with v1) are the stuff of nightmares in large systems development.

This is a feature the golang team decided on, not an oversight or something beyond their technical capabilities.


Yeah, but you can use any number of systems to keep track of it if needed. I'm using git submodules in my current project, which keeps track of what commit you are using.

Submodules are flawed in a lot of ways, but as a simple way to keep a pointer to the appropriate version of an external project they are great. It's a little extra work to set them up, but it's fairly marginal. Check them out with go get, add the repo after the fact with git submodule, and after that just use git submodule to always grab the correct versions.
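
A sketch of that setup, assuming your GOPATH is itself a git repo (the repo name is illustrative):

    go get github.com/example/somelib        # normal fetch into $GOPATH/src
    cd $GOPATH
    # absorb the existing clone as a submodule, pinned at its current commit
    git submodule add https://github.com/example/somelib.git src/github.com/example/somelib
    git commit -m "pin somelib"
    # collaborators then recover exactly the pinned commits with:
    git submodule update --init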


Github is not a substitute for a packaging system.


I don't understand the issue, or why it's such a show stopper.

If the original programmer is using git, he should have a complete history of his code. They know the date (or thereabouts) when the last version of the code "worked". Revert to a commit around that date.

For each dependency, do the same: pull the dependency's repo down, revert to the last commit from around the date of the working version, and use a new fork of that repo to get things back to the working state.

One by one, move the dependencies forward, resolving bugs as you go.


Another possibility is that the developer is using this as a convenient excuse to abandon a project where they bit off more than they could chew. $28K isn't a lot of money having already put in months of work and with potentially many months of work remaining.

The acid test will be if someone fixes their issue and says "Here you go, it's compiling again. Now just make sure you handle dependencies like this to keep everything working." Will the project be picked up again?


I was also mystified as to how not a single one of them did this after alleged months of effort.


Unless he lost his local copy, he wouldn't even need to guess which commit each repo was at around that date. The info is stored in the reflog of the local git repo (as well as the dependencies' local repos).

So you could rewind all the repos to where they were at a particular date/time with: `git co HEAD@{2012-11-26 18:30:00}` in each repo. If you run that same command in the project repo, and the dependency repos you should have a snapshot of the developers local checkout at that time.

HEAD could be a branch name or whatever ref. You can specify the time like ref@{2.days.3.hours.25.minutes.ago} as well, and these time specifiers work wherever you can specify a revision for example `git log -n1 master@{yesterday}`

see: https://www.kernel.org/pub/software/scm/git/docs/gitrevision...


Maybe they did and it didn't work? You're giving them very little credit.


That wouldn't make any sense. Either the code worked at some point, or it didn't.


You know, I haven't worked on that much software, and I've run into situations I at first thought impossible often enough that your comment just makes me laugh. :P

Unlikely? Surprising? Insane? Sure. But never call something impossible unless you can not only write down a formal proof for it but also verify your assumptions.


Unlikely, I would go with. It's unlikely that it couldn't be done. Based on the level of understanding in that thread, I'm not terribly surprised.


As other people are saying, maybe the original author made changes to the external dependencies he pulled in. Or maybe he's just bullshitting and indeed the code may have never worked. Or, sure, maybe the people trying to reconstruct the application went about it entirely the wrong way, like you suggest. It's just curious that you'd instantly pick the least charitable interpretation.


What about branching and then doing:

git reset --hard commit-hash

where commit-hash is the last known working commit.

Just do that with all the repos around the date it last worked. If there are a lot of commits between the last working version and HEAD, it is easier to do this than to revert each commit.


The problem is a dependency that changed without any of the git-tracked content changing.

My interpretation:

1. Original author writes a program using Lua. Author downloads the latest version of Lua into his system and includes and links it. Note that Lua is not checked into his version control system nor does he note the used version.

2. Lua upstream does a backwards incompatible change

3. Author extends his local (old) copy of Lua

Result: An external dev will find no combination of Lua version and program revision which compiles. The original author seems to have lost his local modified Lua version.

You can probably spot various mistakes an experienced software developer would avoid. The interesting point is that Go seems to encourage step 1 nonetheless?


Wow, that's awkward.

One of the things I spent a bit of time on when Java was being developed was class management. We had an ongoing discussion about embedding versioning information in class files, creating signatures that could be checked at runtime, and just winging it. Bill Joy was convinced that it was possible to give the class loader the ability to statically analyze a class and determine whether or not it would work with the code that was trying to invoke it. This was when I was first exposed to the idea of creating a digest type 'signature' for a method/class invocation that would encode semantics.

The end result was 'just wing it' of course. This was expedient but didn't advance the state of the art :-)

Go's implementation seems a bit more dangerous. Not the least of which being drive by malware injection attacks on abandoned/poorly administered code repositories. It seems in principle to be no more or less dangerous than something like "apt-get install libcairo-dev ; make" but that would only be true if there was a trusted repository system.

Can someone from Go comment on the ability to create and manage trusted repositories? Should we look for a go 'distro' to build with?


> It seems in principle to be no more or less dangerous than something like "apt-get install libcairo-dev ; make" but that would only be true if there was a trusted repository system.

Not that this is really a response one way or another, but given the way that Go currently works, you effectively place the same amount of trust in each individual library maintainer as you do with the collective trusted repository maintainers in other approaches.

If someone were to decide to start curating repositories and forking/maintaining them, people could choose to trust that individual. So the possibility for a trusted repository exists, but as far as I know nobody has fully stepped up to create such a thing at this point.


> Can someone from Go comment on the ability to create and manage trusted repositories?

You basically just need to throw up an HTTP-accessible HTML page with some <meta> tags that describe how to fetch the actual code with a variety of source control systems. Anyone can create their own trusted repositories.

[docs] http://golang.org/cmd/go/#hdr-Remote_import_path_syntax
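
The tag itself looks roughly like this (the host and paths are illustrative):

    <meta name="go-import" content="example.org/mylib git https://code.example.org/mylib">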


When you install a package with `apt-get`, that package can potentially change your system in any way it pleases. Running `go get` will download and compile code, but there is no security risk until you run or use that code without checking it first.


You're comparing things that are different. When you install a package with apt-get:

- it comes from a repository that has been signed by the distributor.

- the package has been tested and has a good level of integration with the system (ie different distributions may package things differently).

- depending on the distribution, there may be a commitment to API/ABI stability (ie. Debian backports security fixes most of the time).

That said... running "go get" is very convenient if you control the code repository where you're fetching the code, but it's not a good idea when using 3rd party software that is in active development.

EDIT: formatting


If you don't trust your package manager & distribution supplied packages (read: your whole operating system), the security risk between an 'apt-get' and 'go get' is most likely not the most important thing to worry about.


> Bill Joy was convinced that it was possible to give the class loader the ability to statically analyze a class and determine whether or not it would work with the code that was trying to invoke it.

Here, 'work' must mean 'has the methods and other stuff the client code is looking for with the types the client code expects', except the Java type system isn't really expressive enough to do a very good job of that, especially if there are any generics involved (such that a lot of stuff has been hammered down to type Object). But Java didn't have generics that long ago, did it? Something like Eiffel, with design-by-contract and automated contract checking, would be able to make more use of the concept.

Pretty trivial, especially if you can rely on the compiler to do dead code elimination before checking to see what the client code is expecting. I don't know if javac (as opposed to the JIT) does dead code elimination, though.


The actual explicit definition of work was 'not throw a ClassCompatibility Exception' because the JVM was convinced that the method it was about to call was the same semantically as the method had been when the calling class was compiled.

The idea was to create a 'semantic message digest' (I don't know if such a thing exists or not) which could be computed at class load time and checked/validated at method invocation time. The canonical case for the talking point was: you have a class Foo with methods doA, doB, and doC; if you change doB incompatibly but the client only calls doA and doC, the class is still 'compatible' (in a call contract context) with the caller even though the class is 'different' in an incompatible way. Guy Steele and Bill came up with a pretty interesting list of things like methods added, methods missing, more parameters, fewer parameters, additional fields, fewer fields, additional constructors, etc. All the ways you might mutate a class and yet, for a class of callers, it would still work just fine.

If one can do this, another interesting feature is better capability type systems. Those systems have to deal with changes that may change the capability represented by a method even though it is still semantically compatible.


But it does not protect you from the guy who changes his API without changing the signature of the method ("off with these pesky empty lists, let's return null instead").


That's correct, and ultimately these were the rocks I flailed against for a few months before giving up. There are ways you can change what something does that don't look semantically different to the compiler. That means you can't capture a semantic signature completely.


Go didn't screw anything over. It sounds like poor version control practices. Although the Go team eschews versioning of dependencies, they do so mainly because they are in control of said dependencies. If you're building software that relies on untrusted source code, do yourself a favor and make a copy of it that is known to work well and use that copy in your build environment.


>Go didn't screw anything over.

Erm, yes it did. The civilized world (Java, Ruby, Python, Clojure, Scala, Haskell, OCaml) has version numbers in their dependency management. Albeit out-of-band from the source files (pom.xml, gemfile, requirements.txt, project.clj, sbt, .cabal, OPAM version pinning) but it does work.

Hell with Clojure and Leiningen (the standard choice for dependency mgmt) even the language version is a per project dependency ala:

[org.clojure/clojure "1.5.1"]

So you can still build jars that "just work" even if it's some legacy stuff that you haven't updated to the latest Clojure version yet.

That's not to say Clojure would've helped here - it's a game. And one that's aiming for better graphics - JVM isn't a good idea.

But that doesn't excuse Go's poor package management design, which decided that being facile and hiding the complexity of the real problems (changes breaking existing code) was more important than working builds.

The decision to make it facile, weak, and fragile was egregious and very out-of-character with the rest of their design decisions. I think Go can be faulted for failing at their own goals on this particular matter.


> Erm, yes it did. The civilized world (Java, Ruby, Python, Clojure, Scala, Haskell, OCaml) has version numbers in their dependency management.

Go has localised dependencies: you package your code with exactly the versions you need.

> So you can still build jars that "just work" even if it's some legacy stuff that you haven't updated to the latest Clojure version yet.

And with a Go project, one would have a complete system which 'just works,' complete with all dependencies, regardless of how old those dependencies actually are.

> But that doesn't excuse Go's poor package management design that decided being facile and hid complexity of the real problems (changes breaking existing code) was more important than working builds.

Go doesn't do what you think it does (probably not your fault: the article is misleading). It only pulls down updates if you tell it to. It sounds like the developer of the original code was doing the wrong thing, not using GOPATH the way it was designed, not checking his entire source tree—including dependencies—into version control. I could be wrong, of course: it's possible that he really did use it properly but discovered a misfeature I've not yet found.


"The developer should have done X!", well, then why doesn't Go just do it by default instead of choosing an approach which has been discarded by pretty much every other modern language/ecosystem?

I think the fact that every Go proponent attacks the developer of the game is one thing which keeps me from using Go. The community seems to be incredibly closed-minded and hostile.


This line of reasoning seems to make it hard to suggest that certain things are and aren't idiomatic in different languages without being accused of being closed-minded and hostile. If I read an article claiming that Ruby sucks because of some project where the developer decided to use a Makefile instead of bundler, I would say "the developer should have used bundler!", and you could accuse me of the same hostility you're accusing Go proponents of here. Conventions really do exist for a reason, and really should be followed until you understand them well enough to reject them for a good reason.


Amazingly, other communities manage to do that without being so utterly condescending.


I'll see your anecdotes with my own; I've had nothing but extremely pleasant interactions with the Go community and think it is much less closed-minded and condescending than most.


I noticed the same about the Go community. Pretty hostile and condescending. It seems that the attitudes of some of its authors are reflected in the community.


Go developer here. It really looks like the game developer is trying to cover something up here. The article is very misleading. Go doesn't work as they describe - it implies that the library can change underneath you and that's not true.


> it implies that the library can change underneath you and that's not true.

From the perspective of one developer's checkout it can't. But a fresh clone of the project, made after the dependencies have moved, yields a different product from an earlier clone; the libraries have "moved" between the two. I think that counts as "changing underneath you" for some definition of that term.

That being said, I don't really think it is Go's fault that the developer didn't understand they need to keep a clone of their dependencies if they don't want them to move.


> But a fresh clone of the project, made after the dependencies have moved, yields a different product from an earlier clone

No, because a project's tree contains the source code of all dependencies. Why do folks keep on saying this?

Here's how it works: you create a directory foo-proj, then export GOPATH=/path/to/foo-proj:$GOPATH (or whatever you like); then you run 'go get github.com/baz/bar example.org/quux'. Go downloads the current version of the bar and quux libraries, and then creates foo-proj/src/github.com/baz/bar and foo-proj/src/example.org/quux, putting the right files in the right place; it then builds each package, putting the object files in foo-proj/pkg.

As the developer, you configure your VCS to ignore foo-proj/pkg, then you commit foo-proj. You might put your own code in foo-proj/src/fooproj, or foo-proj/src/example.com/foo or whatever.

When another developer clones your project, he gets foo-proj, which includes foo-proj/src, which contains foo-proj/src/github.com/baz/bar and foo-proj/src/example.org/quux and everything else.
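
To make that layout concrete (same illustrative names as above):

    foo-proj/
        src/
            github.com/baz/bar/      # dependency, committed
            example.org/quux/        # dependency, committed
            example.com/foo/         # your own code
        pkg/                         # object files; ignored by the VCS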

I suspect what happened in this case is that the developer was building his code as another library, rather than as a project, and pulling it into his GOROOT, letting Go grab his dependencies but not putting them into version control.

It's also possible that he was doing the right thing, but didn't realise that git wasn't tracking submodules without his involvement.


> I think the fact that every Go proponent attacks the developer of the game is one thing which keeps me from using Go.

Nah, not that I like Go very much, but while reading this article I was like "oh, and how is this Go's fault?". It is like I'd depend on some library and then the library gets pulled from the internet or updated in an incompatible way; how is this a problem of the language that I develop in?

I might as well blame my text editor for allowing me to do stupid things.


> I might as well blame my text editor for allowing me to do stupid things.

The difference is whether the editor turns around and deletes /home as a punishment (Go community) or tries to educate the user about a better approach without acting like an asshole (pretty much all other language communities).


> "The developer should have done X!", well, then why doesn't Go just do it by default

It _does_ do it by default; the problem in this case appears to be (although I could be wrong) that the developer went out of his way to do the wrong thing. As I've noted elsethread, I could be wrong; possibly he was misled by something else.

But the default way Go handles dependencies is exactly the way one would expect: you code against one version of a dependency, and have to manually pull in a new one.


Actually, the civilized world has version numbers in the source too:

   use Moose 2.08;
That's valid Perl and will blow up if you have Moose 1.02 installed. Of course, it'd be even better if you could be more specific than just "2.08 or greater", but it's still useful as is.


That still allows your deps to be stateful and out-of-band of the build. I'd prefer the build to be exclusively aware of version numbers in one place and let the code be version agnostic.

I'm not a Perl user, I don't consider running sed on my source code to be a "bonus".

Still better than Go though.


There are a number of tools in Perl that will scan your code base and automatically generate a dependency list based on what it finds. The one I use is called Dist::Zilla (http://dzil.org/), and it automates the entire packaging and library release process. Again, this is how civilized languages work.

I'm sure Perl isn't unique here, of course.


> I'm not a Perl user, I don't consider running sed on my source code to be a "bonus".

That was completely unnecessary.


How else do you deal with all the import invocations in the source code files with outdated versions?

M4? Make + C preprocessor? Perl generating Perl? I tried to guess the least-absurd option for updating a library.

Please inform the rest of us what the standard tooling for handling this in the Perl community is.


Of course. A true Perl user has no need for sed, when there's "perl -pi -e".


> poor version control practices

I agree that, if this problem had been foreseen during development, better version control could have helped.

> Go didn't screw anything over

This is where I disagree. Having a language feature to fetch dependencies from mutable online sources means that you're practically asking for programs to break unpredictably based on external changes. The local cache makes it worse, because every machine will have a slightly different version of the library, and bugs won't be reproducible.


> Having a language feature to fetch dependencies from mutable online sources means that you're practically asking for programs to break unpredictably based on external changes.

Go devs always said that Go is designed to solve Google problems in Googley ways. All Google code builds from HEAD, at least it did when I was last there. The reasoning for this is that breaking changes can be discovered ASAP, instead of depending on code several versions back that might be much more painful to upgrade should the need arise.

This is another case of the Google way having a sane reasoning, but perhaps does not work in other cases. I'd personally prefer some versioning system, but I can see why they've done it the way they did.


Thanks for the insight. I fully support the "build from HEAD" philosophy for the code I completely control. Which is quite different from building stuff from 3rd party HEAD. I assume Google does not build its kernels from kernel.org HEAD.

Looks like Google should not advertise Go as a general purpose language for everyone to use? Especially since the tone of Go-related postings here on HN is that everyone should replace C, PHP, Python, Java, JavaScript, etc. with Go because it's better, faster, and more scalable on the server.


By your argument, no one should use the Google cache of jquery. Or any of these: https://developers.google.com/speed/libraries/devguide#Libra...

Do I understand you correctly?


The Google cache of jQuery is for a specific version. It will not change from under you.

Yes, you are trusting Google not to maliciously change the file to something evil, but that is quite different to linking to the trunk of a repository that is expected to change.


Yes, and that is why it is the developer's fault and not Go's. The script tag doesn't magically know that this is a specific version of jquery; it will happily let you reference code from a version that will change from under you too. The script tag simply allows importing any external code; it's up to you to choose a good source. In exactly the same way, this Go feature can be used to reference code that is not expected to change (for example, by referencing a specific tag in a github repo), but can also be "misused" by referencing something that will change from under you. There exists no way to write this feature such that it only allows linking to "immutable sources". It's the same as writing a bootstrap script that downloads HEAD sources and compiles them, and then getting angry at bash or wget because it should "know about versioning".


Presumably the developer didn’t come up with this on their own, but instead learned it from a Go tutorial or documentation.

Visiting http://golang.org/doc/code.html I see examples like this:

    import "code.google.com/p/go.example/newmath"
There is no discussion about versioning in the 'Remote Packages' section - instead we are told "This convention is the easiest way to make your Go packages available for others to use.” My reading is that importing from HEAD is a Go convention.


Importing from HEAD _is_ the Go convention (so far as I can tell from others' code); it's also Go convention (again, in my experience) to have each system in its own source tree specified in GOPATH, with all remote dependencies installed locally at a known version.

Again, as I mentioned elsethread, I'm pretty new to Go, so perhaps I've been misusing it.


with all remote dependencies installed locally at a known version.

How would you guarantee that known version? Using the normal import above would not do this, esp. if you are working with other developers or have to switch computer/reinstall your dev environment/update. You'd get the current head (not a known version) of all dependencies whenever you run go get on a fresh install, because there is no package versioning in go import statements or go get.

The only way currently to guarantee you get a known version is to fork your dependencies, put all the code in your own repository and update them manually. That's not the end of the world but it's not as elegant as the rest of the go build system.


> How would you guarantee that known version? Using the normal import above would not do this

Because that's the source code sitting in your src tree. That's actually exactly what 'go get' will do: pull the source code of the current version of the dependency into your src tree and build objects in your pkg tree.

> The only way currently to guarantee you get a known version is to fork your dependencies, put all the code in your own repository and update them manually.

Nope, you just use (e.g.) a git submodule for the dependency. When you need to, but only when you need to, you can update each dependency or all at once.


There are definitely workarounds (submodules would be even better than forking probably), but default go get behaviour is to clone dependencies into your src tree - not a known version of them, but just the latest head. So if you're sharing a package with dependencies, you'd have to first check in your dependencies and not rely on go get to manage them.

Using go get on its own is not enough to end up with a known version - each go get can pull different code, unless you have manually set up dependencies first as part of the package or checked in the entire src tree and shared that. This works fine for one person working on code but obviously requires a bit more work if you're sharing code with dependencies with others.

I'm not trying to say it's impossible to manage or a huge flaw in go get, but it does require you to deal with dependency versions explicitly yourself, unlike many other packaging systems.


> So if you're sharing a package with dependencies, you'd have to first check in your dependencies and not rely on go get to manage them.

That's what I was describing. The developer of the dependent code pulls in the dependency with go get, and then other developers see both his dependent code and the dependency he was using.


I think that's kind of like saying that printing "Hello World" is the convention of every programming language.


Regardless of the convention, every user needs to understand how to use the tools.


Go feature can be used to reference code that is not expected to change (for example, by referencing a specific tag in a github repo)

No it doesn't. Go get or import doesn't let you specify a tag of a repo - if it did, the developers would have an easy way out of this situation, since they could have pinned their dependencies when first importing them. That's not to say this is the go developers' fault, but it does highlight a weakness in go dependency management:

Go get always checks out HEAD. They have a scheme for checking out tags, but only for golang versions - so the code to handle tags in popular repos exists - yet hardly anyone uses the language versioning support, because it just isn't (and shouldn't be) required, and it needs a specific scheme of tag names to work anyway. At present golang versioning is only used for this, and it's not widely used as a result.

Of course you can work around this, and it's hardly a showstopper - at the simplest level you could go get then checkout a specific branch with git instead before building, or fork the repo, keep local copies in your source control etc. you only have to do this once for dependencies and then they won't change unless you decide to update them.
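
At the simplest level, that workaround is just a couple of commands (the repo and tag names are made up):

    go get github.com/example/somelib
    cd $GOPATH/src/github.com/example/somelib
    git checkout v1.2.0        # rewind to a known-good tag before building
    go install github.com/example/somelib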

The bizarre thing is that go get does know about git tags and versioning it just refuses to make that useful for users unless they want to version the language, not the libraries they use - I imagine this was useful in early language development, but now that they've hit 1.0 and are promising to be backwards compatible it seems pretty pointless.


> I imagine this was useful in early language development, but now that they've hit 1.0 and are promising to be backwards compatible it seems pretty pointless.

I'm not sure why you think that. It's perfectly reasonable to want to use new features in Go 1.1 while maintaining backward compatibility (or an older version) for Go 1.0.


As go 1.1 is backwards compatible and much faster, and gofix makes the transition easy, I doubt many people will stay on 1.0 long enough to make this a useful feature to have built in. Have you used it? I think they initially tried to make the language versionless too, but backed off and went back to versions when they realised this wasn't workable; I suspect the same will happen with packages and imports long term.

I haven't seen much use of the go language auto-versioning feature in the wild, whereas lots of people have noticed the lack of support for versioning packages/imports and the implications for unpredictable builds - any dependency imported the normal way could break at some time in the future or for different coworkers who checked out at different times, unless you fork which has its own problems for maintenance. Personally I think versioning packages is more important, and would prefer to have that option rather than the option to maintain branches for different versions of go (which can be done manually where really required and is unlikely to be an issue in the 1.x series).


> Personally I think versioning packages is more important, and would prefer to have that option rather than the option to maintain branches for different versions of go (which can be done manually where really required and is unlikely to be an issue in the 1.x series).

The features have nothing to do with each other - it's not an "either or" deal.

I haven't used the `go1.0` or `go1.1` tags myself.

People have tried setting up versioned packages, but it hasn't taken. As for me, I'm quite happy with the simplicity of `go get`. It's been working great for me for over a year.


Go get fetches a go1.0 tag or similar automatically with the version of the language, if it exists. This would break for obvious reasons if it also wanted to fetch a specific tagged version of a package repo, thus I considered the two features related and exclusive.

I also like the simplicity of go get and don't feel this is big deal, but having used other packaging systems suspect that go will start versioning packages eventually as it makes life a lot easier if you are building on code from elsewhere and want to share code. The present setup works fine, with the caveat that if you don't fork all dependencies, other users might see different build results for the same code.


> but having used other packaging systems suspect that go will start versioning packages eventually as it makes life a lot easier if you are building on code from elsewhere and want to share code

I strongly disagree. I've used other packaging systems and they are a constant source of grief. I very rarely need to pin dependencies so it doesn't make sense to work with a super-powerful tool that allows dependency management at the version level. If I did, then I understand the need for the power.

> The present setup works fine, with the caveat that if you don't fork all dependencies, other users might see different build results for the same code.

That's not a caveat; that's a strength. When this happens, I get a bug report from a user, and I fix my software. Then it works.

Bitrot is a damn hard problem to solve. I'd rather it smack me in the face than creep up on me and strike when I least expect it.

N.B. I fully understand there are scenarios when reliability of builds is important. If I were in that scenario, I'd accept the pain of pinning dependencies or writing my own tool to do so. The great thing about Go is that I could do that with relatively little pain. (Its standard library contains amazing tools for analyzing Go source files.)


It's likely a bad idea to use the Google cache of "jquery". It's likely a good tradeoff to use the Google cache of "jquery x.y.z" on the assumption that they won't change it. The Go convention is the former.


Those libraries are clearly versioned from the page. That means they are immutable if you choose to use them.

An equivalent would be linking to a HEAD version of a library.


He was talking about mutable sources. These are not (you embed them with their version number, too).


It depends on your definition of mutable.

My links to documents on SGI don't seem to work...


There are a lot of people that think that using those is a bad idea, but those are at least versioned. There's a pretty big difference between pulling a specific version from an external source and always pulling the latest.


No, really, the Go answer to version control of dependencies seems to be "clone the head of the master branch of a Github repo". Blaming someone for using the only thing that seems to be supported is a little harsh.

https://groups.google.com/forum/?fromgroups=#!topic/golang-n...


>>the Go answer to version control of dependencies seems to be "clone the head of the master branch of a Github repo"

Yes, and it is only the fault of these devs that they did not do that. One of the first things I did when starting a job programming Go full time was go: whoa, we need to clone our dependencies so we can manage upgrades. It was just common sense, and easy.


How does that work with transitive dependencies? If my app uses A, which in turn uses B, do I fork A and B and then edit all of the imports in my local fork of A to now point to my fork of B? When I take new drops of A, do I have to make sure my locally changed imports don't get borked?


That's just nonsense. The issues were caused by exactly that: multiple people cloned HEAD at different times and things broke.


They obviously didn't snapshot their dependencies into a shared git repo; they just cloned them locally, which is not what I meant at all.


Wait, are you saying that embedding a reference to some other fast moving master could be dangerous? No way!


I used go get on the source code and the errors are caused by dependencies on 2 C libraries: opengl and lua. Basically the Go bindings for the C libraries are out of date and/or unmaintained.

Sample output:

    # github.com/runningwild/glop/gos
    /usr/bin/ld: cannot find -lglop
    collect2: error: ld returned 1 exit status
    # github.com/runningwild/opengl/gl
    gl.go:142: cannot use _Ctype_GLint(mapsize) (type C.GLint) as type C.GLsizei in function argument
    gl.go:147: cannot use _Ctype_GLint(mapsize) (type C.GLint) as type C.GLsizei in function argument
    gl.go:152: cannot use _Ctype_GLint(mapsize) (type C.GLint) as type C.GLsizei in function argument
    gl.go:158: cannot use _Ctype_GLenum(internalformat) (type C.GLenum) as type C.GLint in function argument
    gl.go:164: cannot use _Ctype_GLenum(internalformat) (type C.GLenum) as type C.GLint in function argument
    gl.go:170: cannot use _Ctype_GLenum(internalformat) (type C.GLenum) as type C.GLint in function argument



    /usr/bin/ld: Warning: size of symbol `luaX_tokens' changed from 8 in $WORK/github.com/xenith-studios/golua/_obj/lcode.o to 256 in $WORK/github.com/xenith-studios/golua/_obj/llex.o
    /usr/bin/ld: Warning: size of symbol `luaT_typenames' changed from 8 in $WORK/github.com/xenith-studios/golua/_obj/lapi.o to 88 in $WORK/github.com/xenith-studios/golua/_obj/ltm.o
    # github.com/xenith-studios/golua
    /usr/bin/ld: Warning: size of symbol `luaX_tokens' changed from 8 in $WORK/github.com/xenith-studios/golua/_obj/lcode.o to 256 in $WORK/github.com/xenith-studios/golua/_obj/llex.o
    /usr/bin/ld: Warning: size of symbol `luaT_typenames' changed from 8 in $WORK/github.com/xenith-studios/golua/_obj/lapi.o to 88 in $WORK/github.com/xenith-studios/golua/_obj/ltm.o
Seems like a familiar problem that every similar language encounters when using bindings to native code, not just a Go issue.


The Lua issue is weird. The bindings are maintained (last update one month ago), and the repo provides its own Lua source files (but the API of minor versions of Lua is stable, so it shouldn't matter).

They should file an issue on GitHub.

https://github.com/xenith-studios/golua/issues


Or you could fork.

Or you could turn your src folder into a github checkout with submodules for each of the things you want to import.

I struggle to see what the issue is. I knew the first time I wrote an import in Go that this was possible, and that, as the libraries for Go are relatively young, it was also likely.

My personal preference is just to keep track of changes fairly regularly and not allow much drift. But that's because up until Go1.1 I was following tip, which meant I was already in that habit.

Though it would be nice if 'go get' could take a revision or branch/tag name.
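
In the meantime you can pin a revision by hand after the fetch; a rough sketch (repo path and SHA are placeholders):

    go get github.com/foo/bar
    cd $GOPATH/src/github.com/foo/bar
    git checkout a1b2c3d   # known-good commit
    go install github.com/foo/bar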


Yeah, this article made my head hurt. It can be summarized as

"Original author didn't localize dependencies, obviously caused problems."


Yep, and the comments can be summarized as

"I don't know anything about Go but I have incredibly strong negative opinions about it based entirely on this article. Oh also, I hate GitHub."


I'm a little confused about how this would be an issue in the first place. I mean, surely all the changes are checked into a repository and the working code is on the programmer's computer, right? So why doesn't he just share the 'working' code, with all of the dependencies, if he somehow didn't check in his local copies of them?

I'm a little suspicious when it is claimed that the original author can't get it to compile. That is just so strange that the simplest explanation is it was never in a great state anyway.


Yeah... it smells a little odd, and goes against general golang development practice (when you have a dep, it ends up in src and you check in src... so... yeah).


I offered to help out on the Haunts game when they had their first round of problems. They had lost the developer who had chosen Go and so were left with no way to continue development. I wrote and offered my help and apparently so did a large number of other people because we were all sent a long questionnaire to fill in with our skills, experience and what we could offer the project. It just seemed like a big barrier to cross simply to offer my volunteer services, so I didn't bother completing it.

In retrospect they were probably being protective of their IPR, but erred on the side of being overprotective and pushing away a few folk that could have helped.


This might sound harsh, but...

Maintaining software is hard.

Blaming your own incompetence on a library or an entire language is easy.

I've been guilty of this at times, but when you get down to it, it's your job as a software developer to understand how your software works and how to maintain it well. This doesn't just include the code you write; it includes your tools, third party libraries, and the internals of the language (VM, compiler, etc.). It's not easy, but it's part of the job.

On a more positive note, versioning packages in Go is already possible and is starting to gain momentum. Check out http://www.gonuts.io/


> it's your job as a software developer to understand how your software works and how to maintain it well

And it's the job of language (and language ecosystem) developers to make this as straightforward as possible. It's perfectly possible to blame both the author and the language.


I've only been writing Go for a short while, but my experience so far is that 'go get' only pulls down updated versions of dependencies if you tell it to, with -u.

The developer should have set up a proper Go tree for his project, prepended it to GOPATH, run 'go get DEPENDENCY1…' (which would have put the complete current source of that dependency into his src directory) and then checked his project tree into his VCS (adhering, of course, to the licenses of his dependencies). Anyone who downloaded his code would have gotten the compatible version of all dependencies from the VCS.

Then when he felt it was time to upgrade the dependency, he should have committed his work (and branched in something like git), then run 'go get -u DEPENDENCY1…' to fetch the latest version, which, yes, would break his code's compilation. The next step would be to fix his code to work with the new version, then commit that.
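
Condensed into commands, roughly (all paths and import paths below are placeholders):

    mkdir -p ~/game
    export GOPATH=~/game:$GOPATH    # project workspace first in the list
    go get github.com/some/dep      # source lands in ~/game/src/github.com/some/dep
    cd ~/game && git init && git add src && git commit -m 'project plus vendored deps'
    # later, deliberately:
    go get -u github.com/some/dep   # update, fix any breakage, commit again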

It sounds like the guy was developing his project in his main Go tree, not putting his dependencies into version control, blindly updating dependencies, and so forth.

But possibly I missed something.


Go pulls down remote repos/libraries locally. If the original developer has any sort of backup of his system from the time the game was working, he will have all of the working code.

If the original developer does not have that but at least hasn't nuked his src directory, he will have all of the working code (but may need to, e.g., git bisect).


Yeah exactly. This shit is ridiculous.


From now on, remember this anecdote every time you curse at Maven for making dependency management a huge effing hassle unless you publish proper versioned drops to an official public repo.

Go's sin here is being permissive. And it's true that clearly the game dev team just... didn't do things right. They were pretty much on a collision course with failure by managing their project's code the way they did. But Go allowed them to get much further along than they otherwise would have.


I'm not making any judgements on the project itself, or its dependencies, but I see a lot of people just use code off github (or Bitbucket, or Launchpad) with no regard for the quality of said code. Just because it's on github doesn't mean it's good code, or that the maintainer has any idea what they're doing. In general, if I can't trust a project to maintain a stable master branch (or a Go version tag) following best practice guidelines, I'm not going to trust that code in my project without a full review. In that case, I'd fork it (or clone it into another VCS) and work from there.

There are some issues with the way the Go tooling handles dependencies, but I think the minimalist way in which it does is very useful, and easy to build upon. Go uses a distributed dependency system, and you can't give it the same level of trust you give to centralized repositories, like one does with gems, PyPI, NPM, CPAN, and so on.


This indie project goofed up, so now they're (probably unintentionally) hurting Go and Kickstarter with more bad publicity. It's going to be information that's hard for the public to properly understand, so it will probably hurt that understanding as well.


They claim there is nothing about Go that makes it bad for games, but there certainly is: there are no large, mature game engines for it. If someone was hiring me to write a basic mobile game that I could do with a big engine like Cocos2D, and said I couldn't use the engine, I'd probably multiply my time estimate by five.

I've written complex OpenGL ES games from the ground up before, and it's a long slog, especially at the end where you are trying to crank it up to 60 FPS and doing tons of work rewriting it to be high performance.


I think everyone here saying "just fork your dependencies" is missing the idea that it's supposed to be an open-source game.

Managing DLL hell may be acceptable to ship a game, but it's not acceptable to openly develop software.

In this case it wasn't even intended to be open-source, which may be why the dev went the route he did, but doing so essentially closed the door on the later option of easily letting the community take over on code development.

There's a reason people codified semantic versioning. Projects like gstreamer take it even further and make the major version part of the library name, even allowing for co-installability of old and new (e.g. gstreamer-0.10 and gstreamer-1.0 can both be installed and used).


I totally agree, as long as you can trust your dependencies.

Go's dependency management will even allow a library to have multiple versions depending on the version of Go being used to build (either a branch or tag matching the Go version). This isn't a major feature right now, since Go is backwards compatible within major versions, but could be much more important when there's both go1.x and go2.x in production.
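
For example, as I understand the "go1 rule" in the go get docs, a library author can publish a Go-1-compatible snapshot just by tagging it:

    git tag go1      # 'go get' under a go1 toolchain checks this out
    git push origin go1

If no matching branch or tag exists, go get simply takes the default branch.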

The problem here arises when your remote library can't maintain a stable master, either because the maintainers don't care, don't know how, or the project is new and still in flux. The onus is then on you to incorporate that code into your project.

People need to think of this system as giving all dependency developers commit access to your project (which it nearly is). Would you just let the world check in code willy-nilly, or are you going to review what's going in?


> Managing DLL hell may be acceptable to ship a game, but it's not acceptable to openly develop software.

Why not? Having something that compiles and runs is probably a bit of a boon to the development effort.


Yes, that's my point.

It's not acceptable to merely "manage" dependency hell, as the net result is still unmanageable at OSS scale. You have to essentially eliminate the interplay of "which exact version do I have" with your code, which is what things like semver and soversions (for ld.so) try to accomplish.

Being able to have an unknown user download your code and the listed dependencies and have a repeatable build is invaluable for development.


I totally agree that packaging and dependency management is Go's weak spot, and I've experienced similar issues. That said, I wouldn't blame this on golang per se. I have run into the same issues in Ruby, although versioning is a bit better there.

Whenever I see a fast-moving dependency I fork it and use my fork, then integrate upstream changes when appropriate. If you have to do that for every damn dependency I can see that being horrible, though...


Why? I can't imagine building a serious product and being dependent on a github repo I don't personally control. It is terror-inducing to even consider for any real project... even if you pin to a tag or a hash, the author can destroy or rewrite it.


Agreed. Also, if your dependency graph starts to get unmanageable, it's usually a sign you've made a mistake somewhere (likely by relying on something else that's poorly designed).

If you keep your system simple and don't rely on too many "kitchen sink" frameworks, then keeping your own source tree of your dependencies isn't that much different from maintaining a local repository of the same code in compiled form (with the added benefit of having the original source available).


Hmm, they have issues with dependency handling and 3rd party libraries.

Go doesn't have a gem/pip/cpan/npm-like ecosystem; instead you import git/mercurial repos directly. This is pretty nice for rapid development and for testing things out, but not the right approach for complex long-term projects like a game.

The original author of the game failed to include the dependencies in the project, and now they are doomed. Put the libs you rely on in your version control, be it as submodules.

I'm wondering if anybody is working on a package manager for Go libs... That would be great.


> I'm wondering if anybody is working on a package manager for Go libs... That would be great.

@aleksi is: http://www.gonuts.io/


Seems to me like lame excuses from the original developer, who made horrible version control decisions and then abandoned ship.


It's not even a Go problem, you can be equally stupid by putting "package": "*" in npm's package.json instead of, say, "package": "1.2"


IMO this is some sort of clueless scapegoating. Dependency management can be an issue in software development, but only for large projects. A project with $28k funding and probably 1 or 2 developers, seriously? And with no details like lines of code or which libraries are being used, it's hardly plausible.


I agree. I believe it is scapegoating too. The guy who wrote the mail had no clue about anything technical, and the other guys who helped him explain didn't really understand what the programmer did, and perhaps weren't even Go devs. And it seems that the original developer just left and doesn't want anything to do with the game anymore. So... they blame it on the language and third-party hobby libraries from github because they can't set up the project right. That shit happens in any language, even with in-house built libraries, if you put unskilled people on the job.


Why they would use Go, a language aimed at building scalable web services, to build a game is puzzling... I am not sure the people who made that decision actually knew what they were doing. Many open source libs and system-level linux libraries can be installed from git or github repos... that is not news at all. Can't see why people are fighting over such things. O.O


Looking at their Kickstarter page, they really could have used anything.


This sounds like a problem the community can resolve. Check http://golang.org/cmd/go/#hdr-Remote_import_path_syntax -- anyone can create a Go "repo" by using a <meta> tag in an HTML page.

I don't think it'd be difficult to create a CGAN/GoPI/Gobal/gojars/etc. that provided versioning.
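
The mechanism is just a meta tag on the page that 'go get' fetches; something like this (import prefix and repo URL are made up):

    <meta name="go-import" content="example.org/mylib git https://github.com/someone/mylib">

A versioning service could serve a different repo root per requested version behind stable import paths.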


That's a good idea. The limitation is that your service can redirect go get to a different repository url, but go will still fetch that repository and check out what it thinks is the best branch for that version of go: one with the name of the current go version, or master. So you'd have to play some tricks to keep a fully cloneable repo with master locked at some commit for each version of each library, rather than just one repo per library with tags. I guess there are some ways to do that where the clones share objects so you aren't wasting space, though, so it's not insurmountable.


Space usage doesn't worry me too much. If a "package" is just a set of files, then storing them in some sort of compressed form (.tar.gz) and simply un-tarring them and making the API mimic a git repo with one revision should be fairly straightforward.


FYI, source code at https://github.com/mobrulesgames. It seems to me the developers must have been working really hard, and the failure is due to the size of the project being way too large for the two-person team. It was unavoidable, and the dependency issue is just a symptom.


That GitHub page tells me they did clone all of the dependency repos six months ago, when they were trying to recover it.

If the game was finished and working but does not compile anymore, that sounds like a trivial thing for a hardworking developer to fix in a pretty short amount of time.

I've seen much worse when porting games across platforms. This doesn't look even close to "beyond repair".


+1. Although one might argue that the failure is not really due to the size of the project; that's just a side effect. The failure was created when they decided to use Go without understanding how it works.

Starting to build something big with tools you are not yet very familiar with is always risky.


This is ridiculous; there's a correct way to handle this, and it's called git submodules. Instead of using go get for non-trivial long-term projects, you should just be adding the dependency as a submodule in your git repo and including the files directly. This way you can have your submodule pointing to a specific commit in the upstream repo, you can update it when you want to/need to, and all this information is kept track of in version control.

Also, if you want even more control (i.e., you want to modify the library you're using in some way), fork it on github and include your fork as a submodule. That way you can patch bugs easily (and send upstream a pull request) without having to wait for upstream to fix things you know how to fix yourself.
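
A sketch of the submodule approach (names hypothetical; assumes your repo root doubles as a GOPATH workspace with a src/ directory):

    cd myproject
    git submodule add https://github.com/upstream/lib src/github.com/upstream/lib
    cd src/github.com/upstream/lib
    git checkout a1b2c3d    # pin to a known-good upstream commit
    cd - && git commit -am 'pin lib to a1b2c3d'

Anyone cloning your repo then runs 'git submodule update --init' and gets exactly that commit.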


This is ridiculous. I have come across numerous projects written in half the languages under the sun that were impossible to build and/or run for various reasons, mainly because the original developer had so much stuff in his environment that it depended on. At least Go has urls to original source repos, which is a whole lot better than mysterious binary blobs that no one knows what the hell they do, let alone where they came from (but were most likely brittle chunks of Fortran77). And if you modify someone else's code and it gets out of sync... well, that's a problem, but it's not Go's fault.


I'm a Go developer using Go for commercial projects in production.

I can tell that the article and its comments are absolutely misleading. It seems that the developer(s) are trying to justify their lack of professionalism and blame the language/tools/someone else. In the article I see total misunderstanding of how the Go ecosystem works, strange decisions, obvious failures during development and weird situations like 'original dev cannot compile the code anymore'.

I think an engineer should have a decent understanding of the tools/concepts/... BEFORE starting to use them. Not AFTER some problems occur.


The dependency problem could easily have been avoided by forking the repos you're using and building against those instead. Obviously the maintainer(s) of the original will be making changes, so wouldn't you want to make sure that if you're distributing the code, you're all on the same version?

> Why does this matter?

Maybe I'm not understanding this, but the dependency issue and the issues in this paragraph are not related. Issues like improper syntax? That's not related.

Sounds like bad planning as far as the dependency stuff goes. For everything else, sounds like a distribution issue.


Let's say I depend on github/A and github/A depends on github/B. If I fork A, that does not solve the problem. Now I need to fork B and modify import paths in A to point at my fork. Wouldn't this become a mess over time?

I assume, instead of forking, it might be possible to import a specific tag or commit hash from a git repo. If it's possible, it'd solve the backward-incompatible-change problem, but I guess the original author can always delete the github repo.

What is a good way to manage dependencies locally?


I hadn't considered the A/B issue, and yeah that can get messy. Importing a specific commit hash to the repo would be great. As far as managing dependencies locally, I couldn't say since I haven't had much experience with Go, but there exists a package manager called "Go Nuts" (see: http://www.gonuts.io/). While obviously not widely used, it seems to fix this issue. It's almost like rubygems all over again though.

What I'd like to see is a site similar to this, with a similar tool. Instead of having to upload the package to the site though, you could add a `pkg.json` file (or something similar) with application data. Every time this is committed, the site would automatically check to see if the version has changed, and then index that commit hash as a new version.

This almost sounds fun...


It's a good point. I've had it as a worry in the back of my mind for a while now, the 'best' solution for now seems to be linking to your own personal forks of projects you depend on.


Isn't that usually what people do in development? They obtain a copy or fork of a library, and only update it at their discretion? This seems like an oversight by the developers rather than by Go.

I realise that since Go supports this, it encourages you to simply do that, but it isn't wise in the long term.

For instance, I've had some issues with Go-SDL, because one of the forks became unreliable. I eventually decided to simply fork another of the forks myself to overcome all these issues, and to update my fork with commits from the main project whenever they came around.


That's a pretty heavy solution.


Sure, but it's also a common solution outside of Go. I write iPhone apps and we have all of our major dependencies forked, because at some point you're going to butt up against the limits of what your lib does. Even just redirecting a git submodule at a later point is a minor brainfuck - so nowadays we just instinctively fork as soon as we decide to pull it in as a dependency.

What I really want to know is why they chose Go in the first place. Go isn't a game development language and it has little to no support, documentation, or just plain old previous community experience in the gaming context. This feels like the classic software development of putting the technology ahead of the product - using a tool because it's cool, rather than using it because it's the best thing for the job.

It's all about shipping, people. Use whatever you need so you can ship what you want, when you want to, at the quality and reliability you desire.


> Sure, but it's also a common solution outside of Go.

In the C and C++ world, yes. Because they lack the software repositories many mainstream languages have.

The problem is that we got used to Maven, Ivy, CPAN, Gems, eggs, NuGet, ...


Back when I worked on a large commercial Python project, we did the same thing. All dependencies were on our server, and that's the only place we grabbed them from when building a new release.


I think the main problem is that we got used to these dependency tools, and many developers coming to Go don't have experience of how development was done before those tools came into existence.


Or the fact that many developers look at the previous methods and quite rightly say "That is a hideous method, we have moved on (and IMHO improved) from that, why the hell has a new language reverted to old and busted methods"

As I said in another comment regarding Go. If any company has the ability to create a stable, maintained CPAN / rubygems equivalent for the Go language, it would be Google.

Which just makes the lack of it more jarring (although I see the argument that Go is designed for Google and may not fit others' ideas of what important features are).


Why Go? Because it's new and cool, of course. The whole Haunts project already had serious problems; this was just the final nail in the coffin, if you'll excuse the pun: http://www.joystiq.com/2012/10/24/haunts-anatomy-of-a-kickst...


Not for a video game.


I really struggle to understand what is going on here. Wrong usage of a feature isn't the tool's fault. I've seen lots of projects that messed things up with gcc and Make. Should we consider gcc and Make broken too? A developer should know the pros and cons of a feature before using it.

Using an unofficial release or development version of a library is always a risk. It is not language dependent. Things can always break if you don't follow the release notes of an actively developed project.


It seems that Haunts uses Makefiles to wrap the Go tools so there is some non-standard complexity there. See https://github.com/losinggeneration/haunts/blob/type_fixes/M... which was linked from the 7 month old issue https://mrg-trac.sourcerepo.com/mrg_Haunts/ticket/34


It's bewildering to see a project that "can't compile" after having $70k+ poured into it. This article sums up this clusterfuck: http://www.joystiq.com/2012/10/24/haunts-anatomy-of-a-kickst...

Back on topic, maybe Go should take a page from node's NPM: simple versioned tarballs with isolated dependencies.


Wait, so, they were expecting to develop against version X, and then, after they released the source, it would compile easily for all its users with zero dependency problems?

I don't see how bashing Go would help these guys out of the hole they are in; the same would have happened with any other programming language in a project that uses fast-moving dependencies...

They were making a game; they could have just forked the projects they needed and built from those. That's the way to do these kinds of projects: you stick to a revision, and when there is an update to a dependency, the team agrees to make the upgrade, changing all the required source code.

From what they say, the main developer had no idea that the repositories were changing a lot, and he never updated them, so everything seemed to be going smoothly... I don't see the need to bash Go, nor the main programmer; I just believe he needed more experience to handle the project correctly.

What I don't like is their attitude: "nobody can fix this Go garbage code because Go is all wrong"...


I love how people jumped all over Go on that thread, without having any idea of what they are talking about.

Gotta love the Internets.


It's a poor workman who blames his tools.


Wow, how is this Go's fault? Shouldn't it be obvious that one has to be very careful when patching upstream libraries and maintain a forked repository and/or submit patches upstream?


Did they have a fork with local changes? The way I read it, they just did

  import "github.com/go-gl/opengl/gl"
and then at some point the go-gl project (which they don't manage) made incompatible changes that broke their build. Presumably there is some version of go-gl they could build against (which happened to be cached on their developer's machine), but they don't know which one because there's no version or tag in that string.


But this is what software development is all about anyways... it's like complaining that they have to type a lot in order to get _anything_ working. My whining detector went off.


I'm willing to bet that any Go programmer worth their salt who isn't tired of the project can get it running, if it's simply a matter of dependency version issues.

I'll take a stab at it this weekend (if I have time).

For the curious, here is their repo: https://github.com/runningwild/haunts


I bet tomorrow morning half the people here will download and back up their third-party libraries :). And perhaps even burn them to a CD (just to be safe).


Death march project fail, blame the tools.


From the Kickstarter trailer: "If we don't get this game out by Halloween, something will claim our souls".


Question: Could we give Open Source git repos permalinks?


What do you think the hashes are for? :-)


Well, I know. But it seems people need to be told about it in a different manner. Not every product fails due to the product itself. Sometimes the failure is due to how it is referred to. If we had something like a "permalink this repo" feature, then people would be moved to use that.


This project was haunted from the get-go.


Go fuck itself?



