Agreed. I was just bitching about that on twitter.
I was just working through some code that checks for the intersection of two lists and then iterates over the result. Creating new slices to collect values, with loops inside loops, is not a productive use of my time.
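The sort of one-off boilerplate being described looks roughly like this (sketched for strings; the function and its name are hypothetical, and it has to be rewritten for every element type):

```go
package main

import "fmt"

// intersectStrings collects, into a fresh slice, every element of a
// that also appears in b -- nested loops, new slice, per-type code.
func intersectStrings(a, b []string) []string {
	var out []string
	for _, x := range a {
		for _, y := range b {
			if x == y {
				out = append(out, x)
				break
			}
		}
	}
	return out
}

func main() {
	fmt.Println(intersectStrings([]string{"a", "b", "c"}, []string{"b", "c", "d"}))
	// → [b c]
}
```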
I don't really want generics personally, I would just like some magic methods on top of the existing slice/maps in Go. They already are magic in the way they work so why not provide those few helper methods and solve a massive pain point I have with Go.
What magic? Go is very unmagical, and that’s the way it should be. If you need to merge two lists you can just write a method that does that. Publish it as a package and reuse it. In Go algorithms should be apparent. Go is a code heavy language, not data heavy. You should see and understand the real cost of your solution.
Well, if you had generics, you could do that. Without generics, you get to either (a) write an efficient method that merges 2 lists of Foos, which is not really package-and-reuse material, or (b) write a horribly inefficient method that merges 2 lists of some interface type, forcing your users to copy their data to new slices every time (since an []x is never an []y).
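A rough sketch of both options, with a hypothetical `Foo` standing in for whatever concrete type you actually have:

```go
package main

import "fmt"

// Foo is a stand-in for some concrete element type.
type Foo struct{ ID int }

// (a) Efficient but type-specific: must be rewritten per element type.
func mergeFoos(a, b []Foo) []Foo {
	out := make([]Foo, 0, len(a)+len(b))
	out = append(out, a...)
	return append(out, b...)
}

// (b) Reusable but inefficient for callers: a []Foo is not a
// []interface{}, so every caller copies into a new slice first.
func mergeAny(a, b []interface{}) []interface{} {
	out := make([]interface{}, 0, len(a)+len(b))
	out = append(out, a...)
	return append(out, b...)
}

func main() {
	foos := []Foo{{1}, {2}}
	fmt.Println(mergeFoos(foos, []Foo{{3}}))

	// The copy that option (b) forces on every caller:
	boxed := make([]interface{}, len(foos))
	for i, f := range foos {
		boxed[i] = f
	}
	fmt.Println(mergeAny(boxed, []interface{}{Foo{3}}))
}
```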
Faced with these 2 wonderful options, most people simply decide to copy or write again the same piece of code...
There’s also option (c), which is better than (b) but still awful: write a method that takes two interface{}s and uses reflection to figure out their concrete types at runtime and merge them. No type safety, unidiomatic, decently ergonomic (assuming you panic on type errors), and probably reasonable performance for large slices.
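A minimal sketch of that reflection-based option (c), panicking on type mismatches as described:

```go
package main

import (
	"fmt"
	"reflect"
)

// mergeReflect concatenates two slices of the same element type,
// discovering the types at runtime. No compile-time type safety.
func mergeReflect(a, b interface{}) interface{} {
	va, vb := reflect.ValueOf(a), reflect.ValueOf(b)
	if va.Kind() != reflect.Slice || vb.Kind() != reflect.Slice || va.Type() != vb.Type() {
		panic("mergeReflect: arguments must be slices of the same type")
	}
	out := reflect.MakeSlice(va.Type(), 0, va.Len()+vb.Len())
	out = reflect.AppendSlice(out, va)
	out = reflect.AppendSlice(out, vb)
	// Caller must type-assert the result back to its concrete type.
	return out.Interface()
}

func main() {
	fmt.Println(mergeReflect([]int{1, 2}, []int{3, 4}).([]int))
	// → [1 2 3 4]
}
```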
Copying is encouraged. Like I said, having the algorithm there is a good way to make it apparent what the code is actually going to do and the associated cost of the solution.
interface{} could be avoided by defining an interface that reflects the behavior you actually need from the types involved in the algorithm. If you only need to test for equality, you could have an algorithm that works on []Eq. That is one of the downsides of Go: unlike in Haskell, interfaces are usually defined by the consumer, which makes negotiating common interfaces like Ord and Eq difficult.
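A sketch of that consumer-defined approach (the `Eq` interface and `point` type here are made up for illustration):

```go
package main

import "fmt"

// Eq is a consumer-defined interface: the only behavior the
// algorithm needs is an equality test.
type Eq interface {
	Equal(other Eq) bool
}

type point struct{ x, y int }

func (p point) Equal(other Eq) bool {
	q, ok := other.(point)
	return ok && p == q
}

// contains works on any []Eq -- but callers must first copy their
// concrete slices into a []Eq.
func contains(xs []Eq, target Eq) bool {
	for _, x := range xs {
		if x.Equal(target) {
			return true
		}
	}
	return false
}

func main() {
	pts := []Eq{point{1, 2}, point{3, 4}}
	fmt.Println(contains(pts, point{3, 4}))
	// → true
}
```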
>Copying is encouraged. Like I said, having the algorithm there is a good way to make it apparent what the code is actually going to do and the associated cost of the solution.
In 2019, having a generic implementation of the algorithm is also "a good way to make it apparent what the code is actually going to do and the associated cost of the solution" in most languages.
On top of that, it helps you have less code (easier to check; code is a liability), skip manual text-template code-generation pre-processing steps, lets you write what you know you want without a context switch to copy something you've already written for another type, and stops bugs stemming from e.g. changing things in one implementation of the same algorithm and not another (because you're forced to maintain 10 implementations for different types).
And copying has been an anti-pattern since the times of Algol. Except if it's replaced by the "wrong abstraction" prematurely. But nobody ever called having a generic version of an algorithm "the wrong abstraction".
I know Go encourages copying code, but that is a well-known anti-pattern in all other language communities, one of the few with some research behind it (as someone else mentioned, basically the only quantitative science we have on software engineering says that more code implies more bugs).
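For what it's worth, the generic implementation being described did eventually arrive: with type parameters (added in Go 1.18, well after this discussion), the intersection algorithm is written once, and the cost is just as apparent as in the hand-copied version:

```go
package main

import "fmt"

// Intersect returns the elements of b that also appear in a,
// written once for any comparable element type.
func Intersect[T comparable](a, b []T) []T {
	seen := make(map[T]bool, len(a))
	for _, v := range a {
		seen[v] = true
	}
	var out []T
	for _, v := range b {
		if seen[v] {
			out = append(out, v)
		}
	}
	return out
}

func main() {
	fmt.Println(Intersect([]int{1, 2, 3}, []int{2, 3, 4}))
	// → [2 3]
	fmt.Println(Intersect([]string{"a", "b"}, []string{"b", "c"}))
	// → [b]
}
```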
The idea of []Eq sounds nice, but again, the problem is that I can't pass an []X to a function that wants an []Eq; I need to build a new slice and copy all my elements over. This is correct from the type-theory point of view (as slices are not covariant), but of course Go doesn't offer anything else that would be more user-friendly and still correct, such as a covariant read-only slice that we could use instead.
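The copy being described, made concrete (`Eq` and `id` are hypothetical names for illustration):

```go
package main

import "fmt"

type Eq interface{ Equal(Eq) bool }

type id int

func (a id) Equal(b Eq) bool {
	v, ok := b.(id)
	return ok && a == v
}

func main() {
	// Go slices are invariant: a []id is not a []Eq, even though
	// id implements Eq. So we allocate a new slice and copy
	// every element just to call an []Eq-based algorithm.
	ids := []id{1, 2, 3}
	eqs := make([]Eq, len(ids))
	for i, v := range ids {
		eqs[i] = v
	}
	fmt.Println(len(eqs), eqs[0].Equal(id(1)))
	// → 3 true
}
```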
> Copying is encouraged. Like I said, having the algorithm there is a good way to make it apparent what the code is actually going to do and the associated cost of the solution.
It's also a good way to make bugs proliferate throughout your code base. IIRC bug count correlates well with line count regardless of the language(!).
Yes, Go seems to come from a mentality that “excess abstraction is evil”, where “excess abstraction” is defined as “any pattern enabled by a language feature developed since the ‘80s”.
It’s madness. When writing application software in any language, you’re already umpteen levels of abstraction away from what’s really happening under the hood. You call a library that calls a library that calls another library over FFI that itself dynamically links to a function written in another language that’s all compiled down to object code that runs in little tiny timeslices chosen by a scheduler in an operating system that abstracts away the hardware running on a CPU that actually rewrites instructions using microcode to actually run on an internal RISC core but one more level of abstraction on top of all this is apparently where everything will go wrong.