There seem to be a few points in your comment; let me try to answer them as best I can:
The Go 1.5 compiler is not slower because it is written in Go, it is slower because it was mechanically translated from another language. Human eyes will improve it over time.
Garbage collection is a throughput hit the language designers (and I) are willing to accept for improved safety and simpler APIs. I'd rather not pay it, but I choose it over spending a large chunk of my API documentation describing various ownership scenarios like I did in C++. It's a tradeoff I find acceptable for most programs. You won't find me using a GC on a clock slower than 100 MHz, or in a sub-millisecond realtime system (but I probably won't be using Linux there either).
I'm also willing to pay the performance hit on non-critical generic code, and I use interface{} for that where I can. If benchmarks show it's a problem, I'll do something differently. That might be hand-rolling an algorithm, which is unfortunate for the programmer who follows me and has to read it. But it doesn't come up much.
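To make the tradeoff concrete, here is a minimal sketch (my own illustration, not code from the comment) of the interface{} style described above: one shared implementation that works for any comparable type, at the cost of boxing and dynamic comparison instead of compile-time specialization.

```go
package main

import "fmt"

// Contains reports whether needle appears in haystack.
// Using interface{} trades static type safety and some speed
// (values are boxed, comparison is dynamic) for a single shared
// implementation -- acceptable for non-critical code paths.
func Contains(haystack []interface{}, needle interface{}) bool {
	for _, v := range haystack {
		if v == needle {
			return true
		}
	}
	return false
}

func main() {
	xs := []interface{}{1, 2, 3}
	fmt.Println(Contains(xs, 2))
	fmt.Println(Contains(xs, 9))
}
```

If a benchmark later shows this path is hot, the interface{} version can be replaced with a hand-rolled, type-specific loop, which is exactly the "do something differently" escape hatch mentioned above.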
The performance price of generating large amounts of extra code in the compiler is reasonably well understood, and not something that would be amenable to simply trying and benchmarking. One would have to implement generics, then spend several programmer-years tuning the compiler over various programs; compilers are complex machines.
Again, I haven't met a compiler expert who doesn't think that generating code for widely used generics would be expensive. And they would be widely used: any good generics solution would have to replace maps and slices, and would permeate the standard library.
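A rough sketch of where that code-size cost comes from (my illustration, assuming a generics implementation that specializes per type): today these per-type variants are written by hand only where needed, but a specializing compiler would emit the equivalent of each body for every instantiated type, across every package that uses generic containers.

```go
package main

import "fmt"

// Without generics, per-type copies are written by hand, and only
// where a program actually needs them. A compiler that monomorphizes
// generics would stamp out the machine-code equivalent of bodies like
// these automatically, once per instantiation -- that multiplication
// is the code-generation cost being debated.
func SumInts(xs []int) int {
	var s int
	for _, x := range xs {
		s += x
	}
	return s
}

func SumFloat64s(xs []float64) float64 {
	var s float64
	for _, x := range xs {
		s += x
	}
	return s
}

func main() {
	fmt.Println(SumInts([]int{1, 2, 3}))
	fmt.Println(SumFloat64s([]float64{1.5, 2.5}))
}
```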
Or do you think that just maybe using speed as a reason not to do something is a bit of a cop-out, especially without any benchmarks to back it up?