
I guess I'm making a wider claim about the effectiveness of funding 'directed innovation'.

Very often innovations happen when you fund the accomplishment of some previously unaccomplished task: building a nuclear bomb, sending a man to the moon, decoding the genome. The innovations come about because you have smart people trying to solve something nobody has ever tried to solve before, and to do that they have to revolutionize things a little.

I'm not aware of a single case where the goal was defined in terms of innovation itself, as in "find a better way to program", and anything useful or even slightly innovative resulted. Teams chartered that way are by definition doing something that lots of people are trying to do all the time. It's just very unlikely that you are creating conditions novel enough to produce even a slightly new idea or approach.

Generally what you get is a survey of how things are currently done, plus some bad ideas about how you could tweak the current state of affairs a little. But if there were a way to just patch up what we already know how to do, it's very likely someone has already tried it; in fact, it's likely a thousand people have.



Sorry, I added an edit to my post above before I saw this. Just to summarize:

I think that's a more reasonable complaint, but I fear it's too vague to be applicable.

The STEPS folks would probably say that a modern computing environment in ~20kloc is something that was previously unaccomplished and thought to be unaccomplishable, but you're writing that off and not counting it as such, presumably because it failed.

On the other end of the spectrum, things like Git (to my knowledge) did come out of the "find a better way to do source control" incremental-improvement mindset. (Of course, you can say the distributed model was "previously unaccomplished," but the line here is blurry.)


I don't think Git itself is a revolution or a new technology. It took what people were already trying to do (but with an extremely frustrating user experience, like taking an hour to change branches locally) and just did it very well, by focusing on what is important and throwing away what isn't. It's an achievement of fitting a solution to a problem.

I don't think they DID build a modern computing environment at all. They built something that kind of aped one, but unlike Git they missed the parts that make a computing environment useful for users. It's more like one of those demo-scene demos where you say, "Wow, how did they get a Commodore 64 to do that!?"

If they had built a modern computing environment in 20k LOC, that would be a trillion-dollar technology. Imagine how much faster Microsoft or Apple could introduce features and build new products if they could do the same work with two orders of magnitude less code! That alone is strong evidence this wasn't actually the result.


> I don't think they DID build a modern computing environment at all.

I agree that, now, after they've tried and failed, we can say they didn't build a modern computing environment in 20kloc.

My point is just that, when they were pitching the project for funding, there was no real way to know that this "trillion dollar technology" goal would fail whereas nukes/the moon mission/etc. would succeed. Hindsight is 20/20, but at the beginning I don't think any of these projects defined themselves as "doing something that lots of people are trying to do all the time"; instead they would probably have said, "nobody else has tried for a 20kloc modern computing system; we're going to be the first."

Given that they all promised to try something revolutionary, I'm not sure it's fair to claim after the fact that it was obvious one would fail to achieve that while another would succeed.

But I do take your point that it's important in general not to fall into the trap of "do X but tweak it to be a bit better" and expect grand results from that recipe.



