I've read this (and similar diatribes by "real" computer scientists) before, and although it's compelling - and, hell, I'd be willing to try the whole "formal methods" thing if it meant I spent less time debugging - it kind of overlooks the whole "we are successfully using computers to solve real-world problems by letting non-mathematical 'dummies' like us at them" aspect. He seems to be reasoning, "yes, you're achieving results, but it would be more efficient if you spent two or three decades mastering formal methods before you produced a line of code," which sounds suspiciously to me like the way project managers reason: "yes, you can produce a program that solves the business problem if you just get started, but you would do so much more predictably if you spent six months 'estimating' it before you embark on a couple of weeks of programming."