The article seems to claim that if we followed the author's advice and rewrote all of our code to be data-oriented instead of object-oriented, our code would run ~2X faster and all of our performance problems would be solved. However, I got my first 64-thread server CPU 15 years ago and my first 64-thread desktop CPU six years ago, so I could get a much larger speedup by keeping my OO code and making it concurrent instead (64X > 2X, at least for work that parallelizes well). And if being 2X faster really would make everything better, we could simply wait 18 months and let Moore's law deliver the 2X, rather than spending the much longer time it would take to redesign our languages and rewrite all of our code.
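To make the "keep the OO code, add concurrency" option concrete, here is a minimal C++ sketch. The `Widget` type and its `update()` method are hypothetical stand-ins for existing object-oriented code; the point is that when each object's update is independent, the same OO loop body can be fanned out across all hardware threads with no data-layout rewrite at all.

```cpp
// Minimal sketch: parallelizing unchanged OO code with C++17 parallel
// algorithms. Widget and update() are hypothetical examples, not code
// from the article under discussion.
#include <algorithm>
#include <execution>
#include <vector>

struct Widget {
    double position = 0.0;
    double velocity = 1.0;
    // The OO method stays exactly as it was.
    void update(double dt) { position += velocity * dt; }
};

int main() {
    std::vector<Widget> widgets(1'000'000);

    // std::execution::par runs the same loop body across all available
    // hardware threads (on a 64-thread CPU, up to ~64X for independent
    // work). With GCC this typically requires linking TBB (-ltbb).
    std::for_each(std::execution::par, widgets.begin(), widgets.end(),
                  [](Widget& w) { w.update(0.016); });
}
```

Of course, the near-linear scaling assumed here only holds for embarrassingly parallel work; loops with shared mutable state need synchronization and will scale worse.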