Hacker News

These stories really expose how fast and efficient computers can actually be. I remember a post (don’t have the link) where someone sped up a Python script by 3000x or something crazy like that with C++. A script anyone would’ve been fine running, but it was just so much slower than what was possible.

Sometimes I wonder whether we should focus more on these micro-optimizations. We usually have this “it’s fast enough” attitude, but probably everything we use today (including web services) could feel instant if real focus were given to optimization (although yes, I understand the drawbacks of focusing solely on that).




We, as in the average everyday programmer (and especially your web services example), should definitely not focus on these micro adjustments, I would say. Sure, it's fun. If you write library-type code, then yes, please do focus on them! If you're a run-of-the-mill corporate dev, very rarely, I would think.

Other, much simpler optimizations are usually possible in "corporate code". Lots of stupid things being done. To use your Python script example, probably you could've gotten a 2950x improvement by rewriting the bad parts better, but still in Python :)
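To make that concrete, here's a hypothetical example (not from the original post, just an illustration) of the kind of "bad part" that's cheap to fix without leaving Python: repeated membership tests against a list are O(n) each, while converting to a set once makes each check O(1).

```python
# Hypothetical illustration: same logic, with and without a very common
# Python performance mistake. "x in some_list" scans the whole list every
# time; "x in some_set" is a hash lookup.

def count_hits_slow(items, allowed):
    # allowed is a list: every "in" check is O(len(allowed))
    return sum(1 for x in items if x in allowed)

def count_hits_fast(items, allowed):
    allowed_set = set(allowed)  # one-time O(len(allowed)) conversion
    return sum(1 for x in items if x in allowed_set)

# Both give the same answer; only the running time differs.
assert count_hits_slow([1, 2, 3, 4], [2, 4]) == count_hits_fast([1, 2, 3, 4], [2, 4]) == 2
```

With large inputs the slow version is quadratic and the fast one is linear, which is exactly how you get multiple-orders-of-magnitude speedups while staying in Python.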

Do you remember what the Python script did/was for? How much time was spent building the original script? How much was the guy who built it paid?

How long did it take to build the C++ version and how much was that guy paid?

If these were all open source/unpaid efforts, what would it cost to hire these different types of people, and could the company conceivably pay those salaries and keep the guy happy and busy enough to stick around?


I understand all that and apply it day to day. But my thought here is more about the possibilities if the performance of most things was brought to its absolute best. There’s a company that went from 30 servers to 2 just by rewriting from Ruby to Go (I can find the link later if interested). Even if they could have achieved a similar speedup while keeping Ruby, the point is that something was massively inefficient and they didn’t really care, as they were in “we're shipping fast” mode and making money. Which I understand perfectly, but that’s a big waste of energy and resources.

Not that I think that should be priority #1 but it’s staggering how much more efficient things can be than the “naive” approach.


Perhaps you're thinking of Matt Parker's (of Stand-up Maths and Numberphile fame) video:

https://m.youtube.com/watch?v=c33AZBnRHks


I think that’s it! It really shows my point: the first version of the code was already OK and solved the problem in a reasonable amount of time; in a normal “let’s ship faster” setup, the ridiculous speedup that was achievable would never even have been considered. 4 million percent faster! A whole class of possibilities opens up at that point.
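For scale, here's the arithmetic behind a figure like "4 million percent faster", reading "X% faster" as a throughput ratio (the runtime numbers below are made up for illustration, not taken from the video):

```python
# Unpacking "X% faster" as: new_speed = old_speed * (1 + X / 100).
percent_faster = 4_000_000
speedup = 1 + percent_faster / 100   # 40_001x as fast
print(speedup)                       # 40001.0

# A hypothetical job that used to take a month:
old_runtime_s = 30 * 24 * 3600       # ~2.6 million seconds
new_runtime_s = old_runtime_s / speedup
print(round(new_runtime_s, 1))       # ~64.8 seconds
```

So a speedup in that range turns a month-long batch job into roughly a minute, which is what opens up that whole new class of possibilities.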



