
I have to disagree. Naive algorithms are absolutely fine if they aren't causing performance issues.

The comment you are replying to is making the point that “better” is context dependent. Simple is often better.

> There is no doubt that the grail of efficiency leads to abuse. Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%. - Donald Knuth



Having a human-visible delay to calculate a single statistic about a small block of numbers is a bad thing.

Do not use such a naive algorithm on arrays this big. If this code is actually going to be used in something, it's a performance issue.

In general these optimizations don't take much time to think through, and many of them are fine as far as debugging and maintenance go. The first prompt-engineered version is fast and simple.

(Though the issue isn't really the algorithm; it's that you don't want to be doing much number and string crunching in pure Python.)
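A rough sketch of that last point, since the actual code under discussion isn't shown here: the data size and the statistic (a plain mean) below are invented for illustration, but the pattern of a pure-Python loop versus pushing the work into compiled code (NumPy) is the general idea.

    import time
    import random

    import numpy as np

    # Made-up data; the real block of numbers from the thread isn't available.
    data = [random.random() for _ in range(5_000_000)]

    # Naive pure-Python loop: fine for small inputs, but interpreter overhead
    # dominates once the array gets large.
    start = time.perf_counter()
    total = 0.0
    for x in data:
        total += x
    mean_py = total / len(data)
    print(f"pure Python: {time.perf_counter() - start:.3f}s")

    # Vectorized NumPy: the same arithmetic happens in compiled code.
    arr = np.asarray(data)
    start = time.perf_counter()
    mean_np = arr.mean()
    print(f"NumPy:       {time.perf_counter() - start:.3f}s")

Same result either way; the difference is only where the per-element work runs.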


> Writing naive algorithms

Depends on the circumstance and on how difficult an appropriate algorithm is to write, but in my experience, if code performance is important, this tends to lead to large, painful rewrites down the road.



