
> Naturally, if you can look at a program which is now twice as fast and be unimpressed you are looking at the wrong metric.

Or, a computer scientist would say, the right one. The actual speed doesn't matter as much as the complexity.

In reality, speed does matter in the end; but it's also important to consider the type of optimization you're doing, and not just stop at being impressed with "twice as fast".

For example, a function optimized by a constant factor to be twice as fast for input size 100 will still be only twice as fast for input size 100,000, whereas a function moved into a lower complexity class could be twice as fast at size 100 and a thousand or so times as fast at size 100,000.

The latter optimization is objectively far better, so it makes a lot of sense to look at the first program and be unimpressed with a mere constant-factor doubling in speed.
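
To make that concrete, here's a minimal Python sketch (not from the thread): a linear search made twice as fast per element versus a switch to binary search, which changes the complexity class.

    import bisect

    def linear_search(haystack, needle):
        # O(n): scans every element; a constant-factor tweak keeps it O(n).
        for i, x in enumerate(haystack):
            if x == needle:
                return i
        return -1

    def binary_search(sorted_haystack, needle):
        # O(log n): assumes the data is kept sorted.
        i = bisect.bisect_left(sorted_haystack, needle)
        if i < len(sorted_haystack) and sorted_haystack[i] == needle:
            return i
        return -1

    # At n = 100, binary search does ~7 comparisons vs ~50 on average.
    # At n = 100,000 it does ~17 vs ~50,000: the gap keeps growing.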




> Or, a computer scientist would say, the right one. The actual speed doesn't matter as much as the complexity.

Except, as an engineer would say, in practice the complexity doesn't matter as much as constant factors and the time and cost of design, implementation, and execution.


Neither is strictly correct in isolation. Of course the complexity matters, and of course the constant factors and practical realities matter. Often, fixing a core algorithm whose complexity is far too high will save you time, money, and grief down the road. Engineering is, of course, knowing how to make the tradeoffs; but it certainly also involves knowing when one of those smart tradeoffs is time invested in getting your algorithmic complexity down.


There is a sorting network called AKS which has O(log(n)) depth. Batcher's Odd-Even Mergesort has a depth of O(log^2(n)). Which is better?

In every case where an input of size n fits in the total memory capacity of the Earth, Batcher's network is vastly superior. Constants matter a whole lot more than you might think.

I'm not arguing against revising algorithms to look for better complexity classes, but if you rewrite your algorithm in a smaller complexity class you had better be sure that your constants are of reasonable size.
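
A rough back-of-the-envelope comparison in Python makes the point; the AKS constant below is an illustrative assumption (published bounds put it in the thousands), not an exact figure.

    import math

    AKS_CONSTANT = 6100  # assumed for illustration; the real bound is "in the thousands"

    def aks_depth(n):
        # O(log n) depth, but with an enormous constant factor.
        return AKS_CONSTANT * math.log2(n)

    def batcher_depth(n):
        # O(log^2 n) depth with a small constant: k*(k+1)/2 for n = 2^k.
        k = math.log2(n)
        return k * (k + 1) / 2

    for n in (2**10, 2**20, 2**40, 2**80):
        print(f"n=2^{int(math.log2(n))}: AKS ~{aks_depth(n):.0f}, Batcher ~{batcher_depth(n):.0f}")

    # Batcher only loses once log2(n) approaches the AKS constant,
    # i.e. somewhere around n = 2^12000 -- far beyond any physically storable input.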


For sure. But that's a far more nuanced argument than "twice as fast." :)


It's like grocery shopping and coupons, however. If you shouldn't be calling the function, then even a 99% reduction is not useful in that context.

Would you rather call a slow function, a fast function, or skip it altogether?

(A coupon only saves you money if you would buy the item in the first place.)
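
A tiny, hypothetical Python sketch of that point: a cheap guard that avoids the expensive call entirely beats any amount of tuning inside it.

    def expensive_report(records):
        # Stand-in for costly work; making this 2x faster still costs something.
        return sorted(records)

    def handle_request(records, report_requested):
        if not report_requested:
            # The coupon case: the cheapest call is the one you never make.
            return None
        return expensive_report(records)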


And then, no matter what, if you're broke you'll go to any lengths, jumping through hoops and chasing tiny savings even if it's a waste of time... because you're broke :p

Sorry, what's the context? Admin / account and login areas don't matter much, because the users there want the result :p they'll wait...


I've experienced admin/internal UIs so slow and inefficient that they brought down the consumer site... and yes, I'm talking algorithmically: processing data in triple-nested loops because it was "quick and easy", until the data grew beyond a relatively small threshold (something like 10k records was where it started to break).
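
For illustration only (the record shapes here are made up), this is the kind of fix involved: replacing nested scans with dictionary lookups turns a roughly O(n^3) join into a roughly O(n) one.

    def join_slow(users, orders, items):
        out = []
        for u in users:                      # O(U)
            for o in orders:                 # x O(O)
                if o["user_id"] != u["id"]:
                    continue
                for it in items:             # x O(I)
                    if it["order_id"] == o["id"]:
                        out.append((u, o, it))
        return out

    def join_fast(users, orders, items):
        users_by_id = {u["id"]: u for u in users}      # O(U)
        orders_by_id = {o["id"]: o for o in orders}    # O(O)
        out = []
        for it in items:                               # O(I)
            o = orders_by_id.get(it["order_id"])
            if o is None:
                continue
            u = users_by_id.get(o["user_id"])
            if u is not None:
                out.append((u, o, it))
        return out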



