
This just shows that the percentage of total time a function takes is not the metric to look at when optimizing code, if what you actually want to know is how much faster your code got.

For example, say your function f() takes 100% of the time. You then make it twice as fast. You look at the percentage and, surprise, it's still taking 100% of the time!
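To make that concrete, here's a rough Python sketch (the function is made up): the profiler percentage stays pinned at ~100% no matter how fast f() gets, so you have to compare wall-clock numbers before and after.

    import time

    def f(n):
        # stand-in for the hot function the profiler blames
        return sum(i * i for i in range(n))

    start = time.perf_counter()
    f(10_000_000)
    elapsed = time.perf_counter() - start

    # If f() is essentially all the program does, a profiler will report
    # ~100% of time in f() both before and after you make it twice as
    # fast; only the absolute number reveals the improvement.
    print(f"wall-clock time: {elapsed:.3f}s")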




Naturally, if you can look at a program which is now twice as fast and be unimpressed you are looking at the wrong metric.

It's understandable though in the context of the sequence of actions you might go through as a tinkerer (as opposed to a scientist). You start with a list of functions and the % of time they take. Improving the top one will have the biggest effect, so this is a good thing to look at when deciding what to do. Then you go back to the same list afterwards.

I can see this (and lots of variations of it) happening.


> Naturally, if you can look at a program which is now twice as fast and be unimpressed you are looking at the wrong metric.

Or, a computer scientist would say, the right one. The actual speed doesn't matter as much as the complexity.

In reality, actual speed does matter in the end; but it's also important to consider the type of optimization you're doing, and not just stop at being impressed with "twice as fast."

For example, a function given only a constant-factor optimization that makes it twice as fast at input size 100 would still be just twice as fast at input size 100,000, but a function rewritten into a lower complexity class could be twice as fast at size 100 and a thousand or so times as fast at size 100,000.

The latter optimization would be significantly better, objectively, and thus, it would make a lot of sense to look at the first program and be unimpressed with a mere linear doubling in speed.
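Back-of-the-envelope version of that (the cost model and the constant C are invented purely for illustration):

    import math

    C = 8  # assumed constant-factor penalty on the smarter algorithm

    for n in (100, 100_000):
        quadratic = n * n                # original O(n^2) cost model
        halved = quadratic / 2           # the "twice as fast" tweak
        smarter = C * n * math.log2(n)   # O(n log n) redesign, bigger constant

        print(f"n={n:>7,}: constant-factor win {quadratic / halved:.1f}x, "
              f"complexity win {quadratic / smarter:.0f}x")

With these made-up numbers the redesign is only ~2x faster at n=100, but hundreds of times faster at n=100,000.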


> Or, a computer scientist would say, the right one. The actual speed doesn't matter as much as the complexity.

Except, as an engineer would say, in practice the complexity doesn't matter as much as constant factors and the time and cost of design, implementation, and execution.


Neither is strictly correct in isolation. Of course the complexity matters, and of course the constant factors and real-world costs matter. Often, spending time on a core algorithm whose complexity is far too high will save you time, money, and grief down the road. Engineering is, of course, knowing how to make the tradeoffs; but it certainly also involves knowing when one of those smart tradeoffs is time invested in getting your algorithmic complexity down.


There is a sorting network called AKS which has O(log(n)) depth. Batcher's Odd-Even Mergesort has a depth of O(log^2(n)). Which is better?

In all cases where you can fit n in the total memory capacity of the earth, Batcher's network is vastly superior. Constants matter a whole lot more than you might think.

I'm not arguing against revising algorithms to look for better complexity classes, but if you rewrite your algorithm in a smaller complexity class you had better be sure that your constants are of reasonable size.
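Rough numbers (the AKS constant below is an assumed ballpark; the real one depends on the construction, and it being in the thousands is exactly the problem):

    AKS_CONST = 6100  # assumed rough figure for the depth constant

    for bits in (20, 40, 64):             # n = 2^20, 2^40, 2^64 elements
        batcher = bits * (bits + 1) // 2  # Batcher depth: lg(n)*(lg(n)+1)/2
        aks = AKS_CONST * bits            # AKS depth ~ C * lg(n)
        print(f"n = 2^{bits}: Batcher depth {batcher:>5}, AKS depth {aks:>7}")

Even at n = 2^64, Batcher's O(log^2 n) network is roughly two orders of magnitude shallower than the "asymptotically better" one.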


For sure. But that's a far more nuanced argument than "twice as fast." :)


It's like grocery shopping with coupons, though: if you shouldn't be calling the function at all, then even a 99% reduction isn't useful in that context.

Would you rather call a slow function, a fast function, or skip it altogether?

(A coupon only saves you money if you would buy the item in the first place.)
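In code terms, something like this toy Python sketch (names invented): the biggest win is often not making the expensive call faster, but not making it at all.

    def expensive_lookup(key):
        # stand-in for a slow call (DB hit, network round trip, ...)
        return sum(ord(c) for c in key)

    # Before: called once per row, even though the answer never changes
    def total_before(rows, key):
        return sum(r * expensive_lookup(key) for r in rows)

    # After: the call is hoisted out of the loop entirely
    def total_after(rows, key):
        factor = expensive_lookup(key)
        return sum(r * factor for r in rows)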


And then no matter what, if you're broke you will go to all lengths, jumping through hoops and calculating tiny savings even if it's a waste of time... Because you're broke :p

Sorry, what's the context? Admin / account and login areas don't matter much, because the users there want it :p they'll wait...


I've experienced admin/internal UIs that were so slow and inefficient they brought down the consumer site... and yes, I'm talking algorithmically: processing data in triple nested loops because it was "quick and easy", and then the data grew beyond a relatively small threshold (something like 10k records was where it started to break).
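The classic shape of that failure mode, sketched in Python (the data model here is hypothetical): matching records by key with nested loops vs. building an index first.

    # O(n*m): fine at a few hundred records, painful past ~10k
    def join_slow(orders, users):
        return [(o, u) for o in orders for u in users if u["id"] == o["user_id"]]

    # O(n+m): build a dict once, then look up each key
    def join_fast(orders, users):
        by_id = {u["id"]: u for u in users}
        return [(o, by_id[o["user_id"]]) for o in orders if o["user_id"] in by_id]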





