> tell browser vendors to fix their damn engines

Parsing performance of the type they're talking about is an algorithmic fact, not an implementation problem.

Your suggestion is like telling engineers to fix the laws of physics.
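
For concreteness, here's a minimal sketch of the ambiguity usually cited in the nesting discussions (the selector is contrived purely for illustration):

    .card {
      color: blue;                 /* a declaration: property "color", value "blue" */
      color:hover { color: red; }  /* a nested rule: type selector "color" plus :hover */
    }

When the parser sees the ident "color" followed by a colon, it can't tell which case it's in until it reaches the ";" or the "{", and since a selector can be arbitrarily long, the lookahead needed to decide is unbounded. That's the algorithmic part; how much it costs in practice is the implementation question.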




Sass parses with those exact nesting rules, so it's clearly possible for the browser to do it as well; it's just a question of the performance hit at runtime.
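
To make that concrete (made-up class names, SCSS syntax), the case at issue is a nested rule whose selector starts with a bare element name. Sass has handled this forever, while the CSS nesting syntax as originally specified required a prefix like "&", if I remember the restriction correctly:

    /* SCSS: nested rule starting with a bare type selector */
    .card {
      color: black;
      a { color: blue; }
    }

    /* Originally restricted CSS nesting: a prefix was required */
    .card {
      color: black;
      & a { color: blue; }
    }

Of course Sass pays the lookahead cost once at build time, whereas a browser pays it on every page load, which is exactly the performance question.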

In the end, having the runtime be fast is likely more important than perfect syntax (since we can still pre-compile from nicer syntax), but the actual numbers and real-world use cases matter a lot.

Are these worst-case performance scenarios actually realistic or common? I didn't see anybody speak to that, but I also haven't followed this issue in years. In a lot of web standards discussions I've seen contributors pose theoretical worst cases as something that should meaningfully shape a feature's design, when those scenarios are extremely rare and unrealistic in practice.


The fact that the article raises the possibility of this limitation being removed in the near future means that it's not a law of physics, only a matter of finding a better algorithm.

Even if it were a law of physics, engineers can often circumvent such limits with smart caching and other tricks.

We've been waiting for this feature for decades already. It would have been better if they'd invested a couple more years in finding a better parsing algorithm before standardizing a halfway implementation.


This was a hot topic in the discussions, and here they're simply doing damage control for a wider audience, in an excellently political way.

The problem really is law-of-physics-like. But the performance impact (even of a naive implementation) was never presented for public evaluation.



