The problem is that browsers choose not to implement high-quality justification algorithms like the Knuth-Plass algorithm that TeX uses, because those are computationally intensive. That's why justified text looks like garbage on the web.
There are some experimental JavaScript implementations, but without browser support, reflowing high-quality justified text is a non-starter on the web.
TeX's line-breaking algorithm is certainly not computationally intensive. On my 7-year-old MacBook Pro, it takes 0.34 seconds to run the entire 495-page TeXbook on a single thread. That includes parsing, macro expansion, page breaking, lots of (slower) math layout, and DVI output, which means the line breaking takes at most a few hundred microseconds per page.
Remember, TeX was written to be usable on a 1-megabyte, 10-megahertz machine, where it ran at about a page a second. One of my contributions at the time was to modify the Pascal compiler on the SAIL PDP-10 to count cycles for every machine instruction executed by TeX, over all users, over a number of months; Knuth fine-tuned the inner loops of TeX here and there based on the results (the code that automatically inserts kerns and ligatures got the most attention, IIRC).
My comment was based on what I've read about this from multiple supposed authorities on web development. Intuitively it made sense that you wouldn't want to ship a computationally intensive algorithm in browsers on mobile devices, or when the content area changes size frequently, as on web pages. It's fascinating to have those assumptions overturned by someone so deeply involved in TeX.
Well, if you're going to be nice about it, here's some more info: the Mozilla discussion claims that TeX's line-breaking algorithm is "quadratic," which seems a bit far-fetched. So I just pulled the raw text of Moby Dick off the web, removed the blank lines so it's all one paragraph, and ran it. TeX produces 112 pages (hmmm, it was just "Volume 1") in 2.1 seconds. That's 30x slower than "normal-size" paragraphs, but hardly quadratic, since the single-paragraph Moby Dick is 1000x the size of the average paragraph in The TeXbook. Of course, as pointed out elsewhere, with a little effort one could make minor changes that would remove even this speed penalty.
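For intuition about why the cost isn't quadratic in practice, here's a toy dynamic-programming line breaker in the spirit of Knuth-Plass (my own sketch, not TeX's actual code; TeX also models stretchable glue, hyphenation, and demerits, all omitted here). The inner loop only scans back as far as one line's worth of words before the line overflows, so the total work is roughly (number of words) × (words per line), not (number of words) squared:

```python
def break_lines(words, width):
    """Break words into lines of at most `width` characters, minimizing
    the sum of squared leftover space per line (last line is free)."""
    n = len(words)
    # prefix sums of word lengths for O(1) line-length queries
    pref = [0]
    for w in words:
        pref.append(pref[-1] + len(w))

    INF = float("inf")
    best = [0.0] + [INF] * n   # best[j] = min cost to lay out words[:j]
    prev = [0] * (n + 1)       # prev[j] = start index of the last line

    for j in range(1, n + 1):
        # scan candidate line starts; stop as soon as the line overflows,
        # which bounds the inner loop by the words-per-line count
        for i in range(j - 1, -1, -1):
            line_len = pref[j] - pref[i] + (j - i - 1)  # words + single spaces
            if line_len > width:
                break
            slack = 0 if j == n else (width - line_len) ** 2
            if best[i] + slack < best[j]:
                best[j] = best[i] + slack
                prev[j] = i

    # walk the prev[] chain backward to recover the break points
    lines, j = [], n
    while j > 0:
        i = prev[j]
        lines.append(" ".join(words[i:j]))
        j = i
    return lines[::-1]
```

On the classic small example, `break_lines("aaa bb cc ddddd".split(), 6)` yields `["aaa", "bb cc", "ddddd"]` (total squared slack 10), where a greedy breaker would pick `["aaa bb", "cc", "ddddd"]` (slack 16) — the global optimization is exactly what makes TeX's output look better.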
I'm much more sympathetic to the point that, while TeX's line-breaking algorithm can easily handle paragraphs with a different length for each line, it needs to know at the start what those line lengths are. It's not clear how to generalize it to handle layouts where the length of the nth line of a paragraph depends on the earlier (or later!) line breaks. Think tall floating figures that impinge on the text area of the paragraph they're in. I'm guessing that was the real impediment to using it in Web-land.
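To illustrate why known-in-advance line lengths are easy but break-dependent ones aren't, here's a variant of a toy dynamic-programming breaker (again my own sketch, not TeX's code) where line k has its own width `widths[k]`, fixed before breaking starts — the DP table just gains a line index. If `widths[k]` instead depended on where the earlier breaks happened to fall (a float pushing text aside mid-paragraph), the table's states would no longer be well-defined up front, which is the generalization problem described above:

```python
def break_lines_varying(words, widths):
    """Break words into lines where line k may be at most widths[k] chars,
    minimizing squared leftover space per line (last line is free)."""
    n = len(words)
    pref = [0]
    for w in words:
        pref.append(pref[-1] + len(w))
    INF = float("inf")

    # best[(j, k)] = min cost to lay out words[:j] on the first k lines
    best = {(0, 0): 0.0}
    prev = {}
    for k in range(1, len(widths) + 1):
        width = widths[k - 1]
        for j in range(1, n + 1):
            for i in range(j - 1, -1, -1):
                line_len = pref[j] - pref[i] + (j - i - 1)
                if line_len > width:
                    break
                base = best.get((i, k - 1), INF)
                slack = 0 if j == n else (width - line_len) ** 2
                if base + slack < best.get((j, k), INF):
                    best[(j, k)] = base + slack
                    prev[(j, k)] = i

    # choose the cheapest line count that consumes every word
    k = min((kk for kk in range(1, len(widths) + 1) if (n, kk) in best),
            key=lambda kk: best[(n, kk)])
    lines, j = [], n
    while j > 0:
        i = prev[(j, k)]
        lines.append(" ".join(words[i:j]))
        j, k = i, k - 1
    return lines[::-1]
```

For example, `break_lines_varying("aaa bb cc ddddd".split(), [3, 6, 6])` fits the first line into a narrow 3-character measure and the rest into 6, producing `["aaa", "bb cc", "ddddd"]`.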
My assumption was also that performance is the reason we aren't getting more aesthetically pleasing line breaking. Until I read a comment[1] by Patrick Walton, who works on WebRender at Mozilla, that is.
> ... it's not possible in the general case, at least not with the specs as they are today.
That's a fair response, but how about changing the (CSS) specs to allow better line breaking? Surely that would take less time than WebUSB, and Google or Mozilla could quickly push it through the W3C.
I'm not a web developer, so take this with a huge pinch of salt, but, if floats are the problem, does that imply that with layouts that use CSS Grid or Flexbox, we could have a decent justification algorithm?