TeX’s line-breaking algorithm is certainly not computationally intensive. On my 7-year-old MacBook Pro, it takes 0.34 seconds to run the entire 495-page TeXbook on a single thread. That includes parsing, macro expansion, page breaking, lots of (slower) math layout, and DVI output, which means that the line breaking itself takes at most a few hundred microseconds per page.
Remember, TeX was written to be usable on a 1-megabyte, 10-megahertz machine, where it ran at about a page a second. One of my contributions at the time was to modify the Pascal compiler on the SAIL PDP-10 to count cycles for every machine instruction executed by TeX, across all users over a number of months, and Knuth fine-tuned the inner loops of TeX here and there based on the results (the code that automatically inserts kerns and ligatures got the most attention, IIRC).
My comment was based on what I've read about this from multiple supposed authorities on web development. Intuitively it made sense that you wouldn't want to ship a computationally intensive algorithm in browsers on mobile devices, or use one where the content area changes size frequently, as on web pages. It's fascinating to have those assumptions overturned by someone so deeply involved in TeX.
Well, if you're going to be nice about it, here's some more info: The Mozilla discussion claims that TeX's line-breaking algorithm is "quadratic," which seems a bit far-fetched. So, I just pulled the raw text of Moby Dick off the web, removed the blank lines so it's all one paragraph, and ran it. TeX produces 112 pages (hmmm, it was just "Volume 1") in 2.1 seconds. That's about 30x slower per page than "normal-size" paragraphs, but hardly quadratic: the single-paragraph Moby Dick is roughly 1000x the length of the average paragraph in The TeXbook, so quadratic behavior would predict something like a 1000x slowdown per page, not 30x. Of course, as pointed out elsewhere, with a little effort one could make minor changes that would remove even this speed penalty.
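For anyone wondering why it isn't quadratic in practice, here's a toy sketch of the Knuth-Plass-style dynamic program (in Python, with made-up widths and a simplified badness formula; nothing here is TeX's actual code). The point is the cost model: the work per potential breakpoint is proportional to the size of the "active" list of earlier breakpoints that could still start a feasible line, and that list stays short because overfull candidates are pruned as soon as they can no longer fit:

    # Toy sketch (Python, not TeX's actual code) of the Knuth-Plass-style
    # dynamic program.  Word widths, the "slack" allowance, and the squared
    # badness are all made up for illustration.

    def break_lines(word_widths, line_width, space=1.0, slack=0.5):
        n = len(word_widths)
        # break index -> (total demerits, previous break); -1 = start of paragraph
        best = {-1: (0.0, None)}
        active = [-1]                     # breakpoints that can still start a line

        for i in range(n):                # consider breaking after word i
            still_active, candidates = [], []
            for a in active:
                # (TeX keeps running width totals; recomputed here for brevity)
                width = sum(word_widths[a + 1:i + 1]) + space * (i - a - 1)
                if width > line_width + slack:
                    continue              # line would be overfull: prune 'a'
                still_active.append(a)
                badness = (line_width - width) ** 2
                candidates.append((best[a][0] + badness, a))
            if candidates:
                best[i] = min(candidates)
                still_active.append(i)    # breaking after word i is now an option
            active = still_active or active   # never let the list go empty (toy fallback)

        # walk back from the final break (assumes the last word can end a line)
        breaks, b = [], n - 1
        while b is not None and b >= 0:
            breaks.append(b)
            b = best[b][1]
        return list(reversed(breaks))

    print(break_lines([3, 2, 4, 3, 5, 2, 4, 3], line_width=10))   # -> [1, 3, 5, 7]

(The width sums are recomputed above just to keep the sketch short; the real algorithm carries running totals, so each active candidate costs constant work.)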
I'm much more sympathetic to the point that, while TeX's line-breaking algorithm can easily handle paragraphs with a different line length for each line, it needs to know at the start what those line lengths are. It's not clear how to generalize it to handle layouts where the length of the nth line of a paragraph depends on where the earlier (or later!) breaks fall. Think of tall floating figures that impinge on the text area of the paragraph they're in. I'm guessing that was the real impediment to using it in Web-land.
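To make the "known at the start" point concrete, here's the same toy sketch with the per-line widths supplied up front as a table indexed by line number, roughly in the spirit of \parshape (again, just an illustration, not TeX's code). Each entry remembers which line number comes next so the right width can be looked up, but the whole table has to exist before any break is chosen; if the widths themselves depended on where the breaks (or the enclosing page break) land, you'd have exactly the circularity above:

    # Same toy, but with a per-line width table supplied up front,
    # roughly in the spirit of \parshape.

    def break_lines_shaped(word_widths, widths_by_line, space=1.0, slack=0.5):
        n = len(word_widths)
        # break -> (total demerits, previous break, line number of the next line)
        best = {-1: (0.0, None, 0)}
        active = [-1]

        for i in range(n):
            still_active, candidates = [], []
            for a in active:
                line = best[a][2]
                target = widths_by_line[min(line, len(widths_by_line) - 1)]
                width = sum(word_widths[a + 1:i + 1]) + space * (i - a - 1)
                if width > target + slack:
                    continue
                still_active.append(a)
                badness = (target - width) ** 2
                candidates.append((best[a][0] + badness, a, line + 1))
            if candidates:
                best[i] = min(candidates)
                still_active.append(i)
            active = still_active or active

        breaks, b = [], n - 1
        while b is not None and b >= 0:
            breaks.append(b)
            b = best[b][1]
        return list(reversed(breaks))

    # e.g. the first two lines shortened to 7 units (figure alongside), then 10:
    print(break_lines_shaped([3, 2, 4, 3, 5, 2, 4, 3], [7, 7, 10]))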