"HD is lower quality than film." Maybe this is true for home entertainment, but from my perspective we take less advantage of darkness than ever. It's the video-game CGI that kills me.
There was an experiment maybe 15 years ago, where they sent film material through the whole printing and distribution process, and measured the vertical resolution that could still be resolved on an actual cinema screen using analog projection. The result was around 700p IIRC, below full HD in any case.
I interpreted the claim as being about non-ideal conditions (which is fair, frankly: it's well known that picture and sound quality are better at the beginning of a run than at the end, and film quality doesn't matter if your local theater doesn't take good care of its prints).
Plus, I saw Sinners on film this past weekend (quite a fun movie, highly recommend it) and some visual artifacts were very noticeable; they appeared regularly enough that I figured a section of the print had been damaged somehow.
Just a quick update: I got curious, and it seems the resolution should actually be a touch over 10K for Super 35, while regular 35mm seems spot on at about 9.5K.
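For anyone who wants to sanity-check those headline "K" numbers, here's a rough back-of-the-envelope sketch in Python. The frame widths are the standard camera-aperture dimensions; the lp/mm figures are purely illustrative assumptions (real resolving power depends heavily on the stock, exposure, and lenses), so treat the output as a feel for the arithmetic, not a measurement.

    # Convert film resolving power (line pairs per mm) into a horizontal
    # pixel-equivalent ("K") figure. The lp/mm values are illustrative
    # assumptions; real numbers vary a lot with stock and conditions.

    FRAME_WIDTH_MM = {
        "super35_full_aperture": 24.89,  # full camera aperture width
        "academy_35mm": 21.95,           # Academy camera aperture width
    }

    def pixel_equivalent(width_mm, lp_per_mm):
        # One line pair needs two pixels to be resolved (Nyquist).
        return 2 * lp_per_mm * width_mm

    for name, width in FRAME_WIDTH_MM.items():
        for lp in (80, 125, 200):  # conservative / middling / optimistic
            px = pixel_equivalent(width, lp)
            print(f"{name}: {lp} lp/mm -> ~{px:.0f} px wide (~{px/1000:.1f}K)")

Plug in whatever lp/mm figure you trust; the takeaway is just that a ~10K claim for Super 35 implies something like 200 lp/mm on the negative, and the headline number is extremely sensitive to that assumption.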
Watching a 4K digital scan of a master copy on a 4K TV is different from what you would see in a movie theater. The figure of roughly 700-800 lines is the apparent resolution you would actually experience in a real-life movie theater with analog projection.
The point is that even 1080p TV is an improvement in resolution over what you used to see in cinemas with 35 mm prints.