There's a transition where a quality signal becomes easy to fake, during which time humans are "fooled", but then we become sensitized. It's no longer a quality signal, and we subsequently refine our aesthetics toward harder-to-fake signals. For example, wood veneer over particle board, tile-patterned linoleum, etc. In light of that, I wonder what will happen to the concept of beauty?
Not sure where you were going with this: your examples are of fake-detection in real life. If you show a photograph of wood veneer over particle board, and it's a high-quality veneer, then no: no one can tell it's not solid wood unless they're at the shoot and have the opportunity to examine it. Same for linoleum: as long as the photo's lit properly, you can't tell.
And the same is true for what's presented here: as "fake photos", once you hit photorealistic, you're done. There is no "until humans start seeing the pixels".
Maybe once, but not over repeat occurrences. Then you have an arms race between consumers tuning their aesthetic preferences to certain quality signals, producers trying to exploit those preferences without actually delivering quality, consumers re-adjusting their preferences after getting burned, etc.
A closer comparison might be dating profile pics, which, as I understand it, dating "shoppers" quickly learn to distrust, at least for certain angles or types of shots. This AI enhancement stuff would presumably cause some rapid evolution in that particular arms race.
Such an arms race is likely until either the AI or the customers' brains hit their limits. The in-principle limits of a synthetic mind vastly exceed anything biologically possible — it's like comparing wolves to hills, both in size and speed, and silicon has the advantage both ways.
The only thing keeping us safe is that we don’t yet know how our minds work.