I don't think this is how it works. We have a discrete number of rods and cones which work as a well-behaved spatial sampler. Human visual system temporal sampling is smeared stochastically across the rods and cones rather than being clocked. If you truly displayed 1 million fps and there were no shutter artifacts (as there are none in any fixed-pixel displays that we are currently looking at), then the motion would be life-like. The human visual system doesn't take a temporally clocked set of frames and then apply a low-pass filter to them; approximating perceived motion blur that way would look off (as many gamers lament).
Blurbusters has a fair amount of literature on this topic.
This has nothing to do with biology, it's an argument from signal processing, which is well-understood theory (Nyquist's theorem and so on). If an object oscillates at 1 MHz, and you take 1 million still frames per second, it will rest in the same place in every frame, and thus look static. In reality, such an object would look blurred to the human eye.(+) It's this kind of artifact that motion blur (to be more precise, low-pass filtering) can avoid.
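You can check this aliasing numerically. A minimal sketch (my own illustration, frequencies chosen to match the example): sampling a 1 MHz oscillation at exactly 1 million stills per second catches the object at the same phase in every frame, so the samples are all identical and the motion vanishes.

```python
import math

f_signal = 1_000_000  # object oscillates at 1 MHz
fps = 1_000_000       # camera takes 1 million stills per second

# Position of the oscillating object at each frame's instant.
# Every sample is sin(2*pi*n), i.e. the same phase each frame,
# so the object appears frozen in place.
samples = [math.sin(2 * math.pi * f_signal * n / fps) for n in range(10)]
```

Every entry comes out (numerically) zero: the 1 MHz oscillation is invisible at this frame rate, exactly the Nyquist situation described above.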
Edit: The article you linked to is very confused about some basic terminology. It equates the response time artifacts of an LCD monitor that displays sharp, digital images with motion blur. That's so wildly wrong I'm not even sure where to start. Maybe here: when displaying video, motion blur is a property of the source; response time is a property of the output device.
(+) Edit 2: To expand on this, the human visual system integrates arriving photons over time, and this way implicitly behaves a lot like a low-pass filter. A low-pass filter is different from a fixed-rate shutter, which is what people mean when they say the eye doesn't have a fixed framerate. However, there is still a limited temporal resolution.
A more everyday example of this effect would be a dimmed LED. You can pulse an LED at 1 MHz, it will look dimmed, not blinking. But when filming/rendering this LED at 1 million still images per second, it will either be all on or all off, both of which are wrong (i.e., an artifact of your chosen shutter speed).
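The LED case is also easy to work through in code. A small sketch (my own, using integer nanoseconds to keep the phase arithmetic exact): a zero-duration shutter at 1 Mfps lands on the same phase of the 1 MHz pulse train every frame, while integrating over the frame interval, as the eye does, recovers the 50% dimming.

```python
PERIOD_NS = 1000  # 1 MHz pulse -> one period every 1000 ns

def led(t_ns):
    """LED brightness at time t_ns: on for the first half of each period."""
    return 1.0 if (t_ns % PERIOD_NS) < PERIOD_NS // 2 else 0.0

FRAME_NS = 1000  # 1 million fps -> one still every 1000 ns

# Instantaneous stills hit the same phase every frame: all on
# (or all off, with a different phase offset) -- an aliasing artifact.
stills = [led(n * FRAME_NS) for n in range(5)]

# The eye integrates light over time; approximate that with a box
# average over one frame interval -> 0.5, i.e. the LED looks dimmed.
perceived = sum(led(t) for t in range(FRAME_NS)) / FRAME_NS
```

The stills report full brightness in every frame, while the time-averaged (low-pass filtered) value is 0.5, which is what the eye actually perceives.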
Ah but it has everything to do with biology. You are proposing a far too simple model for the signal processing actually at play. Unfortunately there is no clock going to the rods and cones, they simply fire signals off asynchronously and the timebase is reconstructed in your noggin. How would you go about filtering a few million samples all on their own timebases that are themselves not uniformly (or even periodically) sampled? It would be a truly awful approximation.
I know there are lots of funny theories about vision on the internet, but as a simple empirical fact that everybody can verify, dimming an LED works. Whatever idea about vision someone might have, if it doesn’t account for that fact, it’s wrong, disproved by everyday experience. (More generally, photons hitting cones are actually also discrete events, and we don’t see everything flickering all the time. But that’s a more complex argument that depends on a prior understanding of quantum physics. In contrast, the LED example is simple and deterministic.)
However, the complexity of the biology behind this doesn’t actually matter for my argument. My point is that we need proper signal processing in order to not irreparably damage the signal before it reaches the eye, when we’re still in the realm of precise technology. I’m not sure if I explain this poorly or if you’re a tiny bit motivated to misunderstand me, but of course, it is a complex subject.
As a last honest attempt, can we agree on the simplest possible case — that an LED blinking at 1 MHz filmed at 1 million still frames per second can't ever reproduce a signal that can be interpreted as being dimmed? If so, we already agree that we have an artifact. Then the question is only how to remove it, and signal processing theory gives an answer.
If not, I’d recommend that you work through it with pencil and paper: what is the LED doing, what are the still images showing, what is the monitor showing to the eye? You can’t miss the conclusion if you do this carefully.
Blurbusters has a fair amount of literature on this topic.
https://blurbusters.com/blur-busters-law-amazing-journey-to-...