
> While people have an image frame rate of around 15-20 images per second to make moving pictures appear seamless,

This is just...wrong? Human vision is much faster and more sensitive than we give it credit for. E.g., humans can discern PWM frequencies up to many thousands of Hz. https://www.youtube.com/watch?v=Sb_7uN7sfTw




NO YOU ARE!

> make moving pictures appear seamless

True enough.

NTSC is 30fps, while PAL is 25fps.

The overwhelming majority of people were happy enough to spend, what, billions on screens and displays capable of displaying motion pictures in those formats.

That there is evidence that most(?) people are able to sense high-frequency PWM signals doesn't invalidate the claim that 15 to 20 frames per second is sufficient to make moving pictures appear seamless.

I've walked into rooms where the LED lighting looks fine to me, and the person I was with has stopped, said "nope", and turned around and walked out, because to them the PWM-driven LED lighting makes the room look like it's lit by night-club strobe lighting.

That doesn’t invalidate my experience.


> NTSC is 30fps, while PAL is 25fps.

That's not really right. Most NTSC content is either 60 fields per second with independent fields (video camera sourced) or 24 frames per second with 3:2 pulldown (film sourced). It's pretty rare to have content that's actually 30 frames per second broken into even and odd fields. Early video game systems ran essentially 60p @ half the lines; they would put out all even or all odd fields, so there wasn't interlacing.
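To make the 3:2 pulldown concrete, here's a rough Python sketch (my own illustration, with frames as plain letters): alternating 2 and 3 fields per film frame turns every 4 film frames into 10 fields, which is how 24 fps film fills NTSC's ~60 fields per second.

    # 3:2 pulldown: alternate 2 and 3 fields per film frame, so every
    # 4 film frames become 10 fields (24 fps -> ~60 fields/s).
    def pulldown_32(film_frames):
        fields = []
        for i, frame in enumerate(film_frames):
            fields.extend([frame] * (2 if i % 2 == 0 else 3))
        return fields

    print(pulldown_32(["A", "B", "C", "D"]))
    # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']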

If you deinterlace 60i content with a lot of motion to 30p by just combining two adjacent fields, it typically looks awful, because each field is an independent sample. Works fine enough with low motion though.
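For what it's worth, a minimal sketch of that "combine two adjacent fields" approach (weave deinterlacing); representing a field as a list of scanlines is just an assumption for illustration:

    # Weave deinterlace: interleave an even field and the following odd field
    # into one progressive frame. Fine for static content, but the two fields
    # were sampled 1/60 s apart, so moving edges "comb".
    def weave(even_field, odd_field):
        frame = []
        for even_line, odd_line in zip(even_field, odd_field):
            frame.append(even_line)
            frame.append(odd_line)
        return frame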

PAL is similar, although 24 fps films were usually just sped up to 25 fps rather than using pulldown, which avoids the jitter of showing most frames as two fields but a couple of frames each second as three fields.
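(Back-of-the-envelope on my part, not from the broadcast specs: playing 24 fps material at 25 fps is a 25/24 ≈ 4.2% speedup, so a 120-minute film finishes about 4.8 minutes early, and the audio comes out slightly high-pitched unless corrected.)

    # PAL speedup: 24 fps material played at 25 fps runs ~4.2% fast.
    runtime_24 = 120 * 60              # seconds at native 24 fps
    runtime_25 = runtime_24 * 24 / 25  # seconds when played at 25 fps
    print(runtime_24 - runtime_25)     # 288.0 seconds, i.e. ~4.8 minutes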

I think most people find 24 fps film motion acceptable (although classical film projection generally shows each frame two or three times, so it's 48/72 Hz with updates at 24 fps), but a lot of people can tell a difference between 'film look' and 'tv look' at 50/60 fields (or frames) per second.


Any idea why movies are still mostly at 24 FPS? Is it just because people became used to it?

Most (or at least many) people visually recognize 24 fps content as film and higher frame rate content as TV/video.

Filmmakers generally like their films to look like film, and high-frame-rate films are rare and get mixed reviews.

Some TV shows are recorded and presented in 24 fps to appear more cinematic (Stargate: SG1 is an example).


That association seems to be an unfortunate equilibrium because higher frame rates seem to be "objectively" better, similar to higher resolution and color. (Someone without prior experience with TV/movies would presumably always prefer a version with higher frame rate.)

In general, yes. Low frame rates can be used deliberately to make something feel more dreamlike, but that is something that should only be used in very specific cases.

Pretty much all dramatic American TV shows were shot on film (at 24 fps) before the digital camera era. It's why so many old shows (e.g. Star Trek TNG) are now available as HD remasters: they simply go back and rescan the film.

It's more complicated in other countries (the BBC liked to shoot on video a lot) but it was standard practice in the States.


It took far more than simply rescanning the film to get the TNG remasters, as all the visual effects were only rendered and composited at broadcast resolutions (and frame rates). They had to essentially recreate all of that, which is why we haven't gotten the same remasters for the less popular Deep Space Nine and Voyager series.

From what I have seen, most series of that era were edited in NTSC after converting the original film material.


I think familiarity is a major factor, but the lower frame rate and slower shutter speed also create motion blur, which makes it easier to make the film look realistic since the details get blurred away. I remember when The Hobbit came out at 48 fps and people were complaining about how the increased clarity made it look obviously fake, like watching a filmed play instead of a movie.

> I remember when The Hobbit came out at 48 fps and people were complaining about how the increased clarity made it look obviously fake, like watching a filmed play instead of a movie.

Curiously, I can already get into this mindset with 24 fps videos and much, much prefer the clarity of motion 48 fps offers. All the complaining annoyed me, honestly. It reminds me of people complaining about "not being able to see things in dark scenes", which completely hampers the filmmaker's ability to exploit high dynamic range.

Tbf, in both cases the consumer hardware can play a role in making this look bad.


I went out of my way to see the Hobbit in 24 and 48 fps when it came out, and weirdly liked 48 better. It was strange to behold, but felt like the sort of thing that would be worth getting used to. What I didn't like was the color grading. They didn't have enough time to get all the new Red tech right, that's for sure.

Yeah, that's pretty much it. They standardized on 24 back when sound on film took over Hollywood, and we now have a century of film shot at that speed. It's what "the movies" look like. There have been a few attempts to introduce higher frame rates, like Peter Jackson's The Hobbit and James Cameron's Avatar, both at 48 fps, but audiences by and large don't seem to like the higher frame rates. It doesn’t help that we have nearly a century of NTSC TV at ~60 fps[1], and our cultural memory equates these frame rates with live tv or the "soaps," not the prestige of movies.

[1] Technically 29.97 fps (30000/1001), but the interlacing gives 59.94 fields per second.


I haven't seen a single person complain about Avatar. I wonder if the issue with The Hobbit wasn't the 48 fps at all but rather something more akin to when we shifted to HD and makeup/costume artists had to be more careful.

Because movies (in film form) are projected an entire frame at a time instead of scanned a line (well, actually a dot moving in a line) at a time onto the screen. I read somewhere (but no longer have the link) that when projecting the entire frame at once, as film projectors do, lower frame rates are not as noticeable. I do not know if modern digital projectors continue to project "whole frames at once" on screen.

Movies are not projected using the scan-and-hold approach used by typical computer displays. They have a rotating shutter which blinks each frame at you multiple times. This both hides the advance to the next frame and greatly increases motion clarity despite the low frame rate.
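Roughly, in code (an idealized sketch of my own, assuming a 2- or 3-blade shutter; real projectors vary): each 24 fps frame is held for 1/24 s and flashed 2 or 3 times, so the flicker rate is 48 or 72 Hz even though motion only updates 24 times a second.

    # Flash timestamps for one second of 24 fps film behind an n-blade shutter.
    FRAME_RATE = 24
    def flash_times(blades):
        times = []
        for frame in range(FRAME_RATE):
            start = frame / FRAME_RATE
            for blade in range(blades):
                times.append(start + blade / (FRAME_RATE * blades))
        return times

    print(len(flash_times(2)), len(flash_times(3)))  # 48 72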

But blinking a frame multiple times rather than once creates a double (or triple, etc.) image effect. To get optimal motion clarity that holds up under smooth-pursuit eye movement without double images, one would need to flash each frame once, as briefly as possible. But that's not feasible at 24 FPS because it would lead to intense flickering. It would be possible at higher frame rates, though.

Even a person with below-average eyesight can easily see 24 fps judder in fast-moving pans. Vision is not nearly as simple as people want it to be.

Badly phrased but not wrong: this is the minimum frame rate for humans to perceive motion as opposed to a slide show of images.

The maximum frame rate we can perceive is much higher; for regular video it's probably somewhere around 400-800.


Maximum depends on what it is you are seeing. If it’s a white screen with a single frame of black, you can see that at incredibly high frame rates. But if you took a 400fps and a 450fps video, I don’t think you would be able to pick which is which.

The discussion on flicker fusion frequency (FFF) and human vs. canine perception is fascinating. When building systems that synchronize with human physiology, like the metabolic digital twins I'm currently developing, we often find that 'perceived' seamlessness is highly variable based on cognitive load and environmental light.

While 24-30fps might suffice for basic motion, the biological impact of refresh rates on eye strain (especially for neurodivergent users) is a real engineering challenge. This is why I've been pushing for WCAG 2.1 AAA standards in my latest project; it’s not just about 'seeing' the image, but about minimizing the neurological stress of the interaction itself.



