Hacker News

> The filmmaker said that when he begins shooting the "Avatar" sequel in about 18 months, he will be shooting at a higher frame rate, though he has yet to decide if that will be 48 fps or 60 fps. He said George Lucas was "gung-ho" to make the conversion, and also called Peter Jackson one of his allies. Jackson, he said, had at one point been heavily weighing shooting "The Hobbit" at 48 fps.

Awesome!

It's finally happening. The 24 fps speed is a relic from the stone age of cinema, back when technology simply could not cope with higher frame rates. So all they could do was shoot at 24.

Then something odd happened. Everyone was shooting at 24, including the grand masters of the art. All the procedures, techniques, all the clever things they devised, took that frame rate as a given. They were not simply shooting AT 24 fps, they were shooting FOR 24 fps.

Thus the "film look", which is the way a motion picture looks if it's shot at 24 fps using the standard cinema techniques. From a technical limitation, it became a cultural given. Because Bunuel and Eisenstein and Bergman all shot at 24 fps, everyone simply assumed this is how cinema is supposed to look like. Higher speed video ended up being seen as "cheaper looking, made for TV".

But it's exclusively a habit, a purely conventional thing, imposed by culture.

I remember thinking a while ago that what it would take to break the mold is a few big names in cinema purposefully shooting at higher frame rates, while at the same time figuring out new ways to express the same things at the new speed. Some geniuses need to reinvent the art to break free from the old stereotype.

Cameron filming at 48 or even (please, please, please) at 60 fps cannot do the same things that Cecil B. DeMille did at 24. Or, rather, he is free to not do the same things, because the lower frame rate is limiting in many ways.

And now finally it happens. This is great. I hope 24 fps finds its true place - in the history books. It's time to move on.



I'm not a cinematographer, but having gone to USC I met and talked with some (both aspiring cinematographers and those who came back to give guest lectures). There was a digital projection system back in the '80s with 60 fps video; the output was much more 'real' in the sense that looking at the screen seemed more like looking at real life, and it jarred folks. One of the directors giving a lecture on the future of film felt it was the future, but acknowledged the following counterpoint.

24 fps celluloid, with its known gamut limitations and temporal resolution, creates an ambiance for the film. This is not unlike the way that canvas and oils create a certain ambiance about an oil painting. Now, is the future of 'art' digital printers that can reproduce what an artist might have painted in oils, something they can paint using a Wacom tablet and Illustrator? No, not really. The medium is part of the art; it's part of the palette.

Do digital films at high FPS have a place? Of course they do (art is pretty liberal in what it will accept as a medium :-) but will 24fps celluloid be consigned to the history books? Answering that question is less obvious.

The change from hand cranked to mechanically operated shutters was a clear win for cinematographers because it took some randomness out of what the audience would see, talking pictures did the same. The change from celluloid to video is however a medium change not a technique change.

I believe the comment at the particular talk I attended was "If Leonardo could have taken a picture of Mona Lisa instead of painting her, would it be in the Louvre today?" Can't easily answer that.


This sounds like an argument similar to the one lamenting the advent of digital photography because the grain of the film no longer provided the same effect in photos. But although the grain effect is available for digital photos, it is seldom seen.


Your claim "But although the grain effect is available for digital photos, it is seldom seen." is a fallacy. I'm sure if someone wanted a grain effect in their photograph they would be more likely to actually use film rather than use a digital simulation of what film would have looked like had they chosen to do that. I don't see wide spread use of the 'water color' effect either but people still make water color pictures. It is just that they start out with water colors.

I was rebutting the claim that shooting film on celluloid would 'go away' because there were higher frame rate technologies available. I don't believe that conclusion is supported by either current or past experience in the art world. It was not a 'lament'; it merely challenged the reasoning that 24 fps celluloid movies would be replaced in their entirety by higher frame rate ones.


> Your claim "But although the grain effect is available for digital photos, it is seldom seen." is a fallacy.

Yet it's factually true.

> I was rebutting the claim that shooting film on celluloid would 'go away'

Perhaps I did not choose the best words.

It will not disappear from the face of the Earth, of course. But it will be relegated to "retro" or nostalgic movies, etc.

It depends on how quickly the new, less limited technologies are adopted by the big names in cinema, and of course by how quickly the old generation, accustomed to the old way a film "is supposed to look", will retrain their taste or die off. :)


a grain effect added to a digital photo is a kind of affectation, an artifice that is not intimately connected to the process of making the image, included as a stylistic decision; possibly an afterthought. it seems dishonest (not necessarily wrong) but it is part of a panoply of effects that can be applied that constitute what's possible with digital. there's no need to 'lament' since each medium has its own panoply of possibilities: photochemical emulsion gets to be 'honest' about grain, however, and digital does not, since with emulsion grain is not an 'effect': it is what emulsion, in fact, consists of. the same cannot be said about digital.

there are analogous issues between film and video, and even between older video formats and digital video. the future of movies no doubt includes a choice of frame rates, frame resolutions, interlace vs. progressive, and aspect ratios as creative choices. they will be part of the panoply of possibilities that artists will be able to use.

n.b.: in the silent era, there was no real standard film speed, and film was often exhibited (in the days of hand-crank projectors) at varying speeds to suit the content of the image and the music that was invariably played live alongside it. for example, romantic scenes might be cranked a bit more slowly; comedy scenes could be cranked a bit faster. it was only when the industry started to be professionalized that silent films were shot at 16 or 18 fps. with the sound era, the fps was upped to 24, partly to improve the fidelity of the optical sound. most projectors have for decades now had multi-leaf shutters which interrupt the light path as many as 3 times per frame to eliminate the perception of flicker.
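the multi-leaf shutter trick is simple arithmetic: perceived flicker rate is the frame rate times the number of interruptions per frame. a rough sketch (the exact flicker-fusion threshold varies with screen brightness, so the 72 Hz figure is illustrative, not a hard limit):

```python
# Effective flicker rate of a projector: frames per second times
# the number of shutter interruptions per frame. A 3-blade shutter
# raises 24 fps flicker to 72 Hz, which most viewers no longer
# perceive as flicker at typical cinema brightness.

def flicker_hz(fps, blades):
    return fps * blades

print(flicker_hz(24, 1))  # 24 Hz -- visibly flickery
print(flicker_hz(24, 3))  # 72 Hz
```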


...interlace vs. progressive...

Please, no. Interlacing doesn't have the same nostalgic effect as a 24fps frame rate, and only serves to lower video bandwidth for devices that can't handle a progressive image at the desired frame rate. It's a tradeoff between vertical and temporal resolution that works well for limited technology because it preserves the sharpness of still images. Basically everything except broadcast TV can handle 1080p60, though, so let's let the hack die.
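The bandwidth tradeoff is easy to quantify for raw pixels; a minimal sketch, ignoring compression (which changes real broadcast numbers considerably):

```python
# Rough raw-pixel-rate comparison of 1080p60 vs 1080i60.
# Interlacing sends 60 fields/s, each carrying half the scan lines,
# so it halves the uncompressed pixel rate at the cost of vertical
# resolution during motion.

WIDTH, HEIGHT = 1920, 1080

def pixel_rate(passes_per_sec, lines_per_pass):
    """Pixels scanned per second: width * lines * passes."""
    return WIDTH * lines_per_pass * passes_per_sec

progressive = pixel_rate(60, HEIGHT)       # 60 full frames/s
interlaced = pixel_rate(60, HEIGHT // 2)   # 60 half-height fields/s

print(progressive)  # 124,416,000 pixels/s
print(interlaced)   #  62,208,000 pixels/s -- half the raw bandwidth
```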


generally speaking, i'm no fan of interlace either, but my argument still stands that it could be used expressively -- in other words, to enhance the experience of the content for appropriate material. as an example, it might be a good choice if you were making a program and you wanted to exploit the impression that the viewer was watching a tv broadcast from a few decades ago.


I definitely agree that no option should be removed from the palette available to artists, just as long as the final delivery medium doesn't require deinterlacing.


There's simulated and real. I see scanlines all the time for CRTs appearing in cartoons, which has nothing to do with the actual format of the video playing.


Instagram is showing that people do like low-fi effects.

But I agree with you -- this is a mere affectation, a simple way of turning common images into something evocative. And this becomes much easier to achieve when starting with a high resolution image.


That is an argument of professional taste. This is different. Many non-professionals unconsciously associate the more realistic quality of video with TV movies, soap operas, and game shows. (e.g. 60 fps video is a sign of lower-quality content.)

The same was not true for still photography. Whether the negative association with video will go away or remain is certainly an open question.


I think any picture Leonardo would have taken would be a significant cultural artifact, as would any telephone he built, or any computer he made.


Now we have higher framerate digital and post-production teal and orange effects.


Isn't everything about film and film-watching "exclusively a habit, a purely conventional thing, imposed by culture"? Or are there aspects that are objective? (Would I even watch a James Cameron film if it weren't for cultural norms encouraging me to?)


A postmodernist would reply that everything is purely conventional. Others may disagree. But that's an entirely different can of worms.


I don't see how it's any different, I suppose. The aesthetics of the 24-fps film are aesthetics, just like basically any other aspect of film, ranging from acting styles to types of cuts. I happen to like the aesthetics of 24-fps film for certain kinds of scenes, and actually prefer 14-fps and 18-fps even more, again for certain kinds of scenes. You seem to be arguing that I don't "really" like the aesthetics, but am being misled by my associations. But I think I do really like the aesthetics!

Mostly, I don't see why my film fps preference is more culturally constructed, or constructed in some more objectionable way, than basically every single other thing having to do with film, which is nothing but an aesthetic and cultural phenomenon to begin with.

I mean, we might inquire into the habits and conventions that lead you to prefer 60-fps film, just as much as we look into those that lead me to prefer 24-fps film! Perhaps you have an aesthetic preference that films "should" be more realistic, and less stylized? Or a technological-aesthetic ideology that films should use "better" technology because it must necessarily be better aesthetically?


Wow, that's a lot of sophistry.

There are objective differences. 24 fps contains less information than 60 fps. Its temporal resolution is 2.5x worse.

Believe me, I can clearly see my own cultural bias. It keeps telling me 24 fps somehow does look okay. But we are not complete automata, we can distinguish between instinctive reactions (acquired tastes) and reality.

And the reality is, the higher the frame rate, the more information is recorded. An intentionally crippled low-framerate movie may serve certain esthetic norms, but that's pretty much all it does.

As far as 24 fps cinema is concerned, thanks to a century of conditioning, we're all Pavlovian dogs.
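The 2.5x figure is just frame-rate arithmetic; a quick sketch, using a hypothetical two-hour running time:

```python
# Temporal samples captured at 24 fps vs 60 fps over the same duration.
seconds = 2 * 60 * 60          # a two-hour feature
frames_24 = 24 * seconds       # 172,800 frames
frames_60 = 60 * seconds       # 432,000 frames
print(frames_60 / frames_24)   # 2.5 -- the ratio is duration-independent
```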


I guess I'm missing how information recorded is even relevant. If recording footage of a scientific phenomenon for later study, sure. But art? What principle says that more temporal resolution equals better aesthetics? This seems like some sort of cultural bias, assuming higher fidelity = better aesthetics, which I don't think is true.

I tend to prefer more stylized films overall, not just in temporal resolution either. If you prefer more realistic ones that seem like you're really watching a scene that's happening, that's a legitimate preference, but it's not mine. I'm not sure the 24-fps (or better, 18-fps) effect is the most important or best anti-realism aesthetic effect in a film-maker's toolbox, but it's one of them. It's probably, as you say, an accident of history that it's as widely used as it is, but it's in my view a positive accident of history. Modern Hollywood is far too realistic in its shooting style imo, and this is one of the few fortunate areas where it hasn't gone all the way off that cliff.


I think you're missing GP's point. The look of black-and-white film creates a different feel than color film. Clearly, there's a lot of "information" missing from a black-and-white film. But filmmakers still use it when they want that feel. Same with the way CSI colors everything blue. Sure, it loses a lot of color info and distorts the gamut etc., but it has a particular feel because of that.

So, sometimes (maybe even most of the time) you want a smoother picture with more motion information. But sometimes you want something that feels choppier. Example: http://revision3.com/filmriot/warfilm There's a short film at the beginning that demonstrates the techniques they're talking about, with commentary starting about 5:00 in. Specifically, in this film, they used a high shutter speed at 24 fps to create a choppier look. If they used 60 fps, motion would be smoothed out, so even with a high shutter speed the violent feel created by missing information wouldn't be the same.
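The shutter-speed effect described above comes down to per-frame exposure time. A minimal sketch using the standard shutter-angle formula (the specific angles below are illustrative, not taken from the linked film):

```python
# Per-frame exposure time from frame rate and shutter angle:
# exposure = (angle / 360) / fps.
# A 180-degree shutter at 24 fps exposes each frame for 1/48 s.
# A narrower shutter shortens the exposure, reducing motion blur
# in each frame while 24 fps keeps the motion sampling sparse --
# hence the crisp, choppy "war film" look.

def exposure_time(fps, shutter_angle_deg):
    return (shutter_angle_deg / 360.0) / fps

print(exposure_time(24, 180))  # 1/48 s
print(exposure_time(24, 45))   # 1/192 s -- crisper, choppier at 24 fps
print(exposure_time(60, 180))  # 1/120 s -- less blur AND denser sampling
```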


more != better


This sounds a lot like "Now that we have color film, it's time for B&W to find its true place - in the history books. It's time to move on."

Which is not to say that 24p won't get largely eclipsed. I suspect it will. But not because it's universally 'better'- just more appropriate to a growing number of things.




