A big part of the difference is the timelines and scale.
When you ship a piece of software, it's often expected to be usable by a million people reliably for years.
In film and video production, you're duct taping shit together to get it to stay in one piece just long enough to get the shot and get the film out the door. You're fixing shit in post because you were in a hurry on set. It's a sort of barely controlled chaos.
> When you ship a piece of software, it's often expected to be usable by a million people reliably for years.
Is this really true anymore? I feel like people release software now expecting to continue to patch it repeatedly, so there isn't a push to get it perfect the first time.
You're right. It depends on the industry and the rigor involved. Safety- and mission-critical software certainly needs to last for years, whereas in other cases people have become accustomed to accepting updates.
The shot, once made, is forever unless you do an expensive reshoot. Software bugs happen, and people have low expectations for bugs even in software operating at massive scale, like YouTube.
Aside from what others have pointed out about fixability in post: People have pretty low expectations for most shots in most films. Every film has many mistakes in it that film buffs have fun cataloguing and no one else ever notices. Part of the art of filmmaking is knowing what must be fixed and what can be ignored, which is awfully similar to the job of a product manager.
One thing about Andor is that Gilroy specifically encouraged the directors not to shoot a lot of coverage and focus instead on very intentional shot lists, which both sped up shooting (critical given the cost of the production, COVID requirements, and the Hollywood strikes) and resulted in a much more crafted look, though it required a lot of detailed up-front planning. It’s really quite striking how different that approach is from the usual way things are shot, edited, and cleaned these days.
When someone is fixing it in post, most of the complexity of the initial shot is gone, never to be seen again. It's not like they fix some colors in one scene and then the last scene of the movie suddenly changes into something different :D
Have you seen the number of changes George Lucas made when editing the prequel trilogy of Star Wars? They were digitally compositing individual actor performances within a single shot.
Isn't Lucas known for being an outlier in the industry in his devotion to post-production editing? I honestly don't know. But I am unsure that other production teams are capable of pulling this off and shipping on schedule with the same aptitude.
I've also heard this practice blamed for enabling a decade of "bad CGI": visuals are reworked and reworked until time runs out, and in the rush the team ships visuals that look worse than those in movies from 20 years ago. To be explicit: it's not the artists' fault, it's a production pipeline failure. And indeed, it's easy to find Marvel movies whose CGI looks closer to that of a video game than to Lucas's prequels. Heck, even the sequel movies fall into this category.
Wonderful documentary, highly recommended watching.
Though, nitpick -- when people say "fix it in post", they're not usually talking about editing.
Editing is an expected, normal, traditional part of the art of filmmaking. You're not changing or fixing the shots themselves; you are cutting and pasting footage sequentially to tell a coherent and compelling story. The YT you linked is a masterful example of how Star Wars was saved through this sort of editing.
"Fix it in post" refers more to making sloppy shots full of errors and then correcting those errors later with CGI.
Does that make sense? The art of stitching shots together, versus shooting crappy shots and CGIing them later.
But that’s where “we’ll get it post” comes in. The shot is just the starting point.
That said, directors shoot very differently - some will do literally hundreds of takes and still do plenty of post, others will shoot at most three takes and send it straight to edit.
Yea, a movie is basically the most expensive demo imaginable in software engineering terms. It's a feat you only need to pull off a single time.
Curiously I think this shares a lot with other types of engineering. If you're putting men on the moon, you have to get everything right a single time.
Game dev sits in that weird middle ground where you have the unpredictability of a player interacting with the system and the high production demands of film. You can't just duct-tape a level together and hope no one notices.
> When you ship a piece of software, it's often expected to be usable by a million people reliably for years.
> In film and video production, you're duct taping shit together to get it to stay in one piece just long enough to get the shot and get the film out the door. You're fixing shit in post because you were in a hurry on set. It's a sort of barely controlled chaos.
Game development is somewhere between the two.