What I don't understand is how film crews can work together when they are larger than two-pizza teams? And when they want to change something, it's almost like they just do it? Surely they have to file a ticket with the Product Owner first? And why don't they wait until the current sprint is done before doing things that clearly belong to the next one? Why does the producer run around speaking in precise terms when he is clearly in the position of Business Owner and should stick to user stories? It's a wonder that the result is even watchable!
Sarcasm aside, there is something to be said about industries that let professionals do their work, with everyone doing their bit towards a clearly defined shared goal. Considering the IT industry has taken so many ideas from industrial production, it wouldn't hurt to take some from artistic production too. After all, both are concerned with refining blueprints where the final draft ends up being the product.
All the things you're describing -- in the spirit of tickets, sprints, etc. -- do happen. They're called pre-production. It takes years (months if you're lucky) to set up how everything will run on set. Producers have a huge list of actionables (tickets), and there is constant iteration (sprints) on parts of the script, on what the film's visual look will be, on the tone, the budget, the crew, etc. And there are huge differences in responsibility between producer and director. A producer doesn't "run around speaking in precise terms" when that would step on the toes of the director, the cinematographer, etc. That would be micromanaging and unprofessional. The producer does very much stick to "user stories". When film crews want to change something, they don't "just do it". They very much do check in with the director or showrunner.
I suspect you're talking about execution, where everyone does "just do" things. When filming, every minute counts and shit needs to get done. Yes, every single person is tremendously empowered to do what's right, within their remit. But that only works because pre-production already worked out most of the kinks, and they should all basically be on the same page. But even then, things constantly go south. Shots take hours to set up and then turn out to be wrong for an infinite number of reasons. There are endless compromises. And during that process, only one person is in charge -- the director -- because they have to make a ton of decisions to compensate for all the things going wrong. So it's teamwork... but it's also a dictatorship, and once the director makes a decision after collecting the input they want, you do not argue.
You seem to be under the impression that film production is somehow more individually empowering or trusting than software development. It's not.
> Kirill: How much time did you have in pre-production to talk about ideas, visuals and inspirations?
> Christophe: We had a lot of time, and it’s a rare thing. The director Ariel Kleiman and I went through the same process for each episode. We were reading the scripts together, and throwing ideas and brainstorming. We did that twice for each episode, and then we started making moodboards. After that we did another read through, and then we started blocking the scenes. We had a lot of 3D pre-viz with ILM, with our camera and lenses in those virtual sets. That allowed us to start looking for shots and to refine everything.
> You seem to be under the impression that film production is somehow more individually empowering or trusting than software development. It's not.
It's sad that tens of thousands of kids attend film school, yet there are too few roles offering autonomy for them.
The "Hollywood" system only makes a few thousand film and tv productions of scale per year. There are way more people with visions and ideas and dreams, and they're all left to wither on the vine.
How many Chris Nolans, Stanley Kubricks, and Ridley Scotts have we lost to the rat race?
I think this is the biggest potential for AI. Suddenly all of those directors and dreamers who couldn't { hack, struggle, nepo } their way to the top of the pyramid can pursue their vision.
YouTube and TikTok have been huge enablers of creativity. They're a much fairer and wider platform for enablement and distribution, and already today's youth are setting this target as their new generational dream.
We're likely about to see a film industry that resembles the gaming, publishing, and music industries. Studios will exist for large-scale "AAA" fare, but individual auteurs and small teams will be able to make their mark. Steam Greenlight, Bandcamp, Wattpad, and Medium for the director. It's like what the DAW did for music production - no more need to spend tens of thousands to book a studio - except with even more orders of magnitude of cost reduction.
We've needed studios for two things historically: (1) distribution (2) financing. YouTube and streaming solved #1, and Gen AI puts film [1] squarely within the "ramen budget" of college students. So pillar #2 is about to fall.
[1] I don't mean low budget films. Gen AI will give directors the VFX to achieve expansive science fiction and fantasy visions, exotic locales, and a stunningly beautiful cast (that most audiences prefer to watch).
Also, unlike software engineering, the film industry has been around for 100+ years. They've figured stuff out.
(We could say software engineering has been around for 50+ years, but I think that in its modern form it's probably more like 20 years. Highly subjective statement, I know.)
A big part of the difference is the timelines and scale.
When you ship a piece of software, it's often expected to be usable by a million people reliably for years.
In film and video production, you're duct taping shit together to get it to stay in one piece just long enough to get the shot and get the film out the door. You're fixing shit in post because you were in a hurry on set. It's a sort of barely controlled chaos.
> When you ship a piece of software, it's often expected to be usable by a million people reliably for years.
Is this really true anymore? I feel like people release software now expecting to continue to patch it repeatedly, so there isn't a push to get it perfect the first time.
You are right. It certainly depends on the industry and the rigor involved. Safety- and mission-critical software certainly needs to last for years, whereas people have become accustomed to accepting updates in other cases.
The shot, once made, is forever unless you do an expensive reshoot. Software bugs happen, and people have low expectations for bugs even in highly scaled software like YouTube.
Aside from what others have pointed out about fixability in post: People have pretty low expectations for most shots in most films. Every film has many mistakes in it that film buffs have fun cataloguing and no one else ever notices. Part of the art of filmmaking is knowing what must be fixed and what can be ignored, which is awfully similar to the job of a product manager.
One thing about Andor is that Gilroy specifically encouraged the directors not to shoot a lot of coverage and to focus instead on very intentional shot lists, which both sped up shooting (critical given the cost of the production, COVID requirements, and the Hollywood strikes) and resulted in a much more crafted look, though it required a lot of detailed up-front planning. It’s really quite striking how different that approach is from the usual way things are shot, edited, and cleaned up these days.
When someone fixes something in post-production, most of the complexity of the initial shot is gone, never to be seen again. It's not like they fix some colors in one scene and then the last scene of the movie suddenly changes to something different :D
Have you seen the amount of changes George Lucas made when editing the prequel trilogy of Star Wars? They were digitally compositing individual actor performances within a shot.
Isn't Lucas known for being an outlier in the industry in his devotion to post-production editing? I honestly don't know. But I am unsure that other production teams are capable of pulling this off and shipping on schedule with the same aptitude.
I've also heard this practice blamed for enabling a decade of "bad CGI": visuals are reworked and reworked until time runs out, and in a rush they ship visuals that are worse than in movies 20 years ago. Explicitly: it's not the artists' fault, it's a production pipeline failure. And indeed, it's easy to find Marvel movies whose CGI looks closer to that of a video game than Lucas's prequels. Heck, even the sequel movies fall into this category.
Wonderful documentary, highly recommend watching it.
Though, nitpick -- when people say "fix it in post", they're not usually talking about editing.
Editing is an expected, normal, traditional part of the art of filmmaking. You're not changing or fixing the shots themselves; you are cutting and pasting footage sequentially to tell a coherent and compelling story. The YT you linked is a masterful example of how Star Wars was saved through this sort of editing.
"Fix it in post" refers more to making sloppy shots full of errors and then correcting those errors later with CGI.
Does that make sense? The art of stitching shots together, versus shooting crappy shots and CGIing them later.
But that’s where “we’ll get it post” comes in. The shot is just the starting point.
That said, directors shoot very differently - some will do literally hundreds of takes and still do plenty of post, others will shoot a max of three takes and send them to edit.
Yeah, a movie is basically the most expensive demo imaginable in software engineering terms. It's a feat that you only need to get right a single time.
Curiously, I think this shares a lot with other types of engineering. If you're putting men on the moon, you have to get everything right a single time.
Game dev sits in that weird middle ground where you have the unpredictability of a player interacting with the system and the high production demands of film. You can't just duct-tape a level together and hope no one notices.
Film sets can seem enigmatic: the pacing, the language, the decorum. Film has over a hundred years of cultural development that manifests on set as set etiquette, combined with mature unions that actively and heavily defend their trades. It can seem ridiculous and wasteful to the uninitiated… and indie productions split off constantly to try and reinvent the wheel, only for individuals later in their careers to converge back to established industry practices.

In software there seems to be an overwhelming and toxic opinion that tech can solve all problems and that disruption at all costs is good. Without dismissing these opinions entirely, the wake of their destruction is not to be ignored. In film the human element is not only never ignored, it is the sole reason for being. As a creative endeavor whose output ideally is art, the working relationships, delegation of duties, and decision-making power are well established and enforced in the interest of efficient collaboration.

Software, on the other hand, seems to be fully staffed with individuals who rarely get past tier four of Maslow's hierarchy, are entirely individualistic, highly competitive to a fault, and locked in a never-ending combative relationship with management that seems highly antithetical to the act of creation. For lack of deeper insight I chalk it up to different financial incentives in the respective industries writ large. One is making cultural, artistic, or purely entertainment artifacts for humans; the other is arguably creating solutions, maybe for humans, maybe not, with the only goal of always more money no matter what.
I imagine this greatly idealizes the film industry. I mean, Tony Gilroy did have to “file a ticket with the product owner” to put the word “fuck” in one of the episodes, and was denied. If you have a lot of cred you can get final cut and creative freedom, but I imagine most film productions are as bad if not worse than your average scrum experience.
Not least: if you screw up a release in your software engineering career, you’ll probably get many chances to correct it and have a fine if not better career later on. Fuck up a release almost any time in your film career and you may never work again.
My guess is that adding "fuck" would've changed the show's potential returns and content ratings, which is a pretty big change when projecting revenue, ad sales, etc.
Rather than "hey I just wanted to add one word and they pushed back"
I've never watched Andor but I can't imagine any Star Wars content with profanity. There's nothing wrong with profanity. Many of my favorite movies and TV shows have it in spades. I just don't see it fitting in with Star Wars.
It's incredible: everyone has pride and position, and everyone is unique in stature and glue. No one person is unnecessary. All rely on each other to succeed. It's more army than office.
Another thing is that the jobs are very defined and regimented - not that the person doing A1 couldn't do a bunch of the other things, but he knows exactly what he is doing and stays out of what others may be doing.
Software projects don't usually have the luxury of a violent but intuitive dictator running the thing. Obviously a mega budget tv show is going to be run from afar most of the time.
Software can be like this, e.g. if you were building a small team to do something you cared about, it would probably be more like a 3-star kitchen than a support team inside IBM or something.
It's much, much more collaborative than that. The role of a producer can be limited to managing the financial side of the production. Or they can be involved with coordinating the production itself. The director has the most say over what is shot, how it's edited, what the score should be like. A good producer can help make a film excellent, but if you have a bad director, or the "wrong" director, then nothing the producer does will fix that issue.
Of course there are also executive producers who are far, far up the financial food chain of a movie, and sometimes these titles are just vanity plates for the production.
As an example of the value of a good/great producer, look at Gary Kurtz. He was essential for both Star Wars and The Empire Strikes Back, but wasn't part of the team for Return of the Jedi. In my opinion, that's one of the reasons that ROTJ is the worst of the three films. Lucas didn't have someone pushing back against his worse tendencies.
For an example of the wrong director, look no further than Rogue One. While it’s very difficult to suss out what was wrong with Gareth Edwards's direction, Tony Gilroy was brought in to fix the film. Both are appropriately quiet about what the changes were and what was re-shot (simple professional courtesy), but the result is a film that (while I love it) is just slightly off.
You’re talking about something different. Certainly the director is the creative force. The sine qua non... (Or... cine qua non??)
But in the context of this discussion, the producer is the one that coordinates and brings together the huge complexity of the project. That is their defined role. It is the director’s vision and decision-making, but the producer ensures the execution across hundreds of people and many departments.
The creative energy that emerges when something planned for days/weeks/months suddenly can't be done, but the expense of the equipment, cast/crew, and location is going to be owed anyway, is one of my favorite things about working in production. Only the largest of box-office-budget productions can actually shut down, but even they have the "behind schedule, over budget" issues too. Everyone works within their teams to solve the issue in ways that someone not on set would probably never know about, but which end up "saving the day" or whatever. There's a lot of hurry-up-and-wait on set, but sometimes those hurry-up moments are a lot of fun.
You joke, but if you're filming in a part of the world with unions, there is a whole other ball of bureaucracy and silliness that gets in the way of just getting the shot, which makes tickets, stories, and standups seem like smooth sailing.
Software development really has gone off the deep end. In any other field, people actually document what they do, and verify that it works, before releasing anything. Restaurants have recipes and food handling requirements, manufacturers have tolerances and verification, architects have building codes, warehouses have inventory management, and so on and so forth. Because non-software development relies on products that actually work, and is not built around the meta-game of abusing arbitrary metrics, workers can rely on other departments to make sensible choices.
Fun fact: waterfall development never existed; it's a straw-man argument against the common-sense idea of finishing what you're working on before starting something new.
AI is going to pop the bubble of software development, not because it's good at it, but because the entire field is too broken to compete against it.
Found the Amazonian. It’s amazing how the corporate jargon seeps in, no matter how hard you try :). In some ways, deciding to work there is definitely a one-way door.
> when they want to change something, it's almost like they just do it?
Film productions at this level have a lot of process. There are many teams, and the director or showrunner acts as “product owner”. The crew is very hierarchical and specialized, with clearly defined turfs. Pre-production is iterative, but once shooting starts it is waterfall - it is incredibly expensive to make small changes after shooting has started. Digital has made it somewhat more iterative, since you can “fix it in post”, but this still has its limits. You can remove a beard and add a spaceship, but you cannot fix bad acting or bad lines.
Film production is definitely not just letting the creatives run wild. Any idea has a price tag attached and will ultimately have to be approved by the accounting team. And film productions tend to have a hard deadline and no possibility of patch updates.
Software development is a lot easier since you can keep developing iteratively.
A regular pizza of that size has about 2,000 kcal of energy, and an average male needs about 2,500 kcal a day (less for women), so unless you only have a single meal a day, which is unlikely, it must be enough.
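Back-of-the-envelope with those figures - the three-meals-a-day split and the snippet below are my own illustration, not anything official:

    # Rough "two-pizza team" sizing from the numbers above
    # (2,000 kcal per pizza, 2,500 kcal per day; all assumptions).
    KCAL_PER_PIZZA = 2000
    KCAL_PER_DAY = 2500
    MEALS_PER_DAY = 3  # assume the pizzas cover one meal, not a whole day

    kcal_per_meal = KCAL_PER_DAY / MEALS_PER_DAY      # ~833 kcal
    people_fed = 2 * KCAL_PER_PIZZA / kcal_per_meal   # ~4.8 people

    print(f"Two pizzas cover one meal for about {people_fed:.1f} people")

Which lands right around the canonical five-to-eight-person team, so the sizing rule roughly checks out for a single meal.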
This is the joy of physical reality as opposed to software. Atoms vs bits.
Software is different because digital systems are messes of rigid causality. If reality were like software moving a table could trigger the elevator to stop working and birds to fly upside down by breaking DNS by way of a change in server load triggering Kubernetes to get into a weird state where it kills and restarts DNS too fast to allow it to properly initialize and serve requests, but only when it is raining in Bangalore, India on a Tuesday.
The other nice thing about reality is reusability. A table used in one movie set could be used in a different movie without rebuilding the table.
There has been a ton of work on good system design to avoid this, like well-done OOP (not the enterprise Java kind), and we were getting there until the web and cloud hit and we decided to trash all that and go back to piles of slop on Unix servers. Still wouldn't have been as inherently causally ordered as physics, but it might have been nicer.
I think, in the interest of HN pedantry, we need some pushback on what our threshold is for what counts as critical. Lives on the line? Clearly critical!
Anything after this is up for debate.
Lots of money at risk? Maybe.
Lots of people being affected? Maybe.
Lots of people losing their jobs? Maybe.
A company going out of business? Maybe.
While most things in entertainment do not pass the first test, there are plenty of examples that absolutely pass all of the others.
I only say most because there are numerous circumstances in the entertainment industries where lives are on the line: maybe not with the final product, but definitely during production.
I have a problem with defining things as critical or not, because you have no idea what your products or solutions may end up being used for. If things are being used by people in a production capacity, and the collective stake in the success or failure of the endeavor rests upon the tools and solutions performing as designed, then one could argue these are critical. But critical to whom?

When designers don't see their solutions as critical, it breeds laziness and pushes the responsibility for auditing onto the user, who most likely doesn't even know to audit, because they were sold a solution and made the mistake of taking a vendor's word. Which maybe means we should all be tearing down our electronics and reverse-engineering all of our software so that we can have some semblance of functionality… And I know plenty of studios which do just that, down to tearing down and re-making all, and I do mean all, of their cables, because for them one stray bit of interference is critical.
Why are they in this position? Because nothing and no one in tech or any other industry can be trusted. And when your career, reputation, and livelihood are on the line, everything starts to look critical. Maybe it's only the livelihood of one person? But to them it's critical.
Critical things are those that seriously disturb your daily life when issues arise that are beyond your control. Things like:
1. Life
2. House
3. Salary
4. Reputation
5. Physical or mental health
6. Time
> definitely during production where lives are on the line
You can say that for any company - if they wired the electricity in a bad way at my job, lives could be lost.
> because you have no idea what your products or solutions may end up being used for
Many products are used for exactly one thing. For example, online banking. You don't use it for dating, although you theoretically can if you really want to.