There are lots of cuts and morphs, so it's not one continuous shot. There are also lots of speed adjustments; many of the shots were probably flown very slowly and sped up in post.
No reason it needs to be FPV: if they have a 3D model of the factory they could have created a path for it to fly. I don't think ArduPilot or Betaflight is up to that(?), but a small team of engineers working for a couple of months could make it work.
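For what it's worth, ArduPilot can follow externally supplied local-frame setpoints in GUIDED mode, so the "fly a pre-planned path" part isn't the hard bit. A minimal pymavlink sketch of streaming such setpoints, assuming the flight controller already has an indoor position estimate from some external source (the connection string and waypoints below are invented):

```python
# Sketch only: stream pre-planned local-frame position setpoints to ArduPilot
# (GUIDED mode) over MAVLink. Assumes an external positioning system is already
# feeding the flight controller a position estimate.
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")  # hypothetical link
master.wait_heartbeat()

# Hand-planned path in the local NED frame (metres): x north, y east, z down.
path = [(0.0, 0.0, -2.0), (5.0, 0.0, -2.0), (5.0, 3.0, -1.5), (10.0, 3.0, -1.5)]

POS_ONLY_MASK = 0b110111111000  # use x/y/z, ignore velocity/accel/yaw fields

for x, y, z in path:
    master.mav.set_position_target_local_ned_send(
        0,                                   # time_boot_ms (ignored by ArduPilot)
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        POS_ONLY_MASK,
        x, y, z,                             # position setpoint
        0, 0, 0,                             # velocity (ignored)
        0, 0, 0,                             # acceleration (ignored)
        0, 0)                                # yaw, yaw_rate (ignored)
    time.sleep(2)                            # crude pacing; a real script would check progress
```

The genuinely hard part, as others note below, is getting that position estimate indoors in the first place.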
This is almost certainly FPV, as the cost of building an autonomous solution would be incredible. It's more likely parts are CG than that the camera vehicle flew autonomously, and I think neither is likely; this looks like "top 5% FPV pilot" work to me.
Inside-out indoor 3D positioning is an _exceptionally hard_ problem. Like, really hard. It parallels inside-out road positioning enough that there is crossover (and IIRC there is a lot of employee movement between Tesla, DJI, and other drone autonomy makers), but I don't think this would be an easy "few months" project, and it certainly wouldn't be cost-effective.
Picture drawing lines in 3D space in something a little more sophisticated than SketchUp, and then having the drone follow those paths. What's tricky about that? Indoors you have no weather, and you could use a beacon system to pinpoint location.
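Roughly what "drawing lines in 3D space" would reduce to on the software side: fit a smooth curve through hand-picked waypoints and sample it at the controller's rate. A toy sketch (the waypoint coordinates, speed, and rate are all made up, not from any real factory model):

```python
# Fit a smooth path through hand-picked 3D waypoints and sample it at a fixed
# rate for the position controller. Coordinates are illustrative only.
import numpy as np
from scipy.interpolate import CubicSpline

waypoints = np.array([
    [0.0,  0.0, 2.0],
    [8.0,  1.0, 2.5],
    [12.0, 6.0, 1.5],
    [20.0, 6.0, 3.0],
])

# Parameterise by cumulative chord length so spacing roughly tracks distance flown.
dists = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(waypoints, axis=0), axis=1))]
spline = CubicSpline(dists, waypoints, axis=0)

speed_mps = 1.5                       # desired ground speed
dt = 0.02                             # 50 Hz setpoint stream
s = np.arange(0.0, dists[-1], speed_mps * dt)
setpoints = spline(s)                 # (N, 3) array of x, y, z targets over time
```

The planning really is the easy half; the open question is how the drone knows where it actually is relative to that path.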
And this is the typical dismissive rebuttal of an HN post.
They built a factory using a high degree of automation in a 3D space - i.e. they have a significant staff of automation engineers.
Intel has been doing tight-formation drone light shows outdoors (in weather) since at least 2014, if not earlier. Positioning drones precisely enough (i.e. within N centimeters) is therefore objectively possible if you have the right feedback system (and as a sibling comment points out, IR sensors could work for that). Offload all the processing to a computer rather than the drone, and it isn't an insurmountable problem.
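And the off-board control loop itself is not exotic. With an external position estimate, the core of it is something like the sketch below; the gains and the I/O hooks (`get_tracked_position`, `send_velocity_command`) are placeholders, not a real API:

```python
# Minimal off-board tracking loop: an external system reports where the drone is,
# a ground computer compares that with the planned setpoint and sends back a
# velocity command. Gains and I/O hooks are illustrative placeholders.
import numpy as np

KP = 1.2          # proportional gain, 1/s
V_MAX = 2.0       # clamp on commanded speed, m/s

def velocity_command(target_xyz: np.ndarray, measured_xyz: np.ndarray) -> np.ndarray:
    """P controller: fly toward the setpoint at a speed proportional to the error."""
    v = KP * (target_xyz - measured_xyz)
    speed = np.linalg.norm(v)
    if speed > V_MAX:
        v *= V_MAX / speed
    return v

# Control loop at e.g. 50 Hz (pseudocode):
# while flying:
#     measured = get_tracked_position()           # from IR mocap / beacon system
#     target = next_setpoint_on_planned_path()
#     send_velocity_command(velocity_command(target, measured))
```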
A small team of engineers for a few months is <$1m, which for a marketing campaign this splashy is nothing. Granted, per one of the sibling comments they did use an FPV pilot for at least some of the shots, but there's nothing stopping them from having automated other parts of it.
Yeah, or it could take a week of planning, a week of shooting, and a week of editing with a crew of people who make drone videos for a living… they're not pulling automation engineers off their projects to work on something completely unrelated for a one-time promo shoot, especially not for an entire month.
An array of active IR cameras in fixed, known positions, plus IR reflector markers on the UAV, works well out of the box over ranges of a few meters (e.g. Vicon). It's the same tech as motion capture for movies.
I don't have experience using it in volumes as large as the one in the video, but in a moderate-sized lab space it works very well, and at high frame rates.
The trick is to invert the problem: keep the active sensors off-board, and the robot itself is just a (reflected-energy) beacon.
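The geometry behind it is just triangulation: each camera sees the reflective marker along a ray, and the marker sits near the point of closest approach of those rays. A two-camera toy version (positions invented; real systems like Vicon fuse dozens of cameras at hundreds of Hz, partly to cope with occlusion):

```python
# Outside-in triangulation sketch: two calibrated cameras each observe the marker
# along a ray; estimate the marker position as the least-squares point nearest
# both rays. Camera and marker coordinates are invented for illustration.
import numpy as np

def closest_point_between_rays(p1, d1, p2, d2):
    """Point minimising the distance to both rays (p + t*d)."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b                    # ~0 means the rays are near-parallel
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

# Two ceiling cameras looking at a marker near (2, 1, 1):
cam1, cam2 = np.array([0.0, 0.0, 5.0]), np.array([4.0, 0.0, 5.0])
ray1 = np.array([2.0, 1.0, 1.0]) - cam1
ray2 = np.array([2.0, 1.0, 1.0]) - cam2
print(closest_point_between_rays(cam1, ray1, cam2, ray2))   # ~[2, 1, 1]
```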
How do you propose to use outside-in tracking to fly through an operating robot? There are too many sources of occlusion for off-the-shelf outside-in solutions to work at all in this video.