
I think if you handed this image to a VFX artist and said “make this,” they could do it. They probably could have done it back in 2015, too.

But the team making this image didn’t have it in advance. They just had some ideas they wanted to try out.

The story of how it was made is not just the story of the techniques they used, but also of how they applied and adjusted those techniques, trying things to see how they looked.

This image was “made” in that people built the stuff that was photographed, but it was also kind of “discovered” in that they tried a bunch of things until they discovered what they liked.

That’s possible in VFX too, but the process is different and too many iterations can quickly erase any cost advantage. This is one reason animated movies are disciplined about locking the script early in production. You can’t cost-effectively improv your way through an animated movie the way a director and actors can with a camera and a set.



I agree with all of this -- particularly the exploration and discovery aspects.

But a side note on this:

> You can’t cost-effectively improv your way through an animated movie the way a director and actors can with a camera and a set.

The Jim Henson Company is actually working on exactly this kind of technology -- live puppetry capture, as distinct from, say, motion capture.

https://www.youtube.com/watch?v=gzbBdRHqGcQ

https://www.youtube.com/watch?v=uDIlylZwLJE

This kit is expensive/bespoke but I don't know that it's _that_ expensive, set against how much money goes into making movies with large-scale bluescreen work these days. And it's wholly amenable to improv.


They've been experimenting with this since at least 1989. https://muppet.fandom.com/wiki/Waldo_C._Graphic

People have done simpler realtime rigs with off-the-shelf hardware and software in the past decade: see "The Dog Of Wisdom" (https://www.youtube.com/watch?v=D-UmfqFjpl0), which was made with Blender and a Leap Motion: https://www.youtube.com/watch?v=0a_M9VsZ6Lk

And now we are completely drowning in VTubers, who use software like Live2D that analyzes a webcam image and uses it to drive the motions of a pre-made 2D character. I've only ever seen it used to spice up the video of people streaming video games, but I'm sure someone is doing no-budget cartoons with it. There's also Adobe Character Animator, which has been used for various TV stuff, like a live performance of Homer Simpson and a few low-budget shows.
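
The core trick is simple enough to sketch in a few dozen lines. Here's a hypothetical toy version in Python using OpenCV's stock face detector -- nothing like Live2D's actual pipeline, and the head-turn mapping is made up -- just to show the webcam-to-puppet-parameter shape of it:

    import cv2

    # Toy webcam-to-puppet-parameter loop (hypothetical, not Live2D's code).
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces) > 0:
            x, y, w, h = faces[0]
            # Normalize the face centre to -1..1 and treat it as head turn/tilt.
            turn = ((x + w / 2) / frame.shape[1] - 0.5) * 2
            tilt = ((y + h / 2) / frame.shape[0] - 0.5) * 2
            print(f"turn={turn:+.2f} tilt={tilt:+.2f}")  # feed these to the rig
        if cv2.waitKey(1) == 27:  # Esc quits
            break
    cap.release()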

And then there's VRChat: a few thousand dollars of head-mounted display, facial capture, and body trackers gets you realtime full-body tracking. There's probably someone fucking around with making movies this way too.

At this point I'm pretty sure that you could get most of the functionality of that hand-tooled puppetry gizmo by just taking a sock and gluing a couple of ping-pong balls onto it and tweaking some tracking software.
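
Something like: threshold the webcam feed on the ball colour, find the two biggest blobs, and use the distance between them as the mouth-open amount. A hypothetical sketch in Python/OpenCV -- the HSV range and the mouth mapping are guesses you'd tune per lighting and rig:

    import cv2
    import numpy as np

    # Hypothetical sock-puppet tracker: two orange ping-pong balls, one
    # above and one below the "mouth"; their separation drives mouth_open.
    LOWER = np.array([5, 120, 120])    # rough HSV range for orange
    UPPER = np.array([20, 255, 255])   # tune for your balls and lighting

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        # Keep the two largest blobs (the two balls) and find their centres.
        blobs = sorted(contours, key=cv2.contourArea, reverse=True)[:2]
        centers = []
        for c in blobs:
            m = cv2.moments(c)
            if m["m00"] > 0:
                centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        if len(centers) == 2:
            (x0, y0), (x1, y1) = centers
            gap = ((x0 - x1) ** 2 + (y0 - y1) ** 2) ** 0.5
            mouth_open = min(gap / frame.shape[0], 1.0)  # crude 0..1 value
            print(f"mouth_open={mouth_open:.2f}")        # send to the puppet rig
        if cv2.waitKey(1) == 27:  # Esc quits
            break
    cap.release()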



