I've been lightly following this type of research for a few years. I immediately recognized the broad idea as stemming from the lab of the ridiculously prolific Stefano Ermon. He's always taken a unique angle on generative models, since the before times of GenAI. I was fortunate to get lunch with him in grad school after a talk he gave. Seeing the work from his lab in these modern days is compelling; I always figured his style of research would break out into the mainstream eventually. I'm hopeful that the future of ML improvements comes from clever test-time algorithms like the one this article shows. I'm looking forward to when you can train a high-quality generative model without needing a super cluster or web-scale data.