Tyler Perry Puts $800M Studio Expansion On Hold After Seeing OpenAI’s Sora: “Jobs Are Going to Be Lost”::Tyler Perry is raising the alarm about the impact of OpenAI’s Sora on Hollywood.
Yeah you can.
Same way you can correct parts of a generated image and have the generator go back and smooth it over again. Denoiser-based networks identify parts that don't look right and nudge them toward the expectations of the model. Sora clearly has decent expectations for how things look and move. I would bet anything that pasting in a static image of a guy's head, facing the desired direction, will result in an equally plausible shot with that guy facing the right way.
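That "correct a region, then smooth it over" loop is basically mask-guided inpainting. Here's a toy sketch of the idea, assuming nothing about Sora's internals: the masked patch is filled with noise, a stand-in "denoiser" (just a local-averaging smoother here, standing in for a learned model's expectations) repeatedly smooths the image, and the untouched pixels are clamped back to the original after every step so only the flagged region gets regenerated.

```python
import numpy as np

def toy_denoiser(img):
    """Stand-in for a learned denoiser: nudge each pixel toward the mean of
    its 4 neighbors. A real model nudges toward learned expectations instead."""
    padded = np.pad(img, 1, mode="edge")
    neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    return 0.5 * img + 0.5 * neighbors

def inpaint(img, mask, steps=200):
    """Regenerate only where mask == 1, re-clamping known pixels each step."""
    out = img.copy()
    out[mask == 1] = np.random.default_rng(0).random(int(mask.sum()))  # noise
    for _ in range(steps):
        out = toy_denoiser(out)
        out[mask == 0] = img[mask == 0]  # keep the untouched region fixed
    return out

img = np.ones((8, 8)) * 0.8            # "red wallpaper": a uniform region
mask = np.zeros((8, 8), dtype=int)
mask[3:5, 3:5] = 1                     # the patch the user pasted over
result = inpaint(img, mask)
# The regenerated patch converges to match its surroundings; the rest is untouched.
print(np.abs(result[mask == 1] - 0.8).max())
```

The clamping step is why the edit can't turn red wallpaper into a green forest: the model is only free to negotiate inside the mask, and the fixed surroundings keep pulling the fill toward something consistent with them.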
There have been image-generator demos where elements can be moved in real-time. Yeah, it has wider effects on the whole image, because this technology is a pile of hacks - but it’s not gonna turn red wallpaper into a green forest, or shift the whole camera angle. You’re just negotiating with a model that expects, if this guy’s facing this way now, his hands must go over here. Goofy? Yes. Ruinous? Nope.
And at the end of the day you can still have human artists modify the output, as surely as they can modify actual film of actual people. That process is not quick or cheap. But if your video was spat out by a robot, requiring no actors, sets, or animators, manual visual effects might be your entire budget.
Really - the studios that do paint-overs for Hollywood could be the first to make this tech work. They'd only need a few extra people to finish a movie that arrives 90% done.