“Sora2’s remix feature exemplifies the core problem with “child safety” in generative AI. When OpenAI trains Sora2 on the entire internet to make it creative, that training encodes patterns that can’t simply be filtered out.

Those patterns are essential to the model’s function: it learns to recombine content in ways its creators wouldn’t have anticipated. “Good” content can become “bad” content and vice versa. The process is the same either way.

In fact, the process is kind of the whole point. The entire AI/LLM/ML engine runs on humans and machines collaborating on slight, creative variations on established patterns to find novel recombinations. Sometimes that means “Walk My Walk.” Sometimes it means trolls turning videos of women jogging into porn.”