I need to say something that will probably cost me readers, possibly my last shred of credibility as a supposed moderate voice.
I don’t care.
Here it is:
If you let your child consume content on Sora - OpenAI’s shiny new TikTok clone - or whatever “Vibes” slopoff Meta is peddling this week - you are a bad parent.
These “services” offer an unending feed of AI-generated videos - garbage, all the way down - intended to sever the already tenuous connection between short-form video and reality, in pursuit of the scum-coated bottom of the media consumption barrel.
There’s no wiggle room here. No shades of gray. If you let your kids use either of these apps, you’ve failed the most basic test of guardianship: protecting the undeveloped mind of a child from a machine designed to liquefy attention and hardwire addiction.
That might sound harsh.
We’ve developed an instinct to recoil when someone draws an unambiguous moral line in an age of perpetual and near-crippling “nuance.”
Everyone wants to argue about the gradations: “But what if I monitor it closely?” “What if my kid only uses it for fifteen minutes a day?” “What if I use it with them?” Stop. Do you monitor fentanyl use in fifteen-minute increments? Do you sit down with your kid and say, “Sweetheart, only one puff of the vape before bed”? No? Then why are you doing the same with the digital equivalent?
These apps are not neutral. They are not tools that can be molded by “responsible” use. They are engineered, down to the sub-second, to bypass executive function and hijack dopamine loops. OpenAI’s press release practically admitted it: they bragged that Sora will use your location, your chat history, your activity, your every flick of the thumb, to serve up the next hit. They are telling you outright: we are building a slot machine that knows you better than you know yourself. And parents will still line up to let their twelve-year-olds (and younger) walk into the casino unsupervised.
I have no patience anymore for the language of “parental controls.” The same companies that perfected infinite scroll are the ones offering you toggles and filters as if that absolves them. Imagine the CEO of Philip Morris in the 1960s saying: good news, we’ve added filters to the cigarettes, so now you can feel comfortable giving them to your toddler. Would you buy that line? Parents are buying it now, and worse, congratulating themselves for “engaged” parenting because they bothered to open the settings menu.
When I first wrote about the dangers of TikTok, I got emails from readers telling me I was exaggerating, that it was no different from the TV their generation grew up with. That was nonsense then, and it’s nonsense now. The television sat in the living room. It didn’t study your micro-reactions in real time and feed them into a neural net optimized to hold your eyes for thirty more minutes. TV was passive. These new platforms are predatory. They learn, they adjust, they adapt. Sora might “entertain” your child - but it will do something far worse: it will reprogram them.
And yes, I’ll use that word: reprogramming. Anyone who has watched a child scroll knows the vacant look, the slack jaw, the flicker of a smile that’s gone before it registers. Try to pull them away mid-feed and you’ll see the withdrawal symptoms. How much worse does that become when the feed is the lowest form of digital media ever devised: AI slop that no human mind conceived at all?
I grew up in a home with books stacked on every table, spines cracked, pages dog-eared. Reading was just what you did. The default state of childhood was - and still is - boredom, and out of boredom comes imagination. What happens when boredom is obliterated? When every microsecond can be filled with a synthetic video that’s been tested against millions of other videos to maximize its grip? You don’t get imagination. You don’t get curiosity. You don’t get intelligence. You don’t get critical thinking. You don’t even get a sign of life. You get a “human” being conditioned to fear silence, to recoil from stillness, and to mistake stimulation for meaning.
I anticipate the pushback: you’re blaming parents when the real culprits are corporations. But both are true. Yes, OpenAI and Meta are guilty. They know exactly what they’re building. But parents still have the power to say no. You can delete the app. You can block the download. You can endure the tantrum.
And if you don’t?
Then yes, you are complicit.
People like to retreat to relativism: “Every generation thinks the next one’s media will rot the brain.” Socrates worried about writing, after all. Isn’t this the same?
Writing didn’t reduce human connection to a string of ten-second simulations. Writing didn’t keep children up at 2 a.m. with dilated pupils and sweaty palms. The Sora feed will.
I don’t expect parents to throw their smartphones into the ocean. I don’t expect children to live like monks. But if we can’t draw a bright red line at an AI-generated infinite video slot machine, then what line will we draw? Are we really so cowardly that we can’t endure a few weeks of sulking in exchange for preserving our kids’ sanity? Is it truly easier to outsource our authority to an algorithm than to hold the line as a parent?
Some folks are already saying I’m overreacting. That I should trust the “marketplace.” That the kids will turn out fine, just like we turned out fine with Nintendo and cable. But what if they don’t? What if this time the experiment is different? What if the stakes are higher, the tools sharper, the prey younger? If we wait for the data to come in, it will already be too late. Addiction doesn’t leave us a control group.
We’re already seeing the results of a generation of isolated, meme-ified, disconnected, unsocial teenagers becoming adults with access to firearms. We’re already seeing the brain rot of TikTok and short-form video manifest as nihilism everywhere from the kitchen table to the classroom to the voting booth.
If you honestly believe an always-on, endless pipeline of inhuman content won’t make that worse, I have nothing to say to you. Nor would I wish to remain in your company long enough to say it.
Here’s the line, and as far as I’m concerned, it’s not up for debate: if you let your child consume content on Sora, you are failing them. They don’t need your friendship. They don’t need you to be the “cool” parent with the latest app. They need you to be the barrier, the wall, the force that protects them from the machines they cannot yet resist. You wouldn’t let them play Russian roulette with a revolver. Don’t let them play it with their brains.
History will not look kindly on this era. It will wonder how we let our children’s minds become the raw material for machine learning experiments. It will wonder why we defended our kids’ right to be exploited rather than defending their right to grow up unharmed by algorithmic manipulation. I don’t want to be on the wrong side of that judgment. Neither should you.
Don’t rationalize it. Don’t negotiate with it. Don’t turn the choice into a think-piece debate. Just say no. And if your child screams at you, fine. Better they scream at you now than spend their adulthood unable to sit in silence, unable to read a book, unable to hold a single thought of their own.
That’s the line, and that’s the responsibility. Cross it, and you are no longer parenting. You are volunteering your child for the great experiment. And in that case, don’t be surprised when you discover what they become.
Trust me when I tell you: it won’t be what you hoped.