Editor’s Note: This excerpt is adapted from The Tech Exit: A Practical Guide to Freeing Kids and Teens from Smartphones, by Clare Morell.
Most parents are aware of the dangers of online pornography, so they set up filters to block it on their children’s devices, on their family computer, and even on their home Wi-Fi router. Filters, however, often fall short, especially on app-based devices. They don’t always block sites accessed inside apps (via in-app browsers), which is dangerous when a child can get to Pornhub inside Snapchat in just five clicks.
One mom in an online community I am in shared that even the good apps have back doors that filters don’t catch. She recounted how her son ironically used a parental monitoring app on his phone to access porn. He opened the parental app, clicked the Support button, and a browser opened within the app that he could use to view porn—and the separate porn filter she had installed on the phone didn’t block it. That’s only one example of the many loopholes in filters.
But kids don’t even need to go looking for porn on the web—it finds them on social media. In one leaked study, nearly a fifth of teens ages 13 to 15 saw sexually explicit content at least once a week on Instagram. Filters that can block adult websites or nudity in a web browser are powerless to filter out the explicit content being distributed within the apps themselves.
An investigation by the Wall Street Journal into teens’ experiences on TikTok confirms that explicit material is on social media. The reporters created dozens of automated accounts posing as minors aged 13 to 15 to see what their feeds looked like. One fake account of a 13-year-old user searched the app for OnlyFans (a platform primarily used by adult content creators to post pornography), and the pretend user watched a handful of videos in the results, including two selling pornography. The user then turned back to the personalized For You feed, and the app quickly served up more sexual content, including dozens of role-playing videos in which people pretended to be in relationships with caregivers. The feed was soon almost entirely dominated by videos involving sexual power dynamics and violence.
The investigation found that “the app’s algorithm had pushed the user into a rabbit hole many users call ‘KinkTok,’ featuring whips, chains, and torture devices. Some of this content is banned from the platform.” Yet there it was in this (fake) minor user’s feed. And when users try to help Instagram out by reporting content for removal, “with a simple, well-positioned flower over a nipple, or a slight blurring of certain private parts, an otherwise nude person isn’t flagged as inappropriate.” Or even worse, rather than trying to remove adult content, X’s guidelines now explicitly allow it. X claims it will block this content for users under 18, an empty protection when the app has no age-verification process. Despite its adult content, X remains available on app stores for children to find and download.
Instagram advertises that it defaults teens under 18 into the Less setting of its Sensitive Content Control, which it says limits teens’ exposure to potentially sensitive content. But teens over 15 (unless they have accepted parental supervision) can change this setting at any time. Other platforms, like Snapchat or TikTok, have no default content restrictions for minors, though they do allow parents to set stricter content restrictions when they pair their accounts with their children’s (although, as mentioned earlier, the teen always has to opt in to this supervision).
Even when content restrictions are enabled for an app, they can be ineffective. One mom who had activated TikTok’s Restricted Mode found during her test run that all the videos in the feed were still inappropriate for her child, including one video that was a play on ejaculation.
The ubiquity of online pornography, the ineffectiveness of app content restrictions, and the loopholes in filters and their inability to screen the explicit content inside apps’ feeds mean children are being exposed to porn young and often. In a 2022 study, 73 percent of the teens surveyed reported that they had been exposed to porn. The average age of first exposure was 12. More than half had encountered it accidentally, and of those, 63 percent said they had been exposed in the past week, suggesting it’s a frequent experience. Much of the accidental exposure came through online channels, such as social media, a link they didn’t realize was porn, search results, or an online ad. And the porn that children are stumbling across online today isn’t your uncle’s Playboy; it’s dehumanizing, violent, and grotesque.
Isabel Hogben accidentally stumbled across Pornhub at the age of 10. She shared her story of what that was like: “When I talk to adults, I get the strong sense they picture a hot bombshell in lingerie or a half-naked model on a beach. This is not what I stumbled upon back in fourth grade. I saw simulated incest, bestiality, extreme bondage, sex with unconscious women, gangbangs, sadomasochism, and unthinkable physical violence. The porn children view today makes Playboy look like an American Girl doll catalog.”
Indeed, research shows it’s common for teens to see pornography depicting rape, choking, or pain. And they aren’t just seeing it. Nurses, teachers, and principals are reporting young children acting out on others the mature content they have consumed. Eoghan Cleary, an assistant principal of a secondary school in Ireland who coordinates health education at the school, has found that students’ constant access to hardcore pornography is affecting their real-life relationships. He has his 16-year-old students make a list of what they think is expected of them in a sexual interaction and what they think is expected of the opposite sex. He says, “For our male students, they expect to chase the woman, to slap and to choke her, to be dominant, to be aggressive, to be in control, to know what you want, and to be with as many as possible…. For my female students, it’s to be submissive, to be kinky, and to be slapped, choked, to do what he says and to do what he likes.”
A parent used to be able to poke their head into the basement or step outside to check on what their kid was doing with friends. They didn’t need to monitor a child’s behavior constantly, but they could keep an eye out, and there was an expectation on the child’s part of accountability, knowing that Mom or Dad was nearby and could pop in at any minute. Not so in the online world.
Miriam Cates, a former member of the U.K. Parliament, has been a champion of online safety legislation. She tried to paint a picture of the dire reality of the online world in a speech she gave last year: “Imagine if our streets were so lawless that it was unsafe for children to leave their homes. Imagine if, on their daily walk to school, our children had to witness the beheading of strangers or the violent rape of women and girls. Imagine if, when hanging out in the local park, it was normal for hundreds of people to accost our child and encourage them to take their own life. Imagine if it was a daily occurrence for our children to be propositioned for sex or blackmailed into stripping for strangers. Imagine if every mistake that our child made was advertised on public billboards, so that everyone could laugh and mock until the shame made life not worth living. This is not a horror movie or some imaginary wild west; this is the digital world that our children occupy, often for hours a day.”
Without real parental control over smartphones and social media, children are thrust into a virtual adult world without any adults to guide them or look out for them there. And they don’t know how to handle it. How could they? They’re children.
One frustrated mom, Kathleen Linder, got tired of hearing that parental controls are an effective way to protect kids on social media, so she wrote a letter to the editor in the Wall Street Journal to push back on this argument. She explained how she’d tried parental controls, which didn’t work, and ended her letter by saying, “The only way I can successfully limit my children’s time on their devices and social media is to take away their iPhones. As a society, maybe that’s what we should consider.”
She’s right. That is what we should consider. We don’t take our children to bars and strip clubs and blindfold them or have them wear earplugs. That would be absurd. We just don’t let them go to those places. It should be the same for the virtual world.