
Yes, The FTC Wants You To Think The Internet Is The Enemy To The Great American Family

from the moral-panic-needs-a-morality-play dept

This is a combo piece with the first half written by law student Elizabeth Grossman about her take on the recent FTC moral panic about the internet, and the second part being some additional commentary and notes from her professor, Jess Miers.

The FTC is fanning the flames of a moral panic. On June 4, 2025, the Commission held a workshop called The Attention Economy: How Big Tech Firms Exploit Children and Hurt Families. I attended virtually from the second panel until the end of the day. Panelists discussed how the FTC could “help” parents, age verification as the “future,” and “what can be done outside of Washington DC.” But the workshop’s true goal was to reduce the Internet to only content approved by the Christian Right, regardless of the Constitution—or the citizens of the United States.

Claim #1: The FTC Should Prevent Minors From Using App Stores and Support Age Verification Laws

FTC panelists argued that because minors lack the legal capacity to contract, app stores must obtain parental consent before allowing them to create accounts or access services. That, in turn, requires age verification to determine who is eligible. This contractual framing isn’t new—but it attempts to sidestep a well-established constitutional concern: that mandatory age verification can burden access to lawful speech. In Brown v. Entertainment Merchants Association, the Supreme Court reaffirmed minors’ rights to access protected content, while Reno v. ACLU struck down ID requirements that chilled adult access to speech. Today, state-level attempts to mandate age verification across the Internet have repeatedly failed on First Amendment grounds.

But by recasting the issue as a matter of contract formation rather than speech, proponents seek to sidestep those constitutional questions. This is the same argument at the heart of Free Speech Coalition v. Paxton, a case the FTC appears to be watching closely. FTC staff repeatedly described a ruling in favor of Texas as a “good ruling,” while suggesting a decision siding with the Free Speech Coalition would run “against” the agency’s interests. The case challenges Texas’ H.B. 1181, which mandates age verification for adult content sites.

The FTC now insists that age verification isn’t about restricting access to content, but about ensuring platforms only contract with legal adults. But this rationale collapses under scrutiny. Minors can enter into contracts—the legal question is whether and when they can disaffirm them. The broader fallacy about minors’ contractual incapacity aside, courts have repeatedly rejected similar logic. Most recently, NetChoice v. Yost reaffirmed that age verification mandates can still violate the First Amendment, no matter how creatively they’re framed. In other words, there is no contract law exception to the First Amendment.

Claim #2: Chatbots Are Dangerous To Minors

The panel’s concerns over minors using chatbots to access adult content felt like a reboot of the violent video game panic. Jake Denton, Chief Technology Officer of the FTC, delivered an unsubstantiated tirade about an Elsa-themed chatbot allegedly engaging in sexual conversations with children, but offered no evidence to support the claim. In practice, inappropriate outputs from chatbots like those on Character.AI generally occur only when users—minors or adults—intentionally steer the conversation in that direction. Even then, the platform enforces clear usage policies and deploys guardrails to keep bots within fictional contexts and prevent unintended interactions.

Yes, teens will test boundaries, as they always have, but that doesn’t eliminate their constitutional rights. As the Supreme Court held in Brown v. Entertainment Merchants Association, minors have a protected right to access legal expressive content. Then, it was video games. Today, it’s chatbots. 

FTC Commissioner Melissa Holyoak adopted a more cautious tone, suggesting further study before regulation. But even then, the agency failed to offer meaningful evidence that chatbots pose widespread or novel harm to justify sweeping intervention.

Claim #3: Pornography is Not Protected Speech

Several panelists called for pornography to be stripped of First Amendment protection and for online pornography providers to be denied Section 230 immunity. Joseph Kohm of the Family Policy Alliance in particular delivered a barrage of inflammatory claims, including: “No one can tell me with any seriousness that the Founders had pornography in mind […] those cases were wrongly decided. We can chip away […] it is harmful.” He added that “right-minded people have been looking for pushback against the influence of technology and pornography,” and went so far as to accuse unnamed “elites” of wanting children to access pornography, without offering a shred of evidence.

Of course, pornography predates the Constitution, and the Founders drafted the First Amendment to forbid the government from regulating speech, not just the speech it finds moral or comfortable. Courts have consistently held that pornography, including online adult content, is protected expression under the First Amendment. Whether panelists find that inconvenient or not, it is not the FTC’s role to re-litigate settled constitutional precedent, much less redraw the boundaries of our most fundamental rights.

During the final panel, Dr. Mehan said that pornography “is nothing to do with the glorious right of speech and we have to get the slowest of us, i.e. judges to see it as well.” He manages to disrespect a profession he is not a part of and misunderstand the law in one fell swoop. He also said “boys are lustful” because of pornography and “girls are vain” because of social media. Blatant misogyny aside, it’s absurd to blame pornography and social media for “lust” and “vanity”—after all, Shakespeare was writing about both long before XXX videos and Instagram—and even if he hadn’t been, teenage lust is not a problem for the government to solve.

Panelist Terry Schilling from the American Principles Project—known for his vehemently anti-LGBT positions—called for stripping Section 230 protections from pornography sites that fail to implement age verification. As discussed, the proposal not only contradicts longstanding First Amendment precedent but also reveals a fundamental misunderstanding of what Section 230 does and whom it protects.

Claim #4: The Internet Is Bad For Minors

FTC Commissioner Mark Meador compared Big Tech to Big Tobacco and said that letting children on the Internet is like dropping them off in the red-light district. “This is not what Congress envisioned,” he said, “when enacting Section 230.” Commissioner Melissa Holyoak similarly blamed social media for the rise in depression and anxiety diagnoses in minors. Yet, as numerous studies on social media and mental health have consistently demonstrated, this rise stems from a complex mix of factors—not social media.

Bizarrely, Dr. Mehan also claimed that “Powerpoints are ruining the humanities.” And he compared online and text communication to home invasion: if his daughter were talking on the phone with a boy at 11 o’clock at night, he said, that boy would be invading his home.

This alarmist narrative ignores both the many benefits of Internet access for minors and the real harms of cutting them off. For young people, especially LGBTQ youth in unsupportive environments or those with niche interests, online spaces can be essential sources of community, affirmation, and safety. Just as importantly, not all parents share the same values or concerns as the government (or Dr. Mehan). It is the role of parents, not the government, to decide when and how their children engage with the Internet.

In the same vein, the Court in NetChoice v. Uthmeier rejected the idea that minors are “mere people-in-waiting,” affirming their full participation in democracy as “citizens-in-training.” The ruling makes clear that social media access is a constitutional right, and attempts to strip minors of First Amendment protections are nothing more than censorship disguised as “safety.”

Conclusion

The rhetoric at this event mirrored the early pages of Project 2025, pushing for the outright criminalization of pornography and a fundamental rewrite of Section 230. Speakers wrapped their agenda in the familiar slogan of “protecting the kids,” bringing up big right-wing talking points like transgender youth in sports and harping on good old family values—all while advocating for sweeping government control over the Internet.

This movement is not about safety. It is about power. It seeks to dictate who can speak, what information is accessible, and whose identities are deemed acceptable online. The push for broad government oversight and censorship undercuts constitutional protections not just for adults, but for minors seeking autonomy in digital spaces. These policies could strip LGBTQ youth in restrictive households of the only communities where they feel safe, understood, and free to exist as themselves.

This campaign is insidious. If successful, it won’t just reshape the Internet. It will undermine free speech, strip digital anonymity and force every American to comply with a singular, state-approved version of “family values.”

The First Amendment exists to prevent exactly this kind of authoritarian overreach. The FTC should remember that.

Elizabeth Grossman is a first-year law student in the Intellectual Property program at the University of Akron School of Law, with the goal of working in tech policy.

Prof. Jess Miers’ Comments

Elizabeth’s summary makes it painfully clear: this wasn’t a serious workshop run by credible experts in technology law or policy. The title alone, “How Big Tech Firms Exploit Children and Hurt Families,” telegraphed the FTC’s predetermined stance and signaled a disinterest in genuine academic inquiry. More tellingly, the invocation of “families” serves as a dog whistle, gesturing toward the narrow, heteronormative ideals typically championed by the religious Right: white, patriarchal, Christian, and straight. The FTC may not say the quiet part out loud, but it doesn’t have to.

Worse still, most of the invited speakers weren’t experts in the topics they were pontificating on. At best, they’re activists. At worst, they’re ideologues—people with deeply partisan agendas who have no business advising a federal agency, let alone shaping national tech policy.

Just a few additional observations from me.

Chair Ferguson opened by claiming the Internet was a “fundamentally different place” 25 years ago, reminiscing about AOL Instant Messenger, Myspace Tom, and using a family computer his parents could monitor. The implication: the Internet was safer back then, and parents had more control. As someone who also grew up in that era, I can’t relate.

I, too, had a family computer in the living room and tech-savvy parents. It didn’t stop me from stumbling into adult AOL chatrooms, graphic porn, or violent videos, often unintentionally. I remember the pings of AIM just as vividly as the cyberbullying on Myspace and anonymous cruelty on Formspring. Parental controls were flimsy, easy to bypass, and rarely effective. My parents tried, but the tools of the time simply weren’t up to the task. The battle over my Internet use was constant, and my experience was hardly unique.

Still, even then, the Internet offered real value, especially for a queer kid who moved often and struggled to make “IRL” friends. But it also forced me to grow up fast in ways today’s youth are better shielded from. Parents now have far more effective tools to manage what their kids see and who they interact with. And online services have a robust toolbox for handling harmful content, not just because advertisers demand it, but thanks to Section 230, a uniquely forward-thinking law that encourages cleanup efforts. It built safety into the system before “trust and safety” became a buzzword. Contrary to Mark Meador’s baseless claims, that result was precisely its authors’ intent. 

A more serious conversation would focus on what we’ve learned and how the FTC can build on that progress to support a safer Internet for everyone, rather than undermining it. 

That aside, what baffles me most about these “protect the kids” conversations, which almost always turn out to be about restricting adults’ access to disfavored content, is how the supposed solution is more surveillance of children. The very services the FTC loves to criticize are being told to collect more sensitive information about minors—biometrics, ID verification, detailed behavioral tracking—to keep them “safe.” But as Eric Goldman and many other scholars who were notably absent from the workshop have extensively documented, there is no current method of age verification that doesn’t come at the expense of privacy, security, and anonymity for both youth and adults.

A discussion that ignores these documented harms, that fails to engage with the actual expert consensus around digital safety and privacy, is not a serious discussion about protecting kids. 

Which is why I find it especially troubling that groups positioning themselves as privacy champions are treating this workshop as credible. In particular, IAPP’s suggestion that the FTC laid the groundwork for “improving” youth safety online is deeply disappointing. Even setting aside the numerous privacy issues associated with age verification, does the IAPP really believe that a digital ecosystem shaped by the ideological goals of these panelists will be an improvement for kids, especially those most in need of support? For queer youth, for kids in intolerant households, for those seeking information about reproductive health or gender-affirming care? 

This workshop made the FTC’s agenda unmistakable. They’re not pursuing a safer Internet for kids. As Elizabeth said, the FTC is pushing a Christian nationalist vision of the web, built on censorship and surveillance, with children as the excuse and the collateral. 

Just as the playbook commands. 

Jess Miers is an Assistant Professor of Law at the University of Akron School of Law.


