Sam Glover, the director of SPEAK, a U.K.-based free speech advocacy group, told TMD that the two laws powering this crackdown are the Public Order Act of 1986 and the Communications Act of 2003 (which applies the Malicious Communications Act of 1988 to digital communications). The former makes it a potential crime to use words deemed intended to stir up racial or religious hatred. The latter makes it a criminal offense to send “grossly offensive, indecent, obscene, or menacing” messages over any public electronic communications network, including private WhatsApp messages and social media posts.

These laws, originally intended to prevent people from planning race riots or taking hostile actions like sending excrement in the mail, are “just totally outdated,” Glover said, and have gained new force in the social media era. Last week, Elizabeth Kinney, a British woman, was convicted of a homophobic hate crime and sentenced to community service after, in private messages to a former friend, she referred to a man who attacked her as a “fa—t.”
But most arrests don’t lead to convictions. Convictions stemming from offensive-speech arrests have actually decreased over the last few years, dropping from 1,995 in 2015 to 1,119 in 2023, even as arrests have risen.
Still, Glover contended that arrests themselves will frighten people, even if there is no legal penalty at the end of the investigation. “There’s an enormous chilling effect,” he said, if police are allowed to bring British citizens in for questioning, and even search their homes, regardless of how a case ultimately ends.
Efforts to change the laws in favor of a more liberal approach to speech are emerging. SPEAK advocates replacing the “gross offensiveness” standard in the Communications Act with a stricter test based on the imminent threat of harm. “It is a problem to have a criminal offense based on gross offense standard,” Jacob Rowbottom, a law professor at Oxford University, told TMD. He noted, however, that Britain is unlikely to abolish the law altogether.
In September, Health Secretary Wes Streeting proposed reviewing speech laws after armed police arrested Graham Linehan, co-creator of the popular sitcom Father Ted, at Heathrow Airport on suspicion of inciting violence over the content of three tweets. In one of these, he wrote, “If a trans-identified male is in a female-only space he is committing a violent, abusive act. Make a scene, call the cops, and if all else fails, punch him in the balls.”
In the aftermath of the arrest, the chief of London’s Metropolitan Police, Sir Mark Rowley, said that speech laws are eroding public confidence in the police. “I don’t believe we should be policing toxic culture wars debates, and officers are currently in an impossible position,” he said. “Greater clarity and common sense would enable us to limit the resources we dedicate to tackling online statements to those cases creating real threats in the real world.”

Britain’s uncodified constitution has few formal provisions protecting free expression, and a broad, codified free speech guarantee arrived only in 1998, when the Human Rights Act incorporated the European Convention on Human Rights (ECHR) into British law.
But Article 10 of the ECHR, which promises Europeans the “freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers,” is not directly analogous to the First Amendment, and the convention has allowed the “grossly offensive” standard to remain part of British law. “What’s the difference between merely offensive, which is legal, and grossly offensive, which then becomes illegal?” said Peter Coe, an associate professor of law at the University of Birmingham.
Ronald Krotoszynski, an expert on free speech and comparative constitutional law at the University of Alabama School of Law, told TMD that “what happens generally in Europe, and in the U.K., is that speech holds a lower social and legal priority than, say, privacy or reputation or personal honor.” Alongside the assurance of free speech, the ECHR also allows measures to protect public morals, national security, and personal privacy and reputation.
It’s even trickier to apply speech regulations across the breadth of the European Union, or to entities based in other countries. Last Friday, EU regulators sought to fine Elon Musk’s platform X $140 million for violating the EU’s Digital Services Act (DSA), with EU Commission spokesman Thomas Regnier saying the case has “nothing to do with content moderation.” Instead, regulators claimed that X violated the DSA due to “the deceptive design of its ‘blue checkmark,’ the lack of transparency of its advertising repository, and the failure to provide access to public data for researchers.”
Musk disagreed with Regnier’s characterization. On Saturday, he tweeted: “The EU should be abolished and sovereignty returned to individual countries, so that governments can better represent their people.” Soon after, he retweeted a post that juxtaposed the EU flag with a Nazi swastika flag.
And U.S. officials agreed that the EU went too far. U.S. Ambassador to the EU Andrew Puzder called the decision “regulatory overreach,” characterized it as “censorship,” and threatened tariff retaliation against the bloc.
European officials hold that the case has nothing to do with political speech. Daphne Keller, who directs Stanford’s program on platform regulation, told TMD that they have a point, at least in part. While parts of the DSA might open up tech companies to speech-based restrictions, the European Commission chose not to enforce them here. “If the commission had chosen to enforce those, there would be some questions,” she said. “But they didn’t. They chose to enforce provisions that are about transparency and about consumers.” Keller noted that Musk’s complaints aren’t coming out of nowhere, but said that the bigger burdens on tech companies operating in Europe come from rules other than the DSA. She cited, for example, the General Data Protection Regulation, which sets strict requirements for companies that handle the personal data of EU residents. But even if these rules are enforced neutrally, the EU now has powerful tools to penalize tech companies.
“When you are acting against a speech platform,” like X, “I think that the possibility for a political motivation would certainly be something that we would watch for,” Anupam Chander, a professor at Georgetown University Law Center, told TMD.
Krotoszynski said the general European tendency to enforce laws against speech seen as a threat to democracy or public order can lead to “a lot of picking and choosing, which more often than not corresponds to the political interest of the incumbent office holders.” But he also noted that almost all European states remain liberal democracies.
“I think it would be a stretch to say that there are not free and fair elections in Germany or France,” he said. Europeans have “a meaningful commitment to free speech, but in their view, it’s perfectly appropriate for the government to incorporate community values into the law.”