Social media companies are bracing for arguments before the Supreme Court on Monday that could fundamentally change how they police their sites.
After Facebook, Twitter and YouTube barred President Donald J. Trump in the wake of the riot at the U.S. Capitol on Jan. 6, 2021, Florida made it illegal for tech companies to ban candidates for state office from their sites. Texas later passed its own law prohibiting platforms from removing political content.
Two technology industry groups, NetChoice and the Computer & Communications Industry Association, sued to block both laws from taking effect. They argued that the companies have a First Amendment right to make decisions about their own platforms, just as newspapers decide what to publish on their pages.
So what are the risks?
The Supreme Court's rulings in Moody v. NetChoice and NetChoice v. Paxton are a major test of the power of social media companies. They could reshape millions of social media feeds by giving the government influence over how and what the platforms keep online.
“What’s at stake is whether they can be forced to carry content they don’t want to,” said Daphne Keller, a lecturer at Stanford Law School who filed a brief with the Supreme Court supporting the tech groups’ challenge to the Texas and Florida laws. “And, maybe more fundamentally, whether the government can force them to.”
If the Supreme Court rules that the Texas and Florida laws are constitutional and allows them to take effect, some legal experts speculate that the companies could create versions of their feeds specifically for those states. Still, the rulings could encourage other states to pass similar laws, and restricting access to a website precisely by location is technically complex.
Critics of the laws say feeds served to users in those states could contain extremist content, from neo-Nazis, for example, that the platforms would previously have taken down for violating their standards. Or, critics say, the platforms could steer clear of anything remotely political by banning posts on a wide range of contentious issues.
What are the social media laws in Florida and Texas?
The Texas law prohibits social media platforms from removing content based on the “viewpoint” of the user or the viewpoint expressed in a post. It gives individuals and the state's attorney general the right to sue platforms for violations.
The Florida law fines platforms that permanently ban candidates for state office from their sites. It also prohibits platforms from removing content from “journalistic enterprises” and requires companies to be upfront about their content moderation rules.
Supporters of the Texas and Florida laws, both passed in 2021, say the laws protect conservatives from what they see as the liberal bias of the California-based platforms.
“People around the world use Facebook, YouTube, and X (the social media platform formerly known as Twitter) to connect with friends, family, politicians, reporters, and the general public,” Ken Paxton, the Texas attorney general, said in one legal brief. “And like the powerful communications companies that came before them, today’s social media giants control the mechanisms of this ‘modern public square’ to direct, and often suppress, public discourse.”
Chase Sizemore, a spokesman for Florida's attorney general, said the state “looks forward to defending our social media law that protects Floridians.” A spokeswoman for the Texas attorney general declined to comment.
What rights do social media platforms currently have?
Right now, they decide what stays online and what doesn't.
Companies including Meta's Facebook and Instagram, as well as TikTok, Snap, YouTube and X, have long policed themselves, setting their own rules for what users can say while governments have taken a hands-off approach.
In 1997, the Supreme Court ruled that laws regulating indecent speech online were unconstitutional, distinguishing the Internet from media outlets whose content is regulated by the government. For example, the government enforces decency standards for television broadcasting and radio.
For years, bad actors have flooded social media with misinformation, hate speech and harassment, prompting the companies to adopt new rules over the past decade, including bans on misinformation about elections and the pandemic. Platforms have barred figures like the influencer Andrew Tate for violating their rules, including those against hate speech.
But those moves have prompted a backlash from the right, with some conservatives accusing the platforms of censoring their views. Elon Musk said he wanted to buy Twitter in 2022 partly to ensure free speech for users.
Thanks to a law known as Section 230 of the Communications Decency Act, social media platforms are not liable for most content posted on their sites. So they face little legal pressure to remove problematic posts or users who break their rules.
What are the social media platforms claiming?
The tech groups say the First Amendment gives the companies the right to take down content as they see fit, because it protects their editorial choices about the content of their products.
In their lawsuit challenging the Texas law, the groups argued that, much like a magazine's publishing decisions, “a platform's decisions about what content to host and what to exclude are intended to convey a message about the type of community the platform hopes to foster.”
Still, some legal scholars are concerned about the implications of allowing social media companies unfettered powers under the First Amendment's protections of free speech and freedom of the press.
“I worry about a world in which these companies invoke the First Amendment to protect what many of us believe is commercial activity and conduct that is not speech,” said Olivier Sylvain, a professor at Fordham Law School who until recently was a senior adviser to Lina Khan, the chair of the Federal Trade Commission.
What's next?
The court is scheduled to hear arguments from both sides on Monday. A decision is expected by June.
Legal experts say the court could rule the laws unconstitutional while offering a road map for how to fix them. Or it could uphold the companies' First Amendment rights outright.
Carl Szabo, the general counsel of NetChoice, which counts Google and Meta among its members and lobbies against tech regulations, said that if the groups' challenge to the laws fails, “Americans across the country will be forced to see lawful but awful content” that could be construed as political and therefore covered by the laws.
“There's a lot of content that could be deemed political,” he said. “Terrorist recruitment is undoubtedly political.”
But if the Supreme Court rules that the laws violate the Constitution, the status quo will be entrenched: the platforms, and no one else, will decide what speech stays online.
Adam Liptak contributed reporting.