There’s a Section 230-sized elephant in the room.
One of the earliest internet speech laws, Section 230 has shaped much of the modern online ecosystem. It protects everyday online businesses from being held liable for their users' comments and helps ensure free speech for creators. But in an age when violence is organized through platforms like Twitter, Facebook, and YouTube, the provision has grown increasingly controversial, with platforms facing heat for not doing enough to stop harmful content.
Recently, the Supreme Court took up two cases seemingly tied to the content liability law, one against Twitter and one against Google. As creators and media giants alike awaited a ruling, the results proved a bit anticlimactic: Section 230 turned out not to be within the purview of either case. Still, the rulings carry interesting consequences for users and creatives everywhere.
A contentious case
After a 2017 shooting at Istanbul's Reina nightclub, the family of one of the victims filed suit against Twitter, Meta, and Google, claiming the platforms actively recommended ISIS-related content to highly susceptible individuals and thus should be held partially culpable for the terrorist attack.
After Twitter appealed its Ninth Circuit ruling, the Supreme Court took up the case and heard oral arguments in February 2023. In a 9-0 decision, the Supreme Court ruled on May 18 that Twitter was not “aiding and abetting terrorism.” In the words of Justice Thomas, the plaintiff’s suit relied “less on [Twitter’s] affirmative misconduct and more on an alleged failure to stop ISIS from using these platforms.”
In a parallel case against Google over YouTube's recommendation of alleged ISIS-affiliated videos to individuals involved in the 2015 Paris attacks, the Court punted on May 18, declining to consider the suit beyond what it had ruled in the companion Twitter case.
While only Twitter appealed its case, the Court took the opportunity to comment on other tech giants like Google and Meta, describing their algorithms as neutral.
Justice Thomas noted that these platforms’ algorithms were “agnostic as to the nature of the content,” invoking a striking comparison: We do not hold cell phone providers responsible for “aiding and abetting, for example, illegal drug deals brokered over cell phones.”
What is Section 230, and why should I care?
Notably, the Court also chose to sidestep any direct ruling on Section 230, turning away from the controversial content-immunity provision.
Passed as part of the 1996 Communications Decency Act, Section 230 shields social media platforms from legal liability for the statements of their users. While this provision forms the backbone of internet content freedom, it has become increasingly controversial in an age of social media polarization and online-organized violence.
The Court’s decision to not consider Section 230 reform was especially surprising considering how many eyes were on this case. In the Google suit, the Biden administration even submitted a brief to the Court, arguing that Section 230 has been employed far beyond the original intention of the provision and calling for reform.
President Joe Biden even weighed in with a January opinion piece in the Wall Street Journal, demanding we “reform Section 230 of the Communications Decency Act, which protects tech companies from legal responsibility for content posted on their sites.”
Timothy Shields, a law partner specializing in tech, data privacy, and social media, has also been following the case closely, with a firm belief in the importance of Section 230.
“Without Section 230, our digital world would not exist as we know it today,” Shields said. “Any website, app, or service that allows users to generate content would be impacted. Everything from Amazon product reviews to Instagram would not be able to operate in an environment where the application might be held liable for the content.”
Derek Bambauer, a professor at the University of Arizona specializing in internet law, had also been watching closely to see how the Court would avoid ruling on Section 230.
“I expected that the Court would try hard (as it did) to duck the Section 230 questions raised by the case because I think the Justices are split on Section 230,” Bambauer said.
When asked about the “winners” of the case, both Shields and Bambauer were quick to stress its importance beyond big corporations like Google and Meta.
“While we often think 230 is the only protection for Big Tech, it protects every small business that has a review section on their website, app developers that allow user interaction, Etsy sellers, etc,” Shields noted.
“[Section] 230 makes it easy not just for Google and Meta, but also for start-up firms that have limited resources to devote to content moderation, which is a difficult and expensive task,” Bambauer added.
Consequences for creators
For users and creators alike, this case is an effective “no change” for the present state of content liability. Of course, any defamatory or otherwise illegal post you make can be used against you in court, but the platform you posted it on will not be held responsible for any of the ensuing actions.
Where creators may find increased difficulty is in taking action against these social media giants. Shields, who works with creators on social media legal disputes, noted how this ruling could impact these claims.
“Where my work will be impacted is when a user or influencer wants to take action against a platform,” Shields said. “The facts will have to show that the platform itself purposely did something to create the harm.”
In some good news for creators, the case lends support to the recommendation algorithms that help creators find their audiences. Though there was clearly some contention among the justices, they ultimately offered short-term backing for your Twitter feed and YouTube's recommended videos.
“They did frame YouTube’s algorithm as neutral,” Bambauer noted. “It neither favors nor disfavors ISIS content, but just tries to show you things that it analyzes will be of interest to you.”
Ultimately, the case is a reminder to keep one ear to the ground. Social media speech laws are ever-changing, as our slow-moving legal system attempts to keep pace with quickly developing technologies. If Shields has one piece of advice for creators, it’s exactly that: Watch this space.
“It is always a struggle in my area of law to apply existing laws to emerging technologies,” Shields said. “These cases were great examples of that daily struggle at the highest court in the land and it won’t be the last time!”