A Supreme Court Decision Involving Terrorism Brings No Changes To Internet Regulations
It’s been a hot minute since President Bill Clinton signed Section 230 of the Communications Decency Act (CDA) into law in 1996. However, that 27-year-old legislation appears to have stood the test of time – at least for now.
Section 230 was enacted to guide and support the Internet, then in its infancy. It placed responsibility for posted information on Internet users, not on the tech companies that owned the various sites and platforms.
Section 230 states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
In recent years, however, critics have hoped lawmakers would revisit the CDA and Section 230 to hold the companies that own the platforms more accountable for what is posted on them, including Twitter, Facebook and Google, none of which were yet household names when the law was written.
Reynaldo Gonzalez v. Google and Twitter v. Taamneh
Supporters of the revamp were hoping last month’s twin lawsuits in front of the Supreme Court would bring changes, but they did not.
On May 18, the Justices ruled unanimously in favor of the companies and against families whose loved ones were killed by terrorists who were known to post material connected to attacks on the platforms. Both cases centered on the companies' immunity from liability for what was posted.
Among other issues, the plaintiffs alleged the companies designed algorithms on their platforms that directed interested viewers to specific videos tied to terrorist organizations.
The plaintiffs claimed the companies (Google and its subsidiary YouTube, Twitter, and Facebook) violated the federal Anti-Terrorism Act, which specifically allows civil damage claims for aiding and abetting terrorism.
On the other side, the defendants invoked Section 230, asserting their right to immunity from any claims of terrorist involvement and stressing they were not responsible for what was posted on their platforms.
The first lawsuit was filed by the family of Nawras Alassaf, who was killed in Istanbul in 2017 during a terrorist attack on the Reina nightclub. The second case, which the Justices remanded to a lower court, was filed by the family of Nohemi Gonzalez, a 23-year-old college student killed in Paris during terrorist attacks in November 2015.
In both cases, the families argued that the companies provided the terrorists the "platforms" to communicate their agenda and knew ISIS was using their services as a recruitment tool. Lawyers alleged that by steering viewers interested in ISIS toward the videos, while at the same time seeking more viewers and greater ad revenue, the companies knowingly advanced ISIS's goals.
However, the Supreme Court saw otherwise. According to its decision, written by Justice Clarence Thomas, there was not a great enough "nexus" between the platforms and the terrorist attacks in question.
While the tech companies did indeed fail to remove users and content related to ISIS, they did not provide substantial enough help to the terrorists behind the attacks to be found liable for intentionally aiding them. According to Thomas, they did not "pervasively and systemically" assist ISIS.
And regarding some of the allegations made by the plaintiffs, including that the algorithms were “active, substantial assistance” to the Islamic State of Iraq and Syria, Justice Thomas put it simply. “We disagree,” he said.
Reaction to the Supreme Court’s Decision
When the original law was created, Internet users were just getting used to navigating services like AOL and filling out templates on new sites like classmates.com. YouTube, Facebook, Instagram and other major platforms used today were not yet household names. At the time, lawmakers were concerned with regulating the Internet and addressing "obscene, lewd, lascivious, filthy, or indecent" materials, as stated within the CDA.
Now, however, citizens, both Republicans and Democrats, are voicing new concerns, along with a sense of urgency that the 21st-century Internet needs different laws.
While some decry what they see as unchecked censorship (President Trump's Twitter account was suspended in the aftermath of the deadly Jan. 6, 2021 attack on the US Capitol, with the company citing "the risk of further incitement of violence"), others believe the owners of Internet platforms allow dangerous, hateful (and false) content to be posted too easily.
For those hoping for a re-do of Section 230, the decisions in the two Supreme Court cases were a disappointment.
After news of the Supreme Court actions, Senate Judiciary Chairman Dick Durbin (D-Ill.) issued a statement saying Congress must step in after the court passed on its chance to clarify that Section 230 is not a get-out-of-jail-free card for online platforms when they cause harm. "Enough is enough. Big Tech has woefully failed to regulate itself," he said.
And on Twitter, Senator Marsha Blackburn, a Tennessee Republican who has been a vocal opponent of how social media platforms are operating, expressed her wish for Congress to step in to reform the law due to the way the tech companies “turn a blind eye’’ to harmful activities online.
Ron DeSantis and Elon Musk’s Twitter Debacle
While the discussion over how to update the CDA continues, countless Americans continue to rely on the Internet daily, and it is clear that there is still a learning curve when it comes to using its different social media platforms.
Just days after the Supreme Court ruled on the two cases, thousands of Twitter users participated in a Twitter Spaces live event when Florida Governor Ron DeSantis, joined by Twitter CEO Elon Musk, announced his bid for the 2024 Republican presidential nomination.
Unfortunately for DeSantis (and Musk), although news reports said more than 500,000 Twitter users attempted to watch the announcement, a major audio snafu forced Musk to end the feed 20 minutes after it began.
Thirty minutes later, DeSantis started over.
“I am running for president of the United States to lead our great American comeback,” the candidate said.
However, by then, fewer than 100,000 users were watching.