Technology platforms do not take responsibility for content uploaded by their users; only the person who creates the material is held responsible. Section 230 of the United States Communications Decency Act (1996) gives platforms a defense rooted in freedom of expression. However, there is controversy over how social networks use this protection.
In principle, platforms cannot censor Internet users, yet some posts are nevertheless removed. The Washington Post notes that the “Supreme Court will hear the case testing the limits of Section 230, a United States statutory provision that protects companies from liability for third-party postings.”
Section 230, passed in 1996, lays a foundation of the modern Internet by immunizing websites from liability for what their users post. In short, it protects any provider or user of an ‘interactive computer service’.
The law has caused controversy over the years, and politicians in the United States have called for reform. Although the underlying dispute concerns the scope of freedom of expression rather than Section 230 itself, any change could hand Internet platforms greater power to censor.
Lawmakers could impose various kinds of mandatory filtering, although there is also the possibility that they would change the rules of the Internet altogether and eliminate the monetization capabilities of many companies.
But why has Section 230 become so important recently? Jeff Kosseff (Associate Professor at the United States Naval Academy) explained to The New York Times that “this is the first time that the Supreme Court has agreed to hear a case that allows the interpretation of Section 230”. The outcome could determine whether tech giants remain protected from liability for misinformation and potentially ‘violent’ content broadcast on their platforms.
The United States Supreme Court may evaluate the statute’s application in a case that accuses Google of allowing YouTube to promote ISIS and war-related publications.
We are talking about the case of 23-year-old student Nohemi Gonzalez, who was killed in an Islamic State terrorist attack in Paris in 2015. The lawsuit alleges that YouTube ‘helped’ the extremist group because its recommendation algorithm suggested videos related to terrorism.
The complaint states that the platform’s recommendations helped Islamic State, with the plaintiffs arguing that “video was the central method used by ISIS to recruit for support.” Google, for its part, argued that “the complaint does not allege that any terrorist viewed such recommendation or that such recommendations had any connection with the Paris attacks.”
The Los Angeles Times states that “a judge dismissed the case and a federal appeals court upheld that decision. Under US law – specifically Section 230 of the Communications Decency Act – Internet companies cannot be blamed for the content their users post on them.”
Twitter is also involved
The Supreme Court will also consider a separate but related lawsuit involving Twitter. It was brought by relatives of Nawras Alassaf, who was killed in a terrorist attack in Istanbul in 2017. The claim alleges that Twitter, Facebook and Google violated anti-terrorism law by allowing Islamic State to use their social networks.
However, the lower courts did not directly address Section 230 in this case, and Twitter asked the Supreme Court to take up the matter if it also heard the case against Google.