Supreme Court blocks Texas social media law tech companies warned would allow hateful content to run rampant
A person walks down the street near the Supreme Court building in Washington, D.C., February 16, 2022.
Reuters
The Supreme Court on Tuesday blocked a controversial Texas social media law that critics warned could allow hateful content to run rampant online.
The law, known as HB 20, prohibits online platforms from removing or moderating content based on viewpoint. It stems from a common claim among conservatives that California-based social media companies such as Facebook and Twitter are biased in their moderation strategies and tend to silence conservative voices. The platforms say they apply their community guidelines evenly, and right-leaning users often rank among the highest in engagement.
“HB20 would compel platforms to disseminate all sorts of objectionable viewpoints,” two industry groups representing companies including Amazon, Facebook, Google and Twitter said in an emergency application to the court, “such as Russia’s propaganda claiming that its invasion of Ukraine was justified, ISIS propaganda claiming that extremism is warranted, neo-Nazi screeds denying or supporting the Holocaust, and encouraging children to engage in dangerous or unhealthy behaviors like eating disorders.”
Texas’ attorney general pushed back in a written response to the emergency application, saying the law doesn’t “prohibit platforms from removing whole categories of content.”
“So, for instance,” the response said, “the platforms can decide that pornography should be eliminated without violating HB 20.” Likewise, it argued, the platforms can ban speech from foreign governments without violating HB 20, so they aren’t required to broadcast Russian propaganda about Ukraine.
After the legislation passed, a lower court blocked it with a preliminary injunction preventing its implementation. But the Fifth Circuit federal appeals court ruled in mid-May to stay the injunction pending its final ruling, meaning the law could take effect while the court considered the larger case.
That prompted two industry groups, NetChoice and the Computer and Communications Industry Association (CCIA), to file an emergency petition with Justice Samuel Alito, who handles cases arising from that circuit.
CCIA and NetChoice asked the court to keep the law from taking effect. They argued that social media companies have the right to make editorial decisions about what content to distribute and display, and that the appeals court’s judgment would strip away that power and chill speech. The court should lift the stay, they said, while the appeals court reviews the key First Amendment questions.
The Supreme Court’s decision will have implications for other states that may be considering legislation similar to the one in Texas. Florida has already passed a similar social media law, which was also blocked by the courts.
Shortly after the tech groups’ emergency appeal in the Texas case, the Eleventh Circuit federal appeals court upheld an injunction against the similar law in Florida, with a unanimous panel concluding that content moderation is protected by the Constitution. Florida’s attorney general filed an amicus brief on behalf of her state and others asking the court to let the Texas law remain in effect. She argued that the industry groups misconstrued the law and that states have the right to regulate businesses in this manner.
A testing ground for Congress
These state laws serve as a testing ground for how Congress might reform the legal liability protection tech platforms have relied on for years to run their services. That law, Section 230 of the Communications Decency Act, shields online platforms from liability for content their users post and gives them the power to remove or moderate posts.
Republicans and Democrats alike have criticized the law, though for different reasons. Democrats want to change it to make tech platforms more responsible for managing dangerous content, such as misinformation. While Republicans agree that certain content, such as child sexual exploitation and terrorist recruitment material, should be removed, some want to make it harder for platforms to engage in moderation they consider ideologically based.
Former Rep. Christopher Cox of California, an author of Section 230, filed an amicus brief supporting the industry groups’ appeal for the Supreme Court to reverse the stay. Cox argued in the brief that HB20 is in “irreconcilable contradiction” with Section 230, which should preempt the state law.
Still, at least one Supreme Court justice has already expressed interest in reviewing Section 230 itself.
In 2020, conservative Justice Clarence Thomas wrote that “in an appropriate case, we should consider whether the text of this increasingly important statute” is compatible with the current state of immunity enjoyed by internet platforms.
He suggested in a concurrence last year that online platforms might be sufficiently similar to common carriers or places of public accommodation to be regulated in this way.
This story is developing. Check back for updates.
WATCH: The messy business of content moderation on Facebook, Twitter, YouTube