* This Post Has Been Removed *
Whether it’s the sharing of (mis)information, shadow banning, or the removal of content, members of both political parties are focusing on social media platforms and how they moderate their content. The events of Jan. 6, 2021, and the subsequent platform bans placed on former President Donald Trump for his real or perceived role strengthened a legislative debate largely centered on two types of bills: anti-censorship bills and bills that would draw lines around the responsibilities of social media companies to moderate their online communities.
For years, members of Congress have said they want to crack down on Big Tech for a slew of reasons. That desire only grew after Jan. 6, with efforts focused primarily on antitrust laws and transparency about some sites’ harmful effects on teens. But a stalemated Congress has not allowed much legislation on speech and censorship to move through the pipeline, so many states are taking the initiative to pass laws regulating Big Tech themselves.
Republican lawmakers in states such as Florida and Texas have criticized sites like Twitter and Facebook for removing certain posts and banning users. The dialogue ramped up when companies cracked down on what they deemed misinformation about the 2020 election and, eventually, banned Trump from Twitter after the Jan. 6 attack on the U.S. Capitol.
As of now, only two states have passed legislation focused on social media censorship, though other states are moving in a similar direction. That raises the question: Should the government legislate how social media is used? Or do these laws cross the line and interfere with freedom of speech?
That question is a tough one to answer.
When it comes to the government, the First Amendment states that Congress cannot make laws restricting free speech.
Social media companies, however, are private entities that can set their own editorial rules and regulations, allowing them to take action against users’ posts that breach company guidelines.
Just last Monday, Elon Musk agreed to purchase Twitter for $44 billion, aiming to take the company private, loosen content rules, add an edit button so users can change their published tweets, ditch ads, and more.
Vera Eidelman, a staff attorney with the ACLU Speech, Privacy and Technology Project, said the bills she has been tracking generally either require social media platforms to carry content and viewpoints from political figures and media outlets or focus on transparency. Florida and Texas recently passed laws of this kind, she noted, and both are being challenged in court, with advocacy groups, including the Reporters Committee for Freedom of the Press, filing amicus briefs in those cases.
In Florida’s case, Senate Bill 7072 looks to levy fines and impose penalties against social media platforms that block or inhibit content from political candidates and media organizations.
As for Texas, House Bill 20 states that, “a social media platform may not censor a user, a user’s expression, or a user’s ability to receive the expression of another person based on the viewpoint of the user or another person.”
What is next?
“We are waiting for the lawsuits challenging the Florida and Texas laws to see what happens next,” explained David Greene, civil liberties director and senior staff attorney at the Electronic Frontier Foundation. “In each case, trial courts found that the laws did violate the First Amendment and put them on hold.” Each state’s appeal will be influential, Greene added: if the laws are struck down, it could slow the momentum in other states, but if one or both are upheld, there could be a wave of these types of laws.
He went on to say that while these lawsuits play out, other types of legislation could be proposed as an alternative way to curb social media’s power and reach. “I would hope states look more into legislation that has to do with market dominance,” Greene said. “That can be regulated under our constitutional systems and is different from regulating speech.”
As for the social media platforms themselves, Greene said that “people want platforms to do a better job at being more predictable; users need to put pressure on their services to make their content moderation policies more understandable.”