
Meta's content moderation changes closely align with FIRE recommendations

Mark Zuckerberg announces sweeping changes to bolster free expression on Facebook, Instagram, and Threads that track FIRE's 2024 Social Media Report.

On Tuesday, Meta* CEO Mark Zuckerberg and Chief Global Affairs Officer Joel Kaplan announced sweeping changes to the content moderation policies at Meta (the owner of Facebook, Instagram, and Threads) with the stated intention of improving free speech and reducing "censorship" on its platforms. The changes simplify policies, replace top-down fact-checking with a Community Notes-style system, reduce opportunities for false positives in automated content flagging, and allow for greater user control of content feeds. All these changes mirror recommendations FIRE made in its May 2024 Report on Social Media.

Given that Meta's platforms boast billions of users, the changes, if implemented, have major positive implications for free expression online.

FIRE's Social Media Report


In our report, we promoted three principles to improve the state of free expression on social media:

  1. The law should require transparency whenever the government involves itself in social media moderation decisions.
  2. Content moderation policies should be transparent to users, who should be able to appeal moderation decisions that affect them.
  3. Content moderation decisions should be unbiased and should consistently apply the criteria that a platform's terms of service establish.

Principle 1 is the only one where FIRE believes government intervention is appropriate and constitutional (and we created a model bill to that effect). We hoped Principles 2 and 3 would be voluntarily adopted by social media platforms that wanted to promote freedom of expression.

While we don't know whether these principles influenced Meta's decision, we're pleased the promised changes align very well with FIRE's proposals for how a social media platform committed to free expression could put that commitment into practice.

Meta's changes to content moderation structures

With a candid admission that it believes 10-20% of its millions of daily content removals are mistakes, Meta announced it is taking several actions to expand freedom of expression on its platforms. The first is simplifying and scaling back its rules on the boundaries of discourse. According to Zuckerberg and Kaplan:

[Meta is] getting rid of a number of restrictions on topics like immigration, gender identity and gender that are the subject of frequent political discourse and debate. It's not right that things can be said on TV or the floor of Congress, but not on our platforms. These policy changes may take a few weeks to be fully implemented.

While this is promising in and of itself, it will be enhanced by a broad change to the automated systems for content moderation. Meta is restricting its automated flagging to only the most severe policy violations. For lesser policy violations, a user will have to manually report a post for review and possible removal. Additionally, any removal will require the agreement of multiple human reviewers.

This is consistent with our argument that AI-driven and other automated flagging systems will invariably have issues with false positives, making human review critical. Beyond removals, Meta is increasing the confidence threshold required for deboosting a post suspected of violating policy.

Who fact-checks the fact checkers?

Replacing top-down fact-checking with a bottom-up approach based on X's Community Notes feature may be the biggest change announced by Meta. As FIRE noted in the Social Media Report:

Mark Zuckerberg famously said he didn't want Facebook to be the "arbiter of truth." But, in effect, through choosing a third-party fact checker, Facebook becomes the arbiter of the arbiter of truth. Given that users do not trust social media platforms, this is unlikely to engender trust in the accuracy of fact checks.

Zuckerberg similarly said in the announcement that Meta's "fact checkers have just been too politically biased, and have destroyed more trust than they've created."

Our Social Media Report argued that the Community Notes feature is preferable to top-down fact-checking, because a community of diverse perspectives will likely be "less vulnerable to bias and easier for users to trust than top-down solutions that may reflect the biases of a much smaller number of stakeholders." Additionally, we argued labeling is more supportive of free expression, being a "more speech" alternative to removal and deboosting.

We are eager to see the results of this shift. At a minimum, experimentation and innovation in content moderation practices provide critical experience and data to guide future decisions and help platforms improve reliability, fairness, and responsiveness to users.

User trust and the appearance of bias

An overall theme in Zuckerberg and Kaplan's remarks is that biased decision-making has eroded user trust in content moderation at Meta, and these policy changes are aimed at regaining users' trust. As FIRE argued in our Social Media Report:

In the case of moderating political speech, any platform that seeks to promote free expression should develop narrow, well-defined, and consistently enforceable rules to minimize the kind of subjectivity that leads to arbitrary and unfair enforcement practices that reduce users' confidence both in platforms and in the state of free expression online.

We also argued that perception of bias and flexibility in rules encourages powerful entities like government actors to "work the refs," including through informal pressure, known as "jawboning."


Additionally, when perceived bias drives users to small, ideologically homogeneous alternative platforms, the result can damage broader discourse:

If users believe their "side" is censored unfairly, many will leave that platform for one where they believe they'll have more of a fair shake. Because the exodus is ideological in nature, it will drive banned users to new platforms where they are exposed to fewer competing ideas, leading to "group polarization," the well-documented phenomenon that like-minded groups become more extreme over time. Structures on all social media platforms contribute to polarization, but the homogeneity of alternative platforms turbocharges it.

These are real problems, and it is not clear whether Meta's plans will succeed in addressing them, but it is encouraging to see them acknowledged.

International threats to speech

Our Social Media Report expressed concern that the Digital Services Act (the broad EU regulation mandating censorship on social media far beyond what U.S. constitutional law allows) would become a lowest-common-denominator approach for social media companies, even in the United States. Zuckerberg signaled he has no intention of letting that happen, stating he planned to work with President Trump to push back on "governments around the world" that are "pushing [companies] to censor more."

While we are pleased by the implication that Meta's platforms will not change their free expression policies in America at the behest of the EU, the prospect of a social media company working with any government, including the United States government, rings alarm bells for any civil libertarian. We will watch this development closely for that reason.

FIRE has often said, and it bears repeating, that the greatest threat to freedom of expression will always come from the government, and as Zuckerberg himself notes, the government has pushed Meta to remove content.

When the rubber meets the road

Meta's commitment to promote freedom of expression on its platforms offers plenty of reasons for cautious optimism.

But we do want to emphasize caution. With free expression, there is often a large gap between stated principles and what happens when theory meets practice. As a civil liberties watchdog, our duty is to measure promise against performance.

Take, for example, our measured praise for Elon Musk's stated commitment to free expression, followed by our frequent criticism when he failed to live up to that commitment. And that criticism hasn't kept us from giving X credit when due, such as when it adopted Community Notes.

Similarly, FIRE stands ready to help Meta live up to its stated commitments to free expression. You can be sure that we will watch closely and hold the company accountable.

* Meta has donated to FIRE.
