We’re closely watching how Facebook enforces its newly announced policy that limits speech by users who are organizing public protests. This policy deserves special attention because it affects free expression on two levels: the organization of the protest itself, and speech about it. The new policy also adds to Facebook’s overall content moderation burden at a time when the company is already unable to keep up with user reports and appeals. One of Facebook’s great benefits is that it is a powerful and affordable organizing tool—which is why the potential consequences of this policy are serious.
Under our coordinating harm policy, we’re removing content that:
Advocates for in-person meetings or events against government health guidance. This does not include discussion and debate of public policy or proposed new guidance from policymakers and elected officials.
Coordinates in-person events or gatherings that encourage people with COVID-19 to join.
Although we understand Facebook’s motivations here, this policy is troubling for several reasons.
First, as with all content removal decisions, Facebook has still not fully implemented the Santa Clara Principles, and is currently failing to provide sufficient avenues to appeal removal decisions.
Second, even under the best circumstances, content moderation is extremely difficult to do well. It is full of tricky context, huge grey areas, and impossible line-drawing. Facebook moderates an enormous volume of content—with hugely problematic results, and not for lack of effort. This policy is no less complex; indeed, it may prove even more difficult to implement fairly and consistently.
We had initially hoped Facebook would base these decisions on an objective measure like state or local law; Facebook had previously said they were consulting with local governments and “unless government prohibits the event during this time, we allow it to be organized on Facebook.”
But the written policy Facebook has adopted instead uses the hazier standard of “government health guidance,” which seems to cover gatherings that are discouraged though not legally prohibited. Granted, even relying on law would involve a thousand difficult decisions. But laws at least aim for some level of precision. Under US law, for example, laws that restrict First Amendment-protected activity like protests must meet an even higher level of precision: complete limits on gatherings, irrespective of content, must be justified by an important governmental interest and leave open ample means of communication; limits on some but not all gatherings, including those that disfavor expressive gatherings like protests, likely have to be proven necessary and the least speech-restrictive means of advancing a critical public interest. Guidance, by contrast, being hortatory and unenforceable, carries no such requirement of precision.
But even if such standards were precise, the task Facebook has burdened itself with is daunting. There are thousands of these guidelines in the US alone—and thousands more worldwide—with many variations among them. What is permissible in one town may, because of slight variations in the guidelines, be impermissible in the next town over.
And compounding this and every content moderation-related problem now is the increased use of automated decision-making. As we’ve previously written, the current uptick in automated content moderation is understandable given the circumstances, but nevertheless troubling—and must be treated as a temporary measure to be rolled back as soon as it’s safe for moderators to return to work.
Realistically, Facebook is probably targeting mainly events that specifically urge flouting social distancing rules—those that call for prohibited crowding and defiance of public health measures. Perhaps Facebook wrote the policy to be fastidiously viewpoint-neutral, rightly wanting to avoid a policy that banned only protests against shelter-in-place rules. The company seems to address this by explaining that the policy does not restrict “discussion and debate of public policy or proposed new guidance from policymakers and elected officials.” But if that is the case, we should still be concerned by the existence of a broadly written policy when only limited enforcement is envisioned.
COVID-19 is compounding the impossibilities of content moderation in numerous ways. This events policy is one more example, and it is sure to result in mistakes and angry users. That’s why we recommend that companies roll back any policy changes made in light of COVID-19 once they are no longer needed.