This blog originally appeared on Daily Kos.

Hundreds of thousands of people signed petitions imploring the Facebook Oversight Board, an assortment of luminaries assembled by the social media giant to offload hard decisions, to make permanent the indefinite suspension of the former president from Facebook. Bowing somewhat to the overwhelming public pressure, the Oversight Board declined to reinstate Trump and punted the decision back to the company, while calling out the company's wildly inconsistent decisions and lack of meaningful policy. The company has six months to issue another decision.

This “pause” satisfied pretty much no one, but it avoided a return to the full-on disinformation blizzard Trump perfected on Facebook: a stew of paid microtargeted advertising, amplification bots, private groups and manipulation of Facebook’s opaque algorithmic formulas.

But whatever happens with the final decision on Trump’s return or permanent suspension, it’s only a matter of time until another populist wannabe dictator uses the same formula. We can’t ban our way out of the right-wing disinformation echo chamber and its dangerous impacts on the democratic process.

At the Facebook Users Union (https://facebookusers.org), a project of two Northern California nonprofits, Global Exchange and Media Alliance, we’ve been all in on the Ban Trump effort, but we know that’s only the beginning. The Facebook Users Union aims to leverage the power of us, the Facebook users who create the content and provide the eyeballs and clicks, to have a say in how the world’s largest social media platform affects us and the world we live in. The purpose of a union is to build collective power to negotiate. We have power together that we could never have alone. We believe that users together can demand, and win, significant changes in how Facebook does business that will make the platform less toxic and less dangerous. Because there’s no Facebook without Facebook users.

Let’s face it. The government is the government. Having the federal government meddle in public speech creates a world of possible trouble. To the extent social media functions as a part of the fourth estate, it needs independence from government-driven censorship or overly aggressive interventions that infringe on rights of dissent. With what we know about law enforcement monitoring of social media, we have to be careful what we wish for.

Similarly, I can’t think of anyone who wants a tech bro telling us what’s true and what’s false, nor any of the terminally confused contractors who regularly put users in “Facebook jail” for reasons that often seem opaque, bewildering or flat-out wrong.

But there are ways to tamp down the algorithmic frenzies that shoot sensationalistic disinformation around the platform at the speed of light. One of the most interesting is called a “circuit breaker,” which targets with laser precision the small quantity of content that displays patterns of viral, algorithmically driven spread.

A circuit breaker is an electrical switch that flips automatically when excess current threatens to damage a circuit. Stock exchanges use circuit breakers when there’s a dramatic drop, pausing trading so traders can assimilate new information and make better-informed choices. Basically, they avoid a “run on a stock” based on rumors that may or may not be true. By slowing down the pace of trading, they let reality catch up and sometimes prevent financial ruin based on lies and disinformation.

In social media, a small amount of content becomes “super-spreader” content. Almost entirely right-wing in origin, super-spreader content benefits from a well-developed artificial amplification network that works to game algorithmic formulas for maximum reach as quickly as possible. Facebook’s algorithmic amplification formulas prioritize controversy, sensationalism and anger, and feed on personal vulnerabilities to propel virulent content that includes incitements to violence, hate speech, and medical and election-related disinformation.

A circuit breaker would flip automatically when a super-spreader post is heading into tens of thousands of people’s newsfeeds at an unsafe speed.

Once the circuit breaker is triggered, the post would be “paused” until it’s reviewed by a human being. The Working Group on Infodemics Policy Framework recommends that the circuit breaker would temporarily prevent the content from algorithmic amplification in newsfeeds, appearing in trending topics, or via other algorithmically aggregated and promoted avenues. Individual posting or message sharing could still occur. (See Working Group on Infodemics Policy Framework, page 80.)
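To make the mechanism concrete, here’s a minimal sketch in Python of how a social media circuit breaker like the one the Working Group describes could work. Everything here is an assumption for illustration: the thresholds, the class and method names, and the sliding-window design are all hypothetical, not Facebook’s actual system or the Working Group’s specification.

```python
from collections import defaultdict, deque
import time

# Hypothetical thresholds -- a real platform would tune these carefully.
SHARE_RATE_LIMIT = 1000   # shares per window before the breaker trips
WINDOW_SECONDS = 3600     # length of the sliding window

class CircuitBreaker:
    """Pauses algorithmic amplification of a post whose share rate
    exceeds a threshold, pending human review. Direct person-to-person
    sharing is never blocked -- only algorithmic promotion is paused."""

    def __init__(self, rate_limit=SHARE_RATE_LIMIT, window=WINDOW_SECONDS):
        self.rate_limit = rate_limit
        self.window = window
        self.share_times = defaultdict(deque)  # post_id -> share timestamps
        self.paused = set()                    # posts awaiting human review

    def record_share(self, post_id, now=None):
        """Log one share event; trip the breaker if the rate is unsafe."""
        now = time.time() if now is None else now
        times = self.share_times[post_id]
        times.append(now)
        # Drop share events that have fallen out of the sliding window.
        while times and now - times[0] > self.window:
            times.popleft()
        if len(times) > self.rate_limit:
            self.paused.add(post_id)

    def allow_amplification(self, post_id):
        """Newsfeed ranking and trending topics would check this
        before algorithmically promoting a post."""
        return post_id not in self.paused

    def human_review(self, post_id, approved):
        """A human reviewer either restores amplification or leaves
        the post paused (individual sharing continues either way)."""
        if approved:
            self.paused.discard(post_id)
```

The key design point, matching the Working Group’s recommendation, is that tripping the breaker only gates the platform’s own promotion machinery; individual posting and message sharing still occur, and a human decision, not the algorithm, determines whether amplification resumes.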

The rub, of course, is what human review looks like. Facebook’s existing fact-checking program would need to be better resourced and purposefully diversified so that content in all languages could be fact checked in a timely way. It should also expand to include a significant number of alternative and independent media outlets. Expanding the make-up of Facebook’s independent fact checkers will ensure that diverse viewpoints are not unfairly restricted.

Calls to delete Facebook are everywhere, but the platform’s user base is still growing at over 7% per year, which means its impact on our world isn’t getting smaller.

If you’d like to help force Facebook to voluntarily implement changes to reduce its toxic footprint, join us at the Facebook Users Union (https://facebookusers.org).