Top 10 worst moments in a bad year for Facebook

Facebook had a terrible year in 2021. Here are some of the highlights. Hopefully in 2022, we can wrest this company away from its incompetent leadership (#FireZuck) and stop it from harming health, safety, human rights and democracy.

If anyone can do it, it’s Facebook users.

1. Facebook’s key role in the January 6th insurrection is revealed

2. ProPublica exposes that Sheryl Sandberg is “fine with” Facebook silencing opponents of authoritarian governments

3. Facebook admits its most-viewed article in early 2021 raised doubts about COVID vaccines

4. Biden administration tells Facebook its COVID lies are killing people

5. Facebook critic Maria Ressa of the Philippines wins Nobel Peace Prize

6. Wall Street Journal releases the bombshell “Facebook Files”
Instagram is toxic for teenage girls; high-profile FB users are exempt from the site’s rules; Facebook is used for human trafficking; and more

7. Facebook whistleblower speaks out, files SEC complaint

8. News consortium publishes “Facebook Papers” based on leaked documents

9. Rohingya Muslims sue Facebook for amplifying hate speech that incited violence in Myanmar

10. Facebook voted worst company of the year

Have any other bad Facebook moments you want to add? Post a comment.

 ********

More background on Facebook’s really bad 2021

Avaaz: Facebook: From elections to insurrection

Book: An Ugly Truth: Inside Facebook’s Battle for Domination

BuzzFeed: “Mark Changed The Rules”: How Facebook Went Easy On Alex Jones And Other Right-Wing Figures

Center for Countering Digital Hate: The Disinformation Dozen: Why platforms must act on twelve leading online anti-vaxxers

Global Project Against Hate and Extremism: Democracies Under Threat — How Loopholes for Trump’s Social Media Enabled The Global Rise of Far-Right Extremism

Using antitrust tools to rein in the new robber barons

Over a century ago, Americans looked around at the robber barons who were laying waste to the public commons and invoked antitrust theory to break up businesses that had gotten too big and too dominant.

Since those days, antitrust has been less popular, and the mantra to “let businesses innovate” has carried the day. And innovate they have, nowhere more than in technology, which has created a hugely robust sector dominated by a handful of wealthy entrepreneurs who, after they innovated, wiped out any and all competition. It’s a new round of robber barons.

With classic antitrust in the garbage can, a Congressional committee led by Rep. David Cicilline (D-RI) and staffed by Lina Khan, now chair of the FTC, set out to resurrect antitrust and apply it to the 21st century. The massive report and the legislation it generated came before the House Judiciary Committee in June.

But not all our representatives were aboard the train.

We urgently need to email U.S. Representatives Zoe Lofgren, Eric Swalwell, and Lou Correa. These California Democrats voted with Big Tech and against online platform accountability during the Judiciary Committee hearing on antitrust legislation designed to rein in the power and influence of companies like Facebook, Google and Amazon.

The bills, which would increase consumers’ ability to protect their private data and more easily switch from one online platform to another, did pass out of committee. Most Democrats and some Republicans voted for them. But the tech industry lobbied hard against them, and California lawmakers caved in to this pressure. It’s not just California lawmakers who are susceptible to Big Tech’s charms: After the bills passed out of committee, House Majority Leader Steny Hoyer (D-MD) announced that they were not ready to be brought to the floor of the House and that Congress’s approach to the tech industry should be “constructive, not destructive.”

We need to make sure Majority Leader Hoyer and Congress Members Lofgren, Swalwell, and Correa hear loud and clear that siding with Big Tech over their constituents and the public interest is not acceptable.

Congress isn’t the only part of the government that needs to catch up with the return of the robber barons. The courts also threw out the antitrust case against Facebook filed by the FTC and the Attorneys General of 48 of the 50 states.

It can be hard to recognize the same old problem in its brand new guise. We need to keep talking about antitrust—and regulation—which are the tools to keep robber barons, both old and new, in check.

Ban Donald Trump from Facebook. Then what?

This blog originally appeared on Daily Kos.

Hundreds of thousands of people signed petitions imploring the Facebook Oversight Board, an assortment of luminaries assembled by the social media giant to offload hard decisions, to make permanent an indefinite suspension of the former president from Facebook. Bowing somewhat to the overwhelming public pressure, the Oversight Board declined to reinstate Trump and punted the decision back to the company, while calling out its wildly inconsistent decisions and lack of meaningful policy. The company has six months to issue another decision.

This “pause” satisfied pretty much no one, but avoided a return to the full-on disinformation blizzard Trump perfected on Facebook with an intermixed stew of paid microtargeted advertising, amplification bots, private groups and manipulation of Facebook’s opaque algorithmic formulas.

But whatever happens with the final decision on the Trump return or permanent suspension, it’s only a matter of time until another populist wanna-be dictator uses the same formula. We can’t ban our way out of the right-wing disinformation echo chamber and its dangerous impacts on democratic process.

At the Facebook Users Union (https://facebookusers.org), a project of two Northern California nonprofits Global Exchange and Media Alliance, we’ve been all in on the Ban Trump effort, but we know that’s only the beginning. The Facebook Users Union aims to leverage the power of us, the Facebook users who create the content and provide the eyeballs and clicks, to have a say in how the world’s largest social media platform impacts us and the world we live in. The purpose of a union is to build collective power to negotiate. We have power together that we could never have alone. We believe that users together can demand, and win, significant changes in how Facebook does business that will make the platform less toxic and less dangerous. Because there’s no Facebook without Facebook users.

Let’s face it. The government is the government. Having the federal government meddle in public speech creates a world of possible trouble. To the extent social media functions as a part of the fourth estate, it needs independence from government-driven censorship or overly aggressive interventions that infringe on rights of dissent. With what we know about law enforcement monitoring of social media, we have to be careful what we wish for.

Similarly, I can’t think of anyone who wants a tech bro telling us what’s true and what’s false, nor any of the terminally confused contractors who regularly put users in “Facebook jail” for reasons that often seem opaque, bewildering or flat-out wrong.

But there are ways to tamp down the algorithmic frenzies that shoot sensationalistic disinformation around the platform at the speed of light. One of the most interesting is called a “circuit breaker,” which targets with laser precision the small quantity of content that displays patterns of viral, algorithmically driven spread.

A circuit breaker is an electrical switch that automatically gets flipped when excess electrical current is threatening damage to an electrical circuit. Circuit breakers are used in the stock exchange when there’s a dramatic drop, to pause trading and allow traders to assimilate new information and make better-informed choices. Basically, they avoid a “run on a stock” based on rumors that may or may not be true. By slowing down the pace of trading, they let reality catch up and sometimes prevent financial ruin based on lies and disinformation.

In social media, a small amount of content becomes “super-spreader” content. Almost entirely right wing in origin, super-spreader content benefits from a well-developed artificial amplification network that works to game algorithmic formulas for maximum reach as quickly as possible. Facebook’s algorithmic amplification formulas prioritize controversy, sensationalism and anger and feed on personal vulnerabilities to propel virulent content that includes incitements to violence, hate speech and medical and election-related disinformation.

A circuit breaker would automatically be flipped when a superspreading post is heading to tens of thousands of people’s newsfeeds at an unsafe speed.

Once the circuit breaker is triggered, the post would be “paused” until it’s reviewed by a human being. The Working Group on Infodemics Policy Framework recommends that the circuit breaker would temporarily prevent the content from algorithmic amplification in newsfeeds, appearing in trending topics, or via other algorithmically aggregated and promoted avenues. Individual posting or message sharing could still occur. (See Working Group on Infodemics Policy Framework, page 80.)

The rub, of course, is what human review looks like. Facebook’s existing fact-checking program would need to be better resourced and purposefully diversified so that content in all languages could be fact checked in a timely way. It should also expand to include a significant number of alternative and independent media outlets. Expanding the make-up of Facebook’s independent fact checkers will ensure that diverse viewpoints are not unfairly restricted.

Calls to delete Facebook are everywhere, but the platform’s user base is still growing at over 7% per year, which means its impact on our world isn’t getting smaller.

If you’d like to help force Facebook to voluntarily implement changes to reduce its toxic footprint, join us at the Facebook Users Union (https://facebookusers.org).

Tell Facebook: Flip the switch on hate and lies

Facebook should create a “circuit breaker” to help prevent dangerous disinformation and incitements to violence from ever reaching a mass audience. By the time millions of people have shared false information, it’s already too late. A simple circuit breaker could have stopped disinformation about the pandemic and about the 2020 elections from reaching millions of people.

Disinformation and incitements to violence are nothing new. What is new is online platforms that spread controversial content at lightning speed, before anyone’s had a chance to check it for dangerous disinformation or incitement to hatred and violence.

That’s how the verifiably false Plandemic video spread to more than 8 million people in one week before social media companies took it down.

We deserve social media platforms free from attacks on our elections, free from pandemic misinformation, and free from threats to our physical safety.

Circuit breakers are used in the stock exchange when there’s a dramatic drop, to pause trading and allow traders to assimilate new information and make better-informed choices. Social media circuit breaking would work the same way, pausing the highest-volume posts for fact-checking before they reach millions of people.

Facebook can and should build and flip the switch! As Facebook users, we demand Facebook enact a circuit breaker policy to stem the flow of hate and lies!

Background

Why we need a circuit breaker

Disinformation and incitements to violence are nothing new. What is new is online platforms that spread controversial content at lightning speed, before anyone’s had a chance to check it for dangerous disinformation or incitement to hatred and violence.

In this new reality, social media companies’ failure to slow down superspreading content has the potential to destabilize our political system, polarize our population, undermine public health, and threaten our physical safety.

A circuit breaker is an electrical switch that automatically gets flipped when excess electrical current is threatening damage to an electrical circuit. Stock markets have borrowed the circuit breaker concept to halt trading when there’s a dramatic drop in the value of the stock market.

Social media companies like Facebook could and should implement a circuit breaker to pause viral content while it’s checked for dangerous disinformation and incitements of hatred and violence.

While a circuit breaker on its own won’t eliminate the torrent of lies and hate that Facebook has allowed to inundate its platform, it is a crucial first step.

Slowing down superspreader content goes beyond the free speech vs. censorship dichotomy

The speed with which disinformation can spread on social media platforms outstrips conventional conversations about unfettered free speech vs. censorship. Government censorship of speech is always a problem and always a threat to democracy, but in the 21st century it is no longer the only threat we face.

Powerful and ubiquitous social media companies, chief among them Facebook, have the power to undermine human rights and democratic structures by placing their engagement metrics and advertising profits above the public interest.

Circuit breakers restore free speech to the real meaning of the expression, which never included the right to viral or algorithmic amplification. They would allow superspreader social media posts to remain posted on an individual’s profile but would prevent them from being algorithmically amplified until they are fact checked by a human being.

Facebook’s algorithmic amplification formulas prioritize controversy, sensationalism and anger and feed on personal vulnerabilities to propel virulent content. None of us want Mark Zuckerberg to decide what is true, what is false, what is hateful and what is not. But the time for analysis is not after the social media platform’s tools have delivered the content to millions of people. By then, it is too late.

How a circuit breaker would work

Stock exchanges use circuit breakers to pause trading during periods of high volatility. During this pause, traders are able to assimilate new information that will help them make informed choices when the stock exchange reopens. Social media circuit breaking would work the same way.

The circuit breaker would automatically be flipped when a superspreading post is heading to tens of thousands of people’s newsfeeds at an unsafe speed. Some researchers have suggested a trigger for posts with 100,000 Facebook interactions in 12 hours, which would mean roughly the top 0.01 percent of Facebook posts from public pages within the period. (See Social media platforms need to flatten the curve of dangerous misinformation by Ellen Goodman and Karen Kornbluh.)

Others have suggested that Facebook implement a circuit breaker that would be flipped under any of the following circumstances:

  • A post reaches a certain number of direct interactions and its reach is growing exponentially (similar to what’s described above).
  • A certain number of reports or complaints have been made about a post.
  • A certain number of pages with large audiences are pushing the post in a coordinated way.
  • A post is being amplified by pages that have previously been penalized by Facebook for spreading misinformation.

Only Facebook has real-time data about what the right trigger point would be to prevent lies and hate from reaching a mass audience. (See Fighting Coronavirus Misinformation and Disinformation by the Center for American Progress.)
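The suggested triggers can be sketched as a simple rule. This is a minimal illustration only: the interaction threshold and 12-hour window come from the Goodman/Kornbluh figure cited above, while the growth factor, report count, and coordinated-page count are purely hypothetical placeholders, since only Facebook has the data to set real values.

```python
from dataclasses import dataclass

# Illustrative thresholds only. The 100,000-interactions-in-12-hours figure
# follows the researchers cited in the text; the rest are hypothetical.
INTERACTION_THRESHOLD = 100_000   # interactions within the 12-hour window
GROWTH_FACTOR = 2.0               # reach doubling per window ~ "exponential"
REPORT_THRESHOLD = 500            # hypothetical complaint count
COORDINATED_PAGES = 25            # hypothetical count of large pages pushing it

@dataclass
class PostStats:
    interactions_in_window: int        # interactions in the current window
    prior_interactions_in_window: int  # interactions in the previous window
    report_count: int                  # user reports/complaints about the post
    large_pages_sharing: int           # large pages pushing it in coordination
    shared_by_penalized_page: bool     # amplified by a page penalized before

def circuit_breaker_tripped(stats: PostStats) -> bool:
    """Return True if any of the suggested trigger conditions is met.

    A tripped breaker pauses algorithmic amplification of the post
    (it does not delete it) until a human fact-checker reviews it.
    """
    exploding = (
        stats.interactions_in_window >= INTERACTION_THRESHOLD
        and stats.prior_interactions_in_window > 0
        and stats.interactions_in_window
            >= GROWTH_FACTOR * stats.prior_interactions_in_window
    )
    return (
        exploding
        or stats.report_count >= REPORT_THRESHOLD
        or stats.large_pages_sharing >= COORDINATED_PAGES
        or stats.shared_by_penalized_page
    )
```

Note that the conditions are joined with `or`: any one trigger is enough to pause amplification, which keeps the rule conservative in the direction of review rather than reach.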

Once the circuit breaker is triggered, the post would be “paused” until it’s reviewed by a human being. It would remain on the user’s profile page but not be eligible for amplification. The Working Group on Infodemics Policy Framework recommends that the circuit breaker would temporarily prevent the content from algorithmic amplification in newsfeeds, appearing in trending topics, or via other algorithmically aggregated and promoted avenues. Individual posting or message sharing could still occur. (See Working Group on Infodemics Policy Framework, page 80.)

Any post paused by the circuit breaker would be placed at the top of the list for review by Facebook and third-party fact checkers. This means Facebook’s existing fact-checking program would need to be better resourced and purposefully diversified so that content in all languages could be fact checked in a timely way. It should also expand to include a significant number of alternative and independent media outlets. Expanding the make-up of Facebook’s independent fact checkers will ensure that diverse viewpoints are not unfairly restricted.

Correcting the record by alerting Facebook users who’ve been shown verifiably false or misleading information is also crucial, but it is not as effective as preventing those lies and hate from reaching users in the first place. (Learn more from Avaaz.)

As Facebook users, we deserve to have newsfeeds in which factual content is not drowned out by virulent, sensationalistic content that contains provably false information and incitements to hatred and violence.  A circuit breaker is a policy that Facebook could easily enact as a first step toward creating a better environment for its users.

Delete Trump

We believe in free speech. But let’s get a couple of things clear. 

Donald Trump’s social media accounts are the world’s most prominent organizing tool for violent white nationalists. Some of us may have been chilled by the sight of our elected congresspeople cowering under their desks; some of us, maybe not so much.

But this is not just a political spat. A noose was set up in front of the Capitol building. “Murder the media” was scrawled on a door, and several reporters were threatened and surrounded, fearing for their lives while doing their jobs. In Los Angeles, a Black woman was beaten. There are more violent attacks being planned and no guarantee they will be limited to the stomping grounds of the political elite.

Violent white nationalism is a threat to all of us: women, LGBT/queer and nonbinary people, Black people, Muslims, immigrants, Jews, anarchists, liberals. All in the crosshairs. And all because of election lies powered by racism. We’ve been “Protesting Facebook” because words can and do hurt people, especially in a country filled with weapons and white supremacy.

The Internet should be and is open. Our electronic town square lets anyone and everyone set up a website and say what they want to say. A website called donaldjtrump.com is still there, and you can visit it any time you like.

Social media is not the Internet. Social media platforms are tools to achieve wider distribution of content with algorithmic assistance from corporations that set up these sites to maximize engagement and harvest personal information at scale to sell to advertisers. It’s the business of eyeballs, as the old advertising manuals say.

Maximizing engagement with violent racism and white supremacy is a dangerous business and one the social media moguls have been too comfortable with for some time. It’s okay to say no to the amplification of hatred, lies and racism. It’s necessary to say no to the amplification of white supremacist violence. If we can’t say no, we will replay the rise of fascism in Europe. Six million was enough.

Online social media has been a honey trap for the left, alternately exposing us to federal authorities one week and banning our content the next. The real answers to the disinformation landscape have to do with the engagement metrics and algorithms that send people spinning into the depths of right-wing radicalism without a paddle in a self-perpetuating echo chamber. Studies show that some 65% of members of white nationalist groups on Facebook were invited to join them not by people they knew, but by Facebook’s recommendations.

But we can’t address that in two weeks. So we have to put our bodies on the levers, and the gears, and all the apparatus, and make it stop. Deleting Trump is the right thing to do.

January 6th cannot be a dress rehearsal.

Tech Transparency Project: Capitol Attack Was Months in the Making on Facebook

Facebook suspended President Trump following the mob attack on Congress. But the platform allowed organizing for the pro-Trump rally, as well as the spread of conspiracy theories and militant extremism that drove the rioters.

Facebook Chief Operating Officer Sheryl Sandberg made headlines for saying the mob attack on the U.S. Capitol was “largely organized” on other platforms, suggesting Facebook had done better than others at taking down dangerous content.

Not only is that assertion false, according to research by the Tech Transparency Project (TTP), but it ignores the fact that Facebook spent the past year allowing election conspiracies and far-right militia activity to proliferate on its platform, laying the groundwork for the broader radicalization that fueled the Capitol insurrection in the first place.

Read the report.

Social justice groups square off against Facebook

Corey Hill of Global Exchange and the Protest Facebook coalition praises the Stop Hate for Profit advertiser boycott and suggests we think of it as a first step. He proposes additional demands to stop the hate and lies on Facebook, such as getting rid of microtargeting, stopping lies in political ads, and refusing service to anyone trying to disrupt the 2020 election.

Read the article on Alternet.

Protect our Democracy. Protest Facebook.

Our election is at risk–in part because of Facebook

This year’s US presidential elections are in jeopardy—in part because San Francisco Bay Area technology company Facebook refuses to take responsibility for the lies, hate, and disinformation that are being spread using its platform.

Facebook refuses to take down advertisements by US politicians that feature even the most blatant falsehoods. It freely allows microtargeting that directs disinformation at vulnerable communities while hiding it from everyone else. And its efforts to stop the spread of toxic lies and hate are too little, too late.

Please join us to say enough is enough! Facebook must immediately prevent its platform from being used to spread disinformation and divisiveness that could disrupt our elections. We’re urging Facebook to:

Stop the lies in political ads

Facebook should fact check advertisements placed by politicians and political campaigns. In other words, it should hold these political ads to the same standards it applies to advertisements that are not placed by politicians. As Facebook employees said in an open letter to company leadership, “Misinformation shared by political advertisers has an outsized detrimental impact on our community.” If you or I lied in an advertisement on Facebook, it would immediately be taken down.

Prohibit political ad microtargeting

Right now political campaigns can target their ads so precisely that they can pinpoint them to reach specific voters. The problem with this, according to former Federal Election Commission Chair Ellen Weintraub, is that “false and misleading messages may be disseminated in a way that does not allow people with conflicting information to counter those messages, because they won’t see them.” Facebook already restricts targeting for industries with a history of discrimination, like housing. They should do the same with political ads.

Refuse service to anyone trying to disrupt the 2020 election

All businesses can refuse service to a person who violates their policies. Restaurants usually require that patrons wear shoes and shirts. Facebook should require that users refrain from using the platform to disrupt the 2020 election. They can do this by implementing terms of service or acceptable use policies that require users to follow certain rules about how Facebook can be used.

And more

There’s no dearth of ideas for how to reform Facebook to protect our democracy and our civil society. Facebook’s own employees have put forward an excellent set of recommendations that include spending caps for politicians and observing silence periods free from political advertisements before elections. Another strong set of recommendations was put forward by Change the Terms, which focuses on how the platform can enforce an acceptable use policy to prevent Facebook from being used to incite hate, fear and abusive behavior. Finally, a number of organizations make the case that the Federal Trade Commission should break up Facebook, which now has unprecedented power to decide what news and information billions of people around the world see every day.

Facebook moderators open letter

In June 2020, Facebook moderators published an open letter expressing solidarity with Black Lives Matter and objecting to Facebook’s lack of action after President Trump posted a message threatening and inciting violence against Black Lives Matter demonstrators.

Read the letter on Medium.