Meta is now allowing political ads on Facebook and Instagram that claim the 2020 election was rigged.
The policy was reportedly introduced quietly in 2022 after the US midterm primary elections, according to the Wall Street Journal, citing people familiar with the decision. The previous policy had prevented Republican candidates from claiming in ads during that campaign that the 2020 election, which Donald Trump lost to Joe Biden, was stolen.
Meta will now allow political advertisers to say past elections were “rigged” or “stolen”, although it still prevents them from questioning whether ongoing or future elections are legitimate.
Other social media platforms have been making changes to their policies ahead of the 2024 presidential election, for which online messaging is expected to be fiercely contested.
In August, X (formerly known as Twitter) said it would reverse its ban on political ads, originally instituted in 2019.
Earlier, in June, YouTube said it would stop removing content falsely claiming that the 2020 election or other past US presidential elections were fraudulent, reversing the stance it took after the 2020 election. It said the move aimed to safeguard the ability to “openly debate political ideas, even those that are controversial or based on disproven assumptions”.
Meta, too, reportedly weighed free-speech considerations in making its decision. The Journal reported that Nick Clegg, president of global affairs, took the position that the company should not decide whether elections were legitimate.
The Wall Street Journal reported that Donald Trump ran a Facebook ad in August that was apparently only allowed because of the new rules, in which he lied: “We won in 2016. We had a rigged election in 2020 but got more votes than any sitting president.”
The Tech Oversight Project decried the change in a statement: “We now know that Mark Zuckerberg and Meta will lie to Congress, endanger the American people, and continually threaten the future of our democracy,” said Kyle Morse, deputy executive director. “This announcement is a horrible preview of what we can expect in 2024.”
Combined with recent Meta moves to reduce the amount of political content shared organically on Facebook, the prominence of campaign ads questioning elections could rise dramatically in 2024.
“Today you can create hundreds of pieces of content in the snap of a finger and you can flood the zone,” Gina Pak, chief executive of Tech for Campaigns, a digital marketing political organization that works with Democrats, told the Journal.
Over the past year Meta has laid off about 21,000 employees, many of whom worked on election policy.
Facebook was accused of having a malign influence on the 2016 US presidential election, in which Trump beat Hillary Clinton, by failing to tackle the spread of misinformation in the runup to the vote. Fake news, such as articles slandering Clinton as a murderer or saying the pope endorsed Trump, spread on the network as non-journalists – including a cottage industry of teenagers living in Macedonia – ran false pro-Trump sites to reap advertising dollars when the stories went viral.
Trump later appropriated the term “fake news” to slander legitimate reporting of his own falsehoods.