Revealed: the Facebook loophole that lets world leaders deceive and harass their citizens

Julia Carrie Wong

Facebook has repeatedly allowed world leaders and politicians to use its platform to deceive the public or harass opponents despite being alerted to evidence of the wrongdoing.

The Guardian has seen extensive internal documentation showing how Facebook handled more than 30 cases across 25 countries of politically manipulative behavior that was proactively detected by company staff.

The investigation shows how Facebook has allowed major abuses of its platform in poor, small and non-western countries in order to prioritize addressing abuses that attract media attention or affect the US and other wealthy countries. The company acted quickly to address political manipulation affecting countries such as the US, Taiwan, South Korea and Poland, while moving slowly or not at all on cases in Afghanistan, Iraq, Mongolia, Mexico and much of Latin America.



Facebook pledged to combat state-backed political manipulation of its platform after the historic fiasco of the 2016 US election, when Russian agents used inauthentic Facebook accounts to deceive and divide American voters.

But the company has repeatedly failed to take timely action when presented with evidence of rampant manipulation and abuse of its tools by political leaders around the world.

Facebook fired Sophie Zhang, the data scientist who detected and documented many of these cases, for poor performance in September 2020. On her final day, she published a 7,800-word farewell memo describing how she had “found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry” and lambasting the company for its failure to address the abuses. “I know that I have blood on my hands by now,” she wrote. News of the memo was first reported in September by BuzzFeed News.

Zhang is coming forward now in the hopes that her disclosures will force Facebook to reckon with its impact on the rest of the world.

“Facebook doesn’t have a strong incentive to deal with this, except the fear that someone might leak it and make a big fuss, which is what I’m doing,” she told the Guardian. “The whole point of inauthentic activity is not to be found. You can’t fix something unless you know that it exists.”

With 2.8 billion users, Facebook plays a dominant role in the political discourse of nearly every country in the world. But the platform’s algorithms and features can be manipulated to distort political debate.

One way to do this is by creating fake “engagement” – likes, comments, shares and reactions – using inauthentic or compromised Facebook accounts. In addition to shaping public perception of a political leader’s popularity, fake engagement can affect Facebook’s all-important news feed algorithm. Successfully gaming the algorithm can make the difference between reaching an audience of millions – or shouting into the wind.

Zhang was hired by Facebook in January 2018 to work on the team dedicated to rooting out fake engagement. She found that the vast majority of fake engagement appeared on posts by individuals, businesses or brands, but that it was also being used on what Facebook called “civic” – ie political – targets.

This method of acquiring fake engagement, which Zhang calls “Page abuse”, was made possible by a loophole in Facebook’s policies. The company requires user accounts to be authentic and bars users from having more than one, but it has no comparable rules for Pages, which can like, share and comment in much the same way that accounts can.

The loophole has remained open due to a lack of enforcement, and it appears to be in active use by the ruling party of Azerbaijan, which has left millions of harassing comments on the Facebook Pages of independent news outlets and Azerbaijani opposition politicians.



