Meta took down more than five million Facebook posts by Filipino users during the May elections in the Philippines for violating its rules on violence, the company said as it released figures on actions taken against harmful content.
The social network also observed large-scale "inauthentic behavior" on its platforms, referring to fake accounts spamming election-related posts, and removed 15,000 accounts through its automated systems for violating rules against inauthentic behavior and the use of fake accounts.
Facebook noted that its automated systems were able to take down the said accounts thanks to earlier work by its investigative teams ahead of the elections. Meta monitored content from the Philippines in the four months leading up to the May 9 polls and the week after.
The bulk of the removed posts, or about 5 million, were flagged for breaching the platform's violence and incitement policies, said Melissa Chin, Meta's content policy manager for Asia-Pacific. The investigative teams took down more than 10,000 accounts, with insights from these takedowns applied to its automated systems.
The fake accounts were being used to inflate the distribution of election-related content, including some that would use politics merely to get people's attention. The removed posts also involved "high-severity violence": threats that could lead to serious injury and statements expressing violence related to voting, voter registration or election results.
Meta's director for global threat disruption, David Agranovich, explained in an online press call that "some were not inherently political actors, and were people trying to make money by using election-related topics alongside other topics such as sports and entertainment to get people to click through links and go to their websites, potentially to sell something."
Among the posts deleted were attacks on public figures using "severe sexualizing content, negative physical descriptions tagged to, mentioned or posted on the public figure's account." From January 9 to May 16, 2022, the company took action on more than 5 million pieces of content for violating its violence and incitement policies on both Facebook and Instagram in the Philippines.
Over 670,000 pieces of content received action for violating hate speech policies, while another 550,000 received action for violating bullying and harassment policies. Some posts were removed for containing hate speech, which Meta defined as a "direct attack on people based on their protected characteristics," referring to race, ethnicity, religious affiliation, gender identity, sexual orientation or serious disease.