Facebook Accused Of Showing Ads In Searches Related To White Supremacist Groups
JAKARTA - Facebook is accused of serving ads on searches related to white supremacist groups, even though such content is banned on the platform in the US. The finding was first reported by the Technology Transparency Project.
The report, which was first covered by The Washington Post, identified 119 Facebook pages and 20 Facebook groups affiliated with white supremacist organizations on the platform.
Researchers searched Facebook for 226 groups designated as hate groups or dangerous organizations by sources such as the Southern Poverty Law Center, the Anti-Defamation League, and Facebook itself, and found more than a third of them present on the platform.
The study found that although Facebook insisted the company was not profiting from hateful content, ads appeared in 40 percent of searches for those groups.
The white supremacist pages identified in the report include two dozen pages automatically generated by Facebook. The platform creates a page automatically when a user lists an interest, workplace, or business that does not already have a page.
The issue of auto-generated white supremacist business pages was previously raised in a 2020 analysis, also by the Technology Transparency Project. Among the auto-generated pages identified in the 2022 report was "Pen1 Death Squad," a reference to a white supremacist gang.
Meta spokesperson Dani Lever said the 270 groups the company has designated as white supremacist organizations are banned from Facebook, and that the company is investing in technology, staff, and research to keep the platform safe.
"We immediately resolved an issue where ads appeared in searches for terms related to illicit organizations and we're also working to fix an issue with automated page generation, which erroneously affects a small number of pages," Lever said, as quoted by The Verge.
"We will continue to work with outside experts and organizations in an effort to stay ahead of violent, hateful and terrorism-related content and remove such content from our platforms," he added.
In 2020, more than 1,000 advertisers boycotted Facebook over the platform's handling of hate speech and misinformation. That same year, civil rights auditors released a report finding that the company's decisions had resulted in a "serious setback" for civil rights.
Following the audit, Meta formed a civil rights team in 2021, which has since published updates on the status of the actions and recommendations issued by the auditors.