JAKARTA – Facebook's track record with the content on its platform is nothing to envy. But for users inexperienced with social media tools, the platform serves up even more disturbing content, ranging from graphic violence to sexual material.
Over the past few weeks, internal research documents leaked by whistleblower Frances Haugen have revealed a series of surveys conducted by the company.
Among the revelations is the company's reluctance to address plagiarism by some popular pages and groups, an issue Facebook reportedly ignores to avoid legal complications.
Facebook has also become a hotbed for political propaganda and hateful content spread through clickbait farming. Instead of taking swift action, the company has effectively paid the perpetrators through its content monetization and advertising programs.
Now, a recent USA Today investigation claims that Facebook users who lag behind in digital literacy and social media skills are exposed to disturbing content that depicts violence and borderline nudity.
The company now known as Meta Inc conducted a user survey several years ago with the aim of analyzing the digital literacy skills of its audience.
By correlating how users answered questions about terms such as 'tagging' and other basic features with what those users saw over the previous 30 days, Facebook could measure the type of content each group was exposed to.
Users who failed to correctly answer any of the questions about Facebook's core features saw 11.4 percent more nudity and 13.4 percent more graphic violence in their feeds.
A Facebook employee who discussed the findings reportedly said that "the 'default' feed experience, so to speak, includes nudity + limit content unless controlled."
To complement the survey findings, Facebook researchers also visited 'vulnerable users' in their homes and conducted detailed interviews about how their low level of digital skills shaped their experience on the platform.
The Facebook team acknowledged that many users in this segment withdraw from the platform after seeing distressing content in their feeds that compounds the problems they are already struggling with.
For example, posts showing children being bullied, "threatening, and killing others," as well as racial tensions, appeared in the feeds of middle-aged Black women. The findings are not surprising: Facebook sparked controversy over inflammatory content in the run-up to the Capitol Hill incident earlier this year, and it continues to battle COVID-19 misinformation, hate speech, and conspiracy theories such as claims about the health effects of 5G.
For other at-risk users, such as members of a Narcotics Anonymous group, Facebook served recommendations and ads for alcoholic beverages. Coupon and savings pages, meanwhile, were flooded with posts promoting financial scams.
Sister platform Instagram is no stranger to these problems either, having recently received a stern warning for allowing the online drug trade to thrive; it has been linked to several overdose deaths in the US.
Facebook's research concludes that its content algorithm is dangerous for people inexperienced with the nooks and crannies of social media. Because these users are unaware of tools like 'hide', 'unfollow', 'block', and reporting, inappropriate content keeps appearing in their feeds.
Again, people of color, those of lower socioeconomic status, and those with less education are the most vulnerable. More importantly, between a quarter and a third of all Facebook users fall into the 'low tech skill' category, according to the social media titan's own research.