
Instagram users saw a flood of violent and sexually explicit content in their Reels before the company responded to complaints …
Parent company Meta says it has now fixed the issue, which it claims was due to a “mistake” rather than its new relaxed moderation policy.
CNBC reports that it experienced the issue first-hand.
On Wednesday night in the U.S., CNBC was able to view several posts on Instagram reels that appeared to show dead bodies, graphic injuries and violent assaults. The posts were labeled “Sensitive Content” […]
A number of Instagram users took to various social media platforms to voice concerns about a recent influx of violent and “not safe for work” content recommendations.
Some users claimed they saw such content, even with Instagram’s “Sensitive Content Control” enabled to its highest moderation setting.
In an update to the report, CNBC said Meta had apologized for the issue.
“We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake,” a Meta spokesperson said in a statement shared with CNBC.
Meta CEO Mark Zuckerberg last month said that the company was cutting back on automated checks on content, and would in the future often act only after receiving complaints from users.
Up until now, we have been using automated systems to scan for all policy violations, but this has resulted in too many mistakes and too much content being censored that shouldn’t have been. So, we’re going to continue to focus these systems on tackling illegal and high-severity violations, like terrorism, child sexual exploitation, drugs, fraud and scams. For less severe policy violations, we’re going to rely on someone reporting an issue before we take any action.
We also demote too much content that our systems predict might violate our standards. We are in the process of getting rid of most of these demotions and requiring greater confidence that the content violates for the rest. And we’re going to tune our systems to require a much higher degree of confidence before a piece of content is taken down.
Image: Meta/9to5Mac