Facebook reportedly told a user that it had removed hate speech she reported when it had not.
According to the BBC, a Facebook user named Janet (the U.K. news organization withheld her last name for privacy reasons) had reported posts in a 54,000-member group named “LARGEST GROUP EVER! We need 10000000 members to Make America Great Again!”
“Facebook has been promoting themselves in my Newsfeed saying they are trying to keep our democracy safe by eliminating content that is false and divisive,” Janet told the BBC.
The user, who lives in Las Vegas, reported the group for anti-Muslim and anti-immigrant posts. After reporting them, Janet received a message that said: “We removed both the group and all its posts, including the one you reported.”
But that was not the case.
Facebook told the BBC that it is looking into a potential glitch in its content moderation system. The bug reportedly sends users a message saying reported content has been removed even when moderators have deemed it permissible and left it on the social network.
“We are investigating this issue, and will share more information as soon as we can,” Facebook told the BBC.
Fox News reached out to Facebook for comment on this issue.
“If they are sending me notices they removed the content and offensive groups but in reality are not, doesn’t this go against what they say in public or to Congress?” Janet added.
Facebook, along with Twitter and Google, has faced intense scrutiny over the past year for its policies on content moderation and removal. Mark Zuckerberg’s company has been accused of censoring conservatives, censoring liberals and failing to adequately protect the people it hires to monitor the deluge of content posted to the platform.