In a podcast interview with tech website Recode on Wednesday, Zuckerberg said that while Facebook was dedicated to stopping the spread of fake news, certain beliefs that were sincerely held would not be taken down.
After the remarks caused a backlash on social media, he was forced to backtrack, saying that any post advocating violence or hate against a group would be removed.
The controversy began when Zuckerberg provided an unprompted example of Holocaust deniers to Recode host Kara Swisher to make a point about allowing hoaxes to be published on the site.
He said that messages accusing victims of the Sandy Hook school shooting of being liars would be taken down for harassment, but added that not all factually incorrect posts would receive the same treatment.
“I’m Jewish, and there’s a set of people who deny that the Holocaust happened,” he said.
“I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong.”
After Swisher interjected that Holocaust deniers may indeed be motivated by malign intent, Zuckerberg continued:
“It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too.”
The comments caused a stir, with many seeing Zuckerberg’s foray into the contentious debate as problematic.
Zuckerberg later emailed Recode to clarify, stating that if something is spreading and rated as false by the site’s fact checkers, “it would lose the vast majority of its distribution in News Feed.
“And of course if a post crossed (the) line into advocating for violence or hate against a particular group, it would be removed.”
The episode was an unwelcome distraction for Facebook after it held a briefing on the company’s new policy to remove bogus posts likely to spark violence.
The new tactic being rolled out across the global social network was tested in Sri Lanka, which was recently rocked by inter-religious violence over false information posted on the platform.
A spokesperson said Facebook may remove inaccurate or misleading content, such as doctored photos, created or shared in order to stir up or inflame volatile situations in the real world.
Hate speech and threats deemed credible are violations of Facebook rules, and are removed.
The new policy goes a step further, removing content that may not be explicitly violent but seems likely to encourage such behavior.