The following letter from Mark Zuckerberg, founder and CEO of Facebook, was released to media worldwide.
A lot of you have asked what we’re doing about misinformation, so I wanted to give an update.
The bottom line is: we take misinformation seriously. Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information. We've been working on this problem for a long time, and we take the responsibility seriously. We've made significant progress, but there is more work to be done.
Historically, we have relied on our community to help us understand what is fake and what is not. Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others — like people sharing links to myth-busting sites such as Snopes — to understand which stories we can confidently classify as misinformation. Similar to clickbait, spam and scams, we penalize this content in News Feed so it’s much less likely to spread.
The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.
While the percentage of misinformation is relatively small, we have much more work ahead on our roadmap. Normally we wouldn’t share specifics about our work in progress, but given the importance of these issues and the amount of interest in this topic, I want to outline some of the projects we already have underway:
– Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.
– Easy reporting. Making it much easier for people to report stories as fake will help us catch more misinformation faster.
– Third-party verification. There are many respected fact-checking organizations, and while we have reached out to some, we plan to learn from many more.
– Warnings. We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.
– Related articles quality. We are raising the bar for stories that appear in related articles under links in News Feed.
– Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We’re looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.
– Listening. We will continue to work with journalists and others in the news industry to get their input, in particular to better understand their fact-checking systems and learn from them.
Some of these ideas will work well, and some will not. But I want you to know that we have always taken this seriously, that we understand how important the issue is for our community, and that we are committed to getting this right.