Opinion: How Facebook fights false news

By Ankhi Das, Head of Public Policy - South Asia, Facebook

Facebook was built to connect people and give them the power to build community. Around the world, we've seen Facebook used to bridge language, cultural and socioeconomic barriers, and give small business owners new economic opportunities. But we also know that connecting people can have unintended consequences.

Misinformation and false news are harmful to any community and make the world less informed. We take our responsibility to deal with misinformation and false news very seriously, and we remain invested in this responsibility across the globe, including in Sri Lanka. We want to empower people to decide for themselves what to read, trust, and share. We do so by promoting news literacy and giving people more context. To give people more control, we encourage them to tell us when they see false news. Feedback from our community is one of the many signals we use to identify potential hoaxes.

We are also working to help people in Sri Lanka learn how to spot false news so they can make more informed decisions. Last year, we launched our Digital Literacy program in Sri Lanka in partnership with Sarvodaya Fusion. Under this continuing program, 20,000 secondary school students are being trained on how to use the internet safely and responsibly.

Third-party fact-checking

In addition to our own efforts to reduce the spread of misinformation on Facebook, we're scaling our partnerships with third-party fact-checkers who are working to combat misinformation and false news in Sri Lanka. Last week, we announced our partnership with Agence France-Presse (AFP) to fact-check content on Facebook in Sri Lanka. AFP is a global partner and is certified by the non-partisan International Fact-Checking Network.

When our fact-checkers rate a story as false, we rank it significantly lower in News Feed. On average, this cuts its future views by more than 80%. The information from fact-checkers also helps improve our technology, so that we can identify potential false news faster in the future. This multi-pronged approach roots out the bad actors that frequently spread fake stories, dramatically decreases the reach of those stories, and helps people stay informed without stifling public discourse.

Other features on Facebook, such as the Context Button, give people more information about the publishers and articles they see, such as the publisher's Wikipedia entry. Related Articles displays articles from third-party fact-checkers on the same topic immediately below a story. If a fact-checker has rated a story as false, we'll let people who try to share the story know that there's more reporting on the subject.

This is some of the most important work being done at Facebook. And we know we can't do it alone, so we work with NGOs and other civil society organisations who alert us to false news spreading on the platform. The informal network we have built over the past few years has helped us identify and reduce the distribution of content that violates our Community Standards. We also work extensively with trusted partners on the ground who know the pulse of the community and who have extensive networks and local relationships.

Removing accounts and content that violate our policies

Although false news does not itself violate our Community Standards, it often violates our policies in other categories, such as spam, hate speech or fake accounts, and we remove that content. Over the past year, we've learned more about how networks of bad actors work together to spread misinformation, so we created a new policy to tackle coordinated inauthentic activity. We're also using machine learning to help our teams detect fraud and enforce our policies against spam.

Getting ahead together

Tackling misinformation and false news will always be a work in progress. Even with these steps, we know people will still come across misleading content on Facebook and on the internet more broadly. Facebook is committed to helping Sri Lanka and its communities, and we are doubling down on countering misinformation on our platform. We will keep working with the community, our partners in civil society and news publishers in Sri Lanka to ensure that people stay safe on Facebook.

(Ankhi Das is the Director of Public Policy for Facebook in India and South & Central Asia. With over 18 years of public policy and regulatory affairs experience in the technology sector, Ankhi's primary responsibilities are to lead Facebook's efforts on connectivity, Internet governance, promoting access and the Open Internet, privacy, data security, safety issues and political risk management for the company. Prior to joining Facebook in 2011, Ankhi was at Microsoft, where, as Public Policy Director, she was responsible for strategic public policy projects and managing regulatory issues for Microsoft in India.)
