Monday, September 17, 2018

What is Facebook Doing to Combat Fake News?

Natalie Butko
nb861214@ohio.edu

Facebook faced criticism after the 2016 election for the volume of fake news and false information that was widely shared on the platform in the months leading up to Election Day. An analysis from BuzzFeed showed that in the three months before the election, the 20 top-performing false election stories generated more engagement on Facebook than the 20 top-performing election stories from major news outlets.

Since the election nearly two years ago, Facebook has made a visible effort to combat misinformation. In December 2017, it released a video summarizing how it fights false news.


The video appears on Facebook's Help Center, on a page dedicated to the fight against false news. The page says, "We're committed to fighting the spread of false news on Facebook. We use both technology and human review to remove fake accounts, promote news literacy and disrupt the financial incentives of spammers. In certain countries, we also work with third-party fact-checkers who are certified through the non-partisan International Fact-Checking Network to help identify and review false news."

Facebook's strategy is simple: remove, reduce and inform.

Facebook maintains both ad policies and community standards, and its goal is to remove posts that violate them. Posts flagged as fake news most often violate rules against spam, hate speech or fake accounts.

Much of the fake content shared on Facebook is financially motivated, and by removing that content, Facebook makes the sites behind it unprofitable. Third-party fact-checkers, certified through the International Fact-Checking Network, are a large part of this strategy. When a fact-checker rates a story as false, the post appears lower in users' News Feeds; Facebook claims this reduces future views by more than 80%.
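Facebook has not published its ranking code, but the mechanism can be pictured as a penalty applied to a post's feed-ranking score. The sketch below is a minimal, hypothetical illustration in Python: every name and number in it is an assumption, with the penalty factor chosen only to match the claimed 80% reduction in views.

# Hypothetical sketch of fact-check down-ranking; not Facebook's actual code.
# All names, structure, and the penalty factor are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    base_score: float          # relevance score from the normal ranking model
    rated_false: bool = False  # set when a certified fact-checker rates it false

# A penalty of 0.2 would cut a post's reach by roughly 80%,
# in line with the reduction Facebook claims.
FALSE_RATING_PENALTY = 0.2

def ranking_score(post: Post) -> float:
    """Return the feed-ranking score, demoting fact-checked false posts."""
    if post.rated_false:
        return post.base_score * FALSE_RATING_PENALTY
    return post.base_score

posts = [
    Post("real-news", base_score=0.9),
    Post("false-story", base_score=0.95, rated_false=True),
]

# Higher scores surface first in the feed; the false story drops well below.
feed = sorted(posts, key=ranking_score, reverse=True)
print([p.post_id for p in feed])  # ['real-news', 'false-story']

A real ranking system weighs many more signals, but the demotion idea is the same: rated-false posts are not deleted, they just surface far less often.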

Finally, Facebook wants users to recognize misinformation and know how to report it. In the first image below, clicking "About This Article" reveals more information about the article's source. In a separate effort, if a user tries to share a story that has been rated false, Facebook alerts the user that additional reporting from a fact-checker is available and worth reading before sharing. This can be seen in the second image below.

[Image: the "About This Article" button showing context about a story's source. Source: Facebook]
[Image: the alert shown before a user shares a story disputed by fact-checkers. Source: Facebook]
Back on Facebook's Help Center, another page is titled "How are we working to secure Facebook during elections?" It promotes Facebook's three initiatives for future elections: increasing transparency, strengthening enforcement and supporting an informed community. Elements of this effort include identifying advertisements, verifying pages, removing fake accounts and increasing the number of people working in security to 10,000.

Facebook has no easy task ahead of it. As a platform built on user-generated content, it faces increasing pressure to lead the fight against the spread of fake news. It has already taken steps to reduce the amount of false information on the site and to show users how to recognize fake news. Just last week, Facebook announced it is working on technology to better detect uploaded photos and videos that have been manipulated (one common building block for this kind of detection is sketched below). While Facebook has not figured out how to fully eliminate misinformation on its platform, it has clearly taken many steps in the right direction since the 2016 election.
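Facebook has not said how its manipulated-media detection works, so the following is only a sketch of one generic technique, perceptual hashing, which fingerprints an image so that lightly edited copies still produce similar fingerprints. It uses the open-source Pillow and imagehash Python libraries; the file names and threshold here are illustrative assumptions, not anything Facebook has described.

# Illustrative sketch of perceptual hashing for flagging altered images;
# a generic technique, not Facebook's actual detection system.
# Requires the third-party Pillow and imagehash packages.
from PIL import Image
import imagehash

def is_near_duplicate(path_a: str, path_b: str, threshold: int = 10) -> bool:
    """Compare two images by perceptual hash.

    A small Hamming distance between hashes means the images are visually
    similar, which can flag a lightly edited copy of a known false image.
    """
    hash_a = imagehash.phash(Image.open(path_a))
    hash_b = imagehash.phash(Image.open(path_b))
    return (hash_a - hash_b) <= threshold  # subtraction gives Hamming distance

# Example: compare a newly uploaded photo against a known debunked image.
# if is_near_duplicate("upload.jpg", "known_false.jpg"):
#     print("Send to fact-checkers for review")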
