Facebook faced criticism after the 2016 election for the amount of fake news and false information that was widely shared on the platform in the months leading up to Election Day. An analysis from BuzzFeed found that in the three months before the election, the 20 top-performing false election stories generated more engagement on Facebook than the 20 top-performing election stories from major news outlets.
Since the election nearly two years ago, Facebook has made visible efforts to combat misinformation. In December 2017, Facebook released the video below summarizing how it fights false news.
Facebook's strategy is simple: remove, reduce and inform.
Facebook has both ad policies and community standards, and its goal is to remove posts that violate those guidelines. Posts flagged as fake news frequently break rules against spam, hate speech, or fake accounts.
Much of the fake content shared on Facebook is financially motivated, and by removing that content, Facebook makes the sites behind it unprofitable. Third-party fact-checkers are a large part of this strategy; they are verified through the International Fact-Checking Network. When fact-checkers rate a story as false, the post appears lower in users' news feeds. Facebook claims this reduces future views by more than 80%.
Finally, Facebook wants users to be aware of misinformation and how to report it. In the first image below, the "About This Article" button can be clicked to learn more about the article's source. In a separate effort, if a user tries to share false information, Facebook alerts the user that additional reporting from a fact-checker is available and may be worth reading before reposting the content, as shown in the second image below.
Source: Facebook
Source: Facebook
Facebook has no easy task ahead of it. As a website dedicated to user-generated content, it faces increasing pressure to lead the fight against the spread of fake news. The company has already taken steps to reduce the amount of false information on the platform and to teach users how to recognize fake news. Just last week, Facebook announced it is working on technology to better detect uploaded photos and videos that have been manipulated. While it has not figured out how to fully eliminate misinformation on its platform, Facebook has clearly taken many steps in the right direction since the 2016 election.