As journalists, it has always been our job as "watchdogs" to keep the government and powerful parties in check for the well-being of the common man. Although that still holds true today, "watchdog" has taken on a whole new meaning in the online era.
Poynter.org has an article titled "How journalists verify user-generated content, information on social media" that outlines this idea well. Even though it is a summarized version of Nieman Reports' package on truth in social media, it hits enough of the main issues to be a valuable reference.
In Poynter's article, practices such as verifying and validating user-generated content, spotting photo manipulations and contacting the original source of the information are presented as essential ways to protect both yourself and the news organization you work for.
As outlined in another Poynter article, "Altered Photo a Reminder of Issues with User-Generated Content," Tribune Interactive and Poynter Online published a photo showing a boy crying in front of a tombstone with the name "Santa Claus" etched on the front. The photo was taken from a Flickr account, but the user didn't even have the rights to it: it was a stock image the user did not create, and had even said so in a comment below it.
Sometimes the truth is difficult to find beneath photo alterations, which makes the validity of the source questionable and can lead to cases of copyright infringement. However, journalists are learning to use technology to their advantage to uncover the truth about user-generated content: checking timestamps on photos, scanning for alterations, and contacting sources before publishing.
Public discussion and conversation about news articles and issues are vital to the news cycle and the spread of important information. However, when anonymous online users post nothing but racist, sexist and hurtful comments below news stories, the conversation becomes less than productive, to put it lightly. Fortunately, over the years, news outlets have changed their approaches to reader feedback to make discussion more positive and constructive for the online community.
In 2011, the American Journalism Review published an online story titled "Is Facebook the Solution to the Obnoxious Comment Plague?", outlining the concerns some news organizations had about the toxic environment in the comment sections beneath their content. To combat this, some sites implemented a system that requires a person to sign in to a Facebook account before publishing a comment. For the most part, the number of comments has dwindled under the new system, but that's not necessarily a bad thing, since the nasty, negative comments are the ones that are missing.
Although this system definitely helps filter out the garbage comments, it's not foolproof; the issue lies in handing over unspecified rights to a third-party social media company. Journalists have not yet developed editorial guidelines for this problem, and since 2011 the landscape has already changed dramatically. Now, some online news websites don't even use a comment section below their stories; they rely on comments and shares via social media to spark conversation and generate more online traffic to their sites.
Verification, validity, authenticity and accuracy are the biggest issues with online user-generated content. And as the industry changes, journalists must not only adapt to its growth but, with these values in mind, also monitor and cross-check viral information in order to bring the truth to light for their readers.