By: Anthony Suszczynski
as309714@ohio.edu
It is imperative that journalists tell the truth when writing and reporting news stories. Their reputations depend on it, but more importantly, the public’s perception of news events is formed by what people read and hear in the news. As I read and watched the materials from this week, I understood just how important the truth is.
Journalists such as Jayson Blair have been caught reporting dishonestly. Sometimes news events are very serious and tragic, and any altering of the story is disrespectful to those who were a part of that news event and had to live through it. Ethics must be a part of a journalist’s everyday life.
Jayson Blair is just one example of false or unethical reporting. Others are discussed in the articles and slides, including Jack Kelley, Mitch Albom, and Stephen Glass. Today, however, “fake news” has taken on an entirely new meaning. Social media has made it incredibly easy to spread fake news. One example is politically motivated articles that are circulated to alter people’s political opinions, such as the story that Pope Francis endorsed President Trump. That simply did not happen, yet it was shared on Facebook many times.
As we are on the topic of fake news, I believe it is important to examine a word that I heard for the first time only within the past few weeks: deepfakes. In my own words, a deepfake is an image, video, or audio clip that has been altered or generated by a computer, most often involving a real person. Technology now allows computers to put an individual on screen and produce video and/or audio of them saying or doing something they have never actually said or done. I’ll give you an example.
Just the other week, I was scrolling through Twitter when I saw a tweet from Joe Rogan discussing deepfakes. Joe Rogan hosts one of the most popular podcasts in the world, and that makes him a particularly easy target for deepfake audio. Because there are thousands of hours of him talking on his podcast, computers were able to generate audio that sounds nearly identical to his real voice. The more audio there is of someone, the easier it is for this technology to work. When I heard the recording, I was shocked. It sounds just like him! Here is the audio that is NOT Joe Rogan talking. A warning to readers: the audio contains some colorful and crude language, but it shows just how advanced this technology is.
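To give a rough sense of how accessible this kind of voice cloning has become, below is a minimal sketch in Python. It assumes the open-source Coqui TTS library and its XTTS voice-cloning model; the file names and the sentence are placeholders of my own, and this is not the tool that produced the Rogan clip. The key point it illustrates is that the software only needs a reference recording of the target speaker, and the more (and cleaner) reference audio you have, the more convincing the imitation tends to be.

    # Minimal voice-cloning sketch (assumes the open-source Coqui TTS package: pip install TTS)
    from TTS.api import TTS

    # Load a multilingual voice-cloning model (XTTS v2); the weights download on first run.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # "reference_clip.wav" is a placeholder: a short recording of the target speaker.
    # The model copies the voice from that clip and reads the new text in it.
    tts.tts_to_file(
        text="This is a sentence the speaker never actually said.",
        speaker_wav="reference_clip.wav",  # placeholder reference audio of the speaker
        language="en",
        file_path="cloned_voice.wav",      # the synthetic audio written to disk
    )

Again, this is only an illustration under those assumptions, but it shows why researchers worry: generating a plausible fake voice no longer requires a studio or special expertise.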
Lastly, it is worth examining the threats that deepfakes pose. A Wall Street Journal article titled “Deepfake Videos Are Getting Real and That’s a Problem” lays out a number of potential dangers. In the article, Professor Hany Farid of Dartmouth College worries that this could be a threat to democracy, and I agree with him. U.S. courts often rely on audio and video recordings to help determine someone’s guilt or innocence; if those recordings cannot be trusted, the judicial system becomes far more complicated. Another way I believe democracy can be threatened is during elections. If videos float around on social media showing candidates saying and doing things they never did, they can alter the outcome. Overall, deepfakes are fascinating, and the technology continues to advance rapidly. I think we are all still learning at this point, and I hope that in the coming years we make technology work for us and not against us.