Saturday, June 1, 2019

The Power of Deepfakes

By: Anthony Suszczynski
as309714@ohio.edu

It is imperative that journalists tell the truth when writing and reporting news stories. Their reputations depend on it, but more importantly, the public’s perception of news events is formed by what people read and hear in the news. As I read and watched the materials from this week, I came to understand just how important the truth is.

Journalists such as Jayson Blair have been caught reporting dishonestly. Some news events are serious and tragic, and any altering of the story is disrespectful to those who were a part of that event and had to live through it. Ethics must be a part of a journalist’s everyday life.

Jayson Blair is just one example of false or unethical reporting. Others are listed in the articles and slides, including Jack Kelley, Mitch Albom, and Stephen Glass. Today, however, there is an entirely new meaning to fake news. Social media has made it incredibly easy to spread: one common form is politically motivated articles circulated to sway people’s political opinions. One such article claimed that Pope Francis endorsed President Trump. That simply did not happen, yet it was shared on Facebook many times.

While we are on the topic of fake news, I believe it is important to examine a word I heard for the first time only within the past few weeks: deepfakes. In my own words, a deepfake is an altered image, video, or audio recording. Specifically, we see this with people: technology now allows computers to put an individual on screen and produce video and/or audio of them saying or doing things they have never actually said or done. I’ll give you an example.

Just the other week, I was scrolling through Twitter when I saw a tweet from Joe Rogan discussing deepfakes. Joe Rogan hosts one of the most popular podcasts in the world, which makes it particularly easy for deepfake technology to manipulate his speech. Because there are thousands of hours of recordings of him talking on his podcast, computers were able to generate audio that sounds nearly identical to his voice. The more audio that exists of someone, the better this technology works. When I heard the recording, I was shocked. It sounds just like him! Here is the audio that is NOT Joe Rogan talking. A warning to readers: the audio contains some colorful and crude language, but it shows just how advanced this technology is.

So what does all of this mean? One question I had in particular was: what is the government doing about this? According to an informative CNN Business piece on the many aspects of deepfakes, titled “When seeing is no longer believing: Inside the Pentagon’s race against deepfake videos,” “the Pentagon, through the Defense Advanced Research Projects Agency (DARPA), is working with several of the country’s biggest research institutions to get ahead of deepfakes.” The article further explains that the Pentagon is actually trying to learn how to make deepfakes in order to better understand them, which will then allow it to develop computer technology that can spot and detect them.

Lastly, it is worth examining the threats that deepfakes pose. An article by The Wall Street Journal titled “Deepfake Videos Are Getting Real and That’s a Problem” explains that there are a number of potential problems. In the article, Professor Hany Farid of Dartmouth College worries that the technology could be a threat to democracy, and I agree with him. U.S. courts often rely on audio and video recordings to help determine a person’s guilt or innocence; if those recordings cannot be trusted, the judicial system becomes far more complicated. Another way I believe democracy can be threatened is during elections. If videos float around on social media showing candidates saying and doing things they have never done, they can alter election outcomes. Overall, deepfakes are fascinating, and the technology continues to advance rapidly. I think we are all still learning at this point, and I hope that in the coming years we make technology work for us and not against us.















