Forget Anonymity, The Real Issue Is With Deepfakes

Original article was published by Emma VandenEinde on Artificial Intelligence on Medium



Deepfakes are taking over identities on the internet, and until media consumers understand how to spot them, we won’t know what’s true.

It’s often said that you can’t believe everything posted on the internet, and the rise of anonymous comments and profiles has only reinforced that. But it’s just as important to remember that you can’t believe everything you see on the internet, either. Deepfakes, a form of disinformation that uses artificial intelligence to manipulate video and audio so people appear to say or do things they never did, have permeated social media feeds, leaving an untrained audience guessing what is true and what is fake. To avoid becoming victims of deliberately fake news, we need to know what to look for in the content.

[Meme: https://i.imgflip.com/4g66he.jpg, made with the Imgflip Meme Generator]

Deepfakes are created by placing one person’s face onto another person’s body, typically in a video or animation. The film industry has long used artificial intelligence to sync audio with moving lips, but now anyone can download the technology and try it for themselves. The goal behind deepfakes is to make people “say or do things that they never in fact said or did” in the most realistic way possible.¹
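Under the hood, many consumer deepfake tools rely on a pair of autoencoders that share a single encoder: the network learns a compressed representation of both people’s faces, and swapping which decoder reconstructs a frame effectively swaps the face. The sketch below is a minimal, hypothetical illustration of that shared-encoder idea in PyTorch; the layer sizes, training details and names are placeholders of my own, not any particular tool’s implementation.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: compresses a 64x64 RGB face crop into a latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Person-specific decoder: reconstructs a face from the shared latent space."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder, one decoder per identity.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

# Training (omitted) reconstructs each person's faces with their own decoder.
# The "swap" happens at inference: encode a frame of person A, then decode it
# with person B's decoder to render B's face with A's pose and expression.
frame_of_a = torch.rand(1, 3, 64, 64)        # placeholder face crop
swapped = decoder_b(encoder(frame_of_a))     # B's face, A's expression
print(swapped.shape)                         # torch.Size([1, 3, 64, 64])
```

The key design point is that the encoder is shared: because it must represent both identities, it learns pose and expression features that transfer, which is what makes the swap look natural once the decoders are well trained.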

With the ability to put anyone’s face on any body, the possibilities for creative edits are endless. Actors from The Office, Kim Kardashian and even Dr. Phil have all appeared in deepfake videos, often to comical effect.

[Instagram video from DeepFaker (@iamdeepfaker), Sep 18, 2020: “Karen makes the Jim face. With Jim’s face. #theoffice #deepfake” (https://www.instagram.com/p/CFSV6lAFzds/)]

This list isn’t just limited to celebrities. You can also put your face onto another person’s body, as this person did with Johnny Depp’s character Jack Sparrow.

[Tweet from NWI Monster-Con (@NwiMonsterCon), September 23, 2020: “Anyone remember that one time I drank to much Rum and was a #deepfake pirate 🏴‍☠️ 🤣 and told you that the @nwimonstercon is only 9 Days away? -B” (https://twitter.com/NwiMonsterCon/status/1308900348011196419)]

But beyond making some funny videos, why are deepfakes so important, and potentially dangerous, to our media consumption? The National Counterintelligence and Security Center warns on Twitter that manipulated video can be used to fabricate footage of presidential candidates making statements they never made. When shared widely, these fabricated posts could significantly sway the 2020 Election.

[Tweet from NCSC (@NCSCgov), September 17, 2020: “Deepfakes (high-quality generated or manipulated video, images, text, or audio) pose an emerging threat and can be used by foreign adversaries to shape public opinion and influence U.S. elections. For more information on #deepfake risks and mitigation see: https://t.co/e6MqCgKfUs” (https://twitter.com/NCSCgov/status/1306710371303030785)]

The media company BuzzFeed brought attention to this back in 2018 when it published a video titled “You Won’t Believe What Obama Says In This Video!” The video features former President Barack Obama calling President Donald Trump a “total and complete dips**t,” among other statements. However, the video then reveals that while we see an animation of Obama, the voice belongs to comedian Jordan Peele. After revealing himself, Peele says: “Moving forward, we need to be more vigilant with what we trust from the internet.”

But it’s not just media companies that have this ability. Twitter user Jason Lam (@chinolam) tried out the technology recently on an image of President Donald Trump.

[Tweet from Jason Lam (@chinolam), August 29, 2020: “Nothing to do on a rainy Saturday, except make deepfake propaganda videos.” (https://twitter.com/chinolam/status/1299777103920062464)]

The editing may not be perfect, and you may still be able to spot the difference. But as artificial intelligence gets smarter, the edits will become much smoother, and it will be harder to tell which videos are genuine.

Think about the implications: what if Donald Trump were deepfaked saying he opposed the border wall construction? Or what if Joe Biden were deepfaked saying he agreed with defunding Planned Parenthood? And what if those videos went viral right before Election Day? There could be a wealth of consequences, all driven by disinformation.

Hany Farid, a digital forensics expert and author of the book Photo Forensics, notes in a NOVA PBS video that “the margins are very thin” when it comes to the 2020 Election, and these videos could have an impact on swing votes. He concludes, “you don’t have to fool tens of millions of people.” Instead, you just have to fool the right people on social media.

Many Americans agree with Farid. Pew Research Center reported that 68% of adults said that fake information, like these videos, “greatly impacts Americans’ confidence in government institutions.” Additionally, 56% believe the problem will worsen in the next five years.²

Nevertheless, there are some tips for spotting these imposter videos and recognizing false information:

1) Research. Always.

Some deepfakes can be debunked fairly easily by comparing the video’s content with other news sources. Ask yourself: are your local news stations reporting the same thing? Is the statement plausible, and is it in character for the person to say it? It may be harder to evaluate what President Donald Trump is saying, for example, but some deepfakes are obviously edited, much like this animation of the Mona Lisa painting.

[Tweet from Mark Runyon (@aspprogrammer), September 19, 2020: “Bringing the Mona Lisa to life with Deepfake AI HT @trevorjonesart #deepfake #art #tech #innovation #ai” (https://twitter.com/aspprogrammer/status/1307441145723662337)]

Research also includes looking at who published the video and why. If it did not come from a credible source without a clear bias, there is a chance the video was doctored to advance a particular agenda or message.

2) Look at the quality of the video.

If the content seems legitimate, you can also analyze the editing itself to judge whether the video is real or deepfaked. Jessica Guynn of USA Today recently put together some tips after speaking with Siwei Lyu, a computer science professor at the State University of New York at Albany.³ They recommend checking whether the video resolution is grainy and whether the lighting on the face matches the lighting on the neck. Media consumers should also watch for even the slightest delay between the audio and a person’s lip movements.

It’s also worth noting that deepfakes are easier to make when only a single person appears in the video and the clip is shorter than a minute. However, like the Dr. Phil video above, deepfakes are evolving to include more people and to capture more facial movements.
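One of those cues, the lighting mismatch between the face and the neck, can even be roughed out in code. The sketch below is a hypothetical illustration using OpenCV’s stock Haar face detector: it compares the average brightness of each detected face with the strip just below it, and flags a persistently large gap. Real detectors are far more sophisticated; the threshold and the function name here are my own assumptions, not an established test.

```python
import cv2
import numpy as np

def lighting_gap(video_path, brightness_gap_threshold=40):
    """Crude heuristic: compare face brightness with the region just below it (the neck)."""
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    gaps = []

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
            face = gray[y:y + h, x:x + w]
            neck = gray[y + h:y + h + h // 2, x:x + w]  # strip directly below the face
            if neck.size == 0:
                continue
            gaps.append(abs(float(np.mean(face)) - float(np.mean(neck))))

    cap.release()
    if not gaps:
        return None
    avg_gap = float(np.mean(gaps))
    # An unusually large, persistent brightness gap is one (weak) warning sign.
    return avg_gap, avg_gap > brightness_gap_threshold

# Example (hypothetical file name):
# print(lighting_gap("suspect_clip.mp4"))
```

A check like this would only ever be a hint, not proof: compression, makeup and stage lighting can all produce the same gap, which is why human judgment and the other tips still matter.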

3) Check out fact-checking websites.

The most notable fact-checking websites are PolitiFact.com, FactCheck.org and Snopes.com. These sites are run by credible researchers and journalists who fact-check claims from across the internet. If they call something false, they will show you why.
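Fact-checking can also be scripted. As a hedged example, Google’s Fact Check Tools API aggregates claim reviews from outlets like the ones above; the sketch below assumes you have an API key and that the endpoint shown is still current, and simply prints whatever verdicts it finds for a claim.

```python
import requests

# Google Fact Check Tools API (claims:search endpoint); an API key is required.
FACT_CHECK_URL = "https://factchecktools.googleapis.com/v1alpha1/claims:search"

def search_fact_checks(claim_text, api_key):
    """Look up a claim in published fact-check databases and print any verdicts."""
    resp = requests.get(FACT_CHECK_URL, params={
        "query": claim_text,
        "languageCode": "en",
        "key": api_key,
    })
    resp.raise_for_status()
    for claim in resp.json().get("claims", []):
        for review in claim.get("claimReview", []):
            publisher = review.get("publisher", {}).get("name", "unknown")
            rating = review.get("textualRating", "no rating")
            print(f"{publisher}: {rating} -> {review.get('url', '')}")

# Example (hypothetical key and query text):
# search_fact_checks("quote attributed to a presidential candidate", "YOUR_API_KEY")
```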

Artificial Intelligence is improving and recognizing more facial movements. Source: @teguhjatipras on Pixabay (https://pixabay.com/illustrations/flat-recognition-facial-face-woman-3252983/)

Advances in technology ensure that deepfakes are here to stay, whether for humorous or serious purposes. It is therefore up to us as media consumers to treat every piece of content with skepticism about its accuracy and context. For all we know, the next video we share on Twitter about President Donald Trump may be voiced by the 21-year-old biology student in the apartment next door.