It’s Getting Harder to Spot a Deep Fake Video


“We’re entering an era in which our enemies can make anyone say anything at any point in time.” Jordan Peele created this fake video of President Obama to demonstrate how easy it is to put words in someone else’s mouth, and to warn that, moving forward, we need to be more vigilant about what we trust from the internet. Not everyone bought it, but the technology behind it is rapidly improving, even as worries increase about its potential for harm. This is your Bloomberg QuickTake on Fake Videos.

Deep fakes, or realistic-looking fake videos and audio, gained popularity as a means of inserting famous actresses into porn scenes. Despite bans on major websites, they remain easy to make and find.

They’re named for the deep-learning AI algorithms that make them possible. Feed the software real audio or video of a specific person (the more, the better) and it tries to recognize patterns in that person’s speech and movement. Introduce a new element, like someone else’s face or voice, and a deep fake is born.
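The early face-swap tools in this family were typically built around a pair of autoencoders sharing one encoder: the shared encoder learns pose and expression common to both people, while each decoder learns to render one identity, so swapping decoders at generation time transfers one person’s face onto the other’s movements. Below is a minimal PyTorch sketch of that idea, with placeholder data and made-up layer sizes; it illustrates the general architecture, not any specific tool’s implementation.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: compresses a 64x64 RGB face crop to a latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), # 16 -> 8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: reconstructs a face from the shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),   # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),    # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 128, 8, 8)
        return self.net(h)

encoder = Encoder()
decoder_a = Decoder()  # trained only on faces of person A
decoder_b = Decoder()  # trained only on faces of person B

# Training (sketch): each decoder learns to reconstruct its own person
# through the *shared* encoder, so the latent space captures pose and
# expression common to both identities.
loss_fn = nn.MSELoss()
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()), lr=1e-4)

faces_a = torch.rand(8, 3, 64, 64)  # placeholder batches; real tools use
faces_b = torch.rand(8, 3, 64, 64)  # thousands of aligned face crops
loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
     + loss_fn(decoder_b(encoder(faces_b)), faces_b)
opt.zero_grad(); loss.backward(); opt.step()

# The swap: encode person A's expression, decode with person B's decoder.
with torch.no_grad():
    fake = decoder_b(encoder(faces_a))  # B's face, A's pose and expression
```

This is also why more footage of the target produces better fakes: the decoders need enough examples to render the identity convincingly across poses, expressions, and lighting.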
Jeremy Kahn: It’s actually extremely easy to make one of these things… there were just some breakthroughs from academic researchers who work with this particular kind of machine learning in the past few weeks, which would drastically reduce the amount of video you actually need to create one of these.

Programs like FakeApp, the most popular tool for making deep fakes, need dozens of hours of human assistance to create a video that looks like this rather than this, but that’s changing. In September, researchers at Carnegie Mellon revealed unsupervised software that accurately reproduced not just facial features but also changing weather patterns and flowers in bloom.

But with increasing capability comes increasing concern.
You know, this is kind of fake news on steroids, potentially. We do not know of a case yet where someone has tried to use this to perpetrate a kind of fraud or an information warfare campaign, or, for that matter, to really damage someone’s reputation, but it’s the danger that everyone is really afraid of.

In a world where fakes are easy to create, authenticity also becomes easier to deny. People caught doing genuinely objectionable things could claim the evidence against them is bogus. Fake videos are also difficult to detect, though researchers, and the US Department of Defense in particular, have said they’re working on ways to counter them.
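For a sense of what that countermeasure work can look like, many published detectors from this period framed the problem as binary classification over face crops extracted from each frame, sometimes combined with temporal cues like unnatural blinking. The sketch below is an illustration only, with a hypothetical model and placeholder data; it is not the Defense Department’s or any named lab’s actual method.

```python
import torch
import torch.nn as nn

# Hypothetical frame-level detector: a small CNN that labels each face
# crop as real (0) or fake (1). Real systems also exploit temporal cues
# such as blink rates or head-pose inconsistencies across frames.
class FrameDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(128, 1)  # logit for "fake"

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = FrameDetector()
loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

# Placeholder batch; a real detector trains on thousands of labeled crops.
frames = torch.rand(16, 3, 64, 64)
labels = torch.randint(0, 2, (16, 1)).float()

loss = loss_fn(model(frames), labels)
opt.zero_grad(); loss.backward(); opt.step()

# Score an unseen clip by averaging per-frame fake probabilities.
with torch.no_grad():
    probs = torch.sigmoid(model(frames))
    print(f"mean fake probability: {probs.mean():.2f}")
```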
Deep fakes do, however, have some positive potential. Take CereProc, which creates digital voices for people who lose theirs to disease… There are also applications that could be considered more value-neutral, like the many, many deep fakes that exist solely to turn as many movies as possible into Nicolas Cage movies.
