These are people you might see in your day-to-day life. They could be a friend of a friend, a stranger sitting across from you at a restaurant or just another person riding the Metro. But of course, the five talking heads in the video are not real: they're deepfakes, and they're about to take over our world.

Deepfakes are a form of synthetic media technology that manipulates images, soundbites and video using artificial intelligence software. In a fake news PSA (public-service announcement) posted three years ago, former US President Barack Obama warns Internet users about the dangers of this technology. "We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time, even if they would never say those things," he says. The person in the footage sounds and looks like Obama, but in a twist of irony, the PSA reveals that American comedian and filmmaker Jordan Peele created the fake video to highlight how easy it is to distort words and misuse people's likeness.

In addition to producing deceiving content about real people, the technology can also create non-existent characters with composite images. One exists as a popular 19-year-old influencer with more than three million followers on Instagram. Lil Miquela's posts indicate she is just like the rest of us. She likes avocado toast and samosas, she spends her weekends with real-life friends and she laments lost love. Though her bio says she is a "robot living in LA", the same question pops up on almost every post: "Are you real or what?" She isn't, but there are no dead giveaways that she is an avatar controlled by LA-based start-up Brud. The only thing that belies her lack of humanity is her soulless eyes.

Today, no one has to navigate the complex world of deep learning technology to create their own deepfakes; many apps already exist. Most popularly, Reface, an AI-powered app with more than 100 million downloads, allows users to swap faces with another person, essentially generating deepfake photos and videos.

But despite all the hubbub around deepfakes and their ethics, synthetic technology is advancing rapidly. Israeli start-up Hour One, the creator of the "we are for reals" advert, is using deepfakes for business purposes. To make sense of the untapped potential of deepfakes, Oren Aharon, the CEO and cofounder of Hour One, asks the world to imagine a company-wide meeting where someone has to record a presentation on video. But what if they needed to ditch the meeting without repercussions? With deepfake technology, an avatar of that person can deliver the speech. "You get something amazing in two minutes, you can send it to everyone, and nobody needs to waste their time," Aharon said in an interview with Fast Company.

Hour One, like several other companies, has adopted a well-defined mission to use deepfakes for creating human-centric experiences. Even major tech players, such as Apple and Amazon, are wielding the powers of synthetic media to make Siri and Alexa more realistic.

But can we, as human beings, ever accept the illusion that deepfake characters are "real", let alone connect with them on any personal level? Moreover, do we want to? The concept is undeniably an alluring one, and we can only assume that's the reason why millions of people around the world are captivated by robot influencers. But we must draw a line somewhere to protect ourselves.

Deepfakes are innocuous when you're using an app to swap faces with Kim Kardashian just for the heck of it. But the rise of synthesised content is not an optimistic development for a world that's already struggling to combat misinformation. Deepfakes, be they video clips or soundbites, are only going to expedite a catastrophe, especially where sensitive political, social and environmental issues are concerned.

As deepfakes burgeon, we may not pause to think about what is real, or worse, we may stop trusting anything we see or hear, a phenomenon scholar Aviv Ovadya has termed "reality apathy".

Bleak as it might seem, indifference is the natural response in this scenario. When seeing is no longer believing, the only thing we can rely on is our gut.