Deepfake videos usually carry a negative connotation, since the technology has been linked to fake videos of celebrities, including pornography. However, many businesses are now embracing the technology and integrating it into their work processes.
Some partners at accounting firm EY are incorporating synthetic, talking-head-style virtual body doubles of themselves into emails and client presentations. One EY partner used a virtual double to address a Japanese client in their native language, a gesture that apparently went over well.
The exploration of such technologies comes at a time when traditional modes of establishing business relationships, like meetings, long lunches, golf matches, and networking, have become almost impossible due to the pandemic and its restrictions. EY’s use of virtual doubles is based on the technology provided by UK-based startup Synthesia.
In an interview with Wired, Jared Reeder, who works at EY on a team that provides creative and technical assistance to partners, said that over the past few months he has come to specialize in creating AI doubles of coworkers. The video clips that incorporate the synthetic doubles are openly acknowledged as such.
According to Reeder, the technology has livened up conversations with clients. “We’re using it as a differentiator and reinforcement of who the person is… As opposed to sending an email and saying ‘Hey we’re still on for Friday,’ you can see me and hear my voice… It’s like bringing a puppy on camera… They warm up to it,” he said.
In an interview with the BBC, Synthesia’s chief executive and co-founder Victor Riparbelli called the technology the “future of content creation.” To illustrate his point, Riparbelli gave the example of a company employing 3,000 warehouse workers in North America, some of whom speak English and some Spanish.
“If you have to communicate complex information to them, a four-page PDF is not a great way. It would be much better to do a two or three-minute video, in English and Spanish… If you had to record every single one of those videos, that’s a massive piece of work. Now we can do that for [little] production costs, and whatever time it’ll take someone to write the script. That pretty much exemplifies how the technology is used today,” Riparbelli said.
Last year, South Korea’s MBN channel aired a news segment featuring a deepfake version of anchor Kim Joo-Ha. The copy closely mimicked Kim’s gestures, facial expressions, and voice. The segment caused quite a stir in the country: some viewers were amazed, while others worried that Kim might lose her job. MBN indicated that it would continue to use the deepfake copy, but only for some of its breaking-news reports.
Recently, a YouTube content creator known as Shamook, who is famous for deepfake videos in which one actor in a movie is swapped for another, was hired by Lucasfilm. Executives at Disney, which owns Lucasfilm, appear to have been impressed by Shamook’s deepfake rework of a scene from The Mandalorian, the Star Wars series.
Criminal use, ethics of commercialization
Similar uses of deepfake technology, whether in media, business, or other sectors, are being explored and will likely increase over the coming years. However, deepfakes also present challenges for companies. A deepfake video can be used to damage a firm’s reputation, and even if the company proves the video is fake, the reputational harm may not be reversed as easily.
A criminal group could, for example, use a deepfake video of a CEO admitting to some fault at the company in order to manipulate the stock market. By the time people realized the video was fake, the damage would be done and the criminals would have traded to their advantage. In August 2019, criminals impersonated the voice of a company executive to demand a transfer of $243,000.
In an interview with The Wall Street Journal, Irakli Beridze, head of the Centre on AI and Robotics at the United Nations Interregional Crime and Justice Research Institute, warned that hackers could exploit deepfakes in exactly this way. “Imagine a video call with [a CEO’s] voice, the facial expressions you’re familiar with. Then you wouldn’t have any doubts at all,” he said.
There are also ethical questions to tackle even when deepfakes are used for legitimate purposes. Lilian Edwards, professor of law, innovation, and society at Newcastle Law School, says that the question of commercial use of deepfake technology hasn’t been fully addressed. For instance, who owns the rights to a deepfake video?
“If a dead person is used, such as [the actor] Steve McQueen or [the rapper] Tupac, there is an ongoing debate about whether their family should own the rights [and make an income from it]… Currently, this differs from country to country,” Edwards said.