Audio Deepfakes: How Do We Protect Ourselves from Misuse?


Deepfake technology is fascinating. But how do we protect ourselves from its misuse?

Our own AI demonstrates just how professional deepfake technology has already become. With only a few original voice recordings of former German Chancellor Angela Merkel, we managed to produce an astonishingly real-sounding clone of her voice. See the ESPESY Deep Fake article.

So how should our society deal with technologies like this in the future?

Let’s look back for a moment, because fakery is probably as old as language itself. Even centuries ago, people could tell lies, forge other people’s signatures, or spread false quotes. Today, anyone can take photos or audio recordings with a smartphone, alter them, and distribute them worldwide. And even the grandparent scam already works without speech synthesis.

It is therefore less a question of technology than of society: the technology itself is only a tool.

What is needed is a societal discourse, one in which ethicists, politicians, and lawyers above all are called upon.

Clear rules, laws, and control mechanisms are needed, for example to protect the individual’s right to privacy. Fundamentally, it is the same as with any other technology.