Imagine standing in front of a mirror and seeing not your own face, but that of Barack Obama or Angela Merkel. In real time, your own facial expressions are transferred to the other person's face.
The TNG Hardware Hacking Team has built a prototype that does exactly this: it transfers one person's face onto any other face in real time. The basis for this is the so-called deep-fake approach. Neural networks detect faces in the video input, transform them, and integrate the result back into the video output. With this technique, it is possible to project deceptively real imitations onto other people's faces.
For this purpose, we used autoencoder networks trained with Keras together with various face recognition algorithms. In this talk, Thomas Endres and Martin Förtsch give an entertaining and vivid introduction to the world of real-time deep fakes. In doing so, they particularly focus on the deep learning techniques used in this application.
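The core of the classic deep-fake approach is a pair of autoencoders that share one encoder but use separate decoders, one per face; swapping faces then means encoding face A and decoding with the decoder trained on face B. The sketch below shows a minimal version of this architecture in Keras. All layer sizes, the 64x64 face-crop resolution, and the 256-dimensional latent space are illustrative assumptions, not details taken from the talk.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

IMG = 64  # assumed size of the cropped, aligned face images


def build_encoder() -> keras.Model:
    """Shared encoder: compresses a face crop into a latent vector."""
    inp = keras.Input(shape=(IMG, IMG, 3))
    x = layers.Conv2D(32, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    latent = layers.Dense(256, activation="relu")(x)
    return keras.Model(inp, latent, name="shared_encoder")


def build_decoder(name: str) -> keras.Model:
    """Per-identity decoder: reconstructs a face from the latent vector."""
    inp = keras.Input(shape=(256,))
    x = layers.Dense(16 * 16 * 64, activation="relu")(inp)
    x = layers.Reshape((16, 16, 64))(x)
    x = layers.Conv2DTranspose(32, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return keras.Model(inp, out, name=name)


encoder = build_encoder()
decoder_a = build_decoder("decoder_face_a")  # trained only on face A
decoder_b = build_decoder("decoder_face_b")  # trained only on face B

# Two autoencoders that share the encoder weights.
autoencoder_a = keras.Model(encoder.input, decoder_a(encoder.output))
autoencoder_b = keras.Model(encoder.input, decoder_b(encoder.output))

# Face swap: feed a crop of face A through the shared encoder,
# then reconstruct it with decoder B to get face B wearing A's expression.
face_a_crop = np.zeros((1, IMG, IMG, 3), dtype="float32")
swapped = autoencoder_b.predict(face_a_crop, verbose=0)
```

In training, `autoencoder_a` and `autoencoder_b` would each be fit on crops of their own face with a reconstruction loss; because the encoder is shared, it learns an identity-independent representation of pose and expression, which is what makes the decoder swap work.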