Deep Fake

How fake pictures reveal themselves

23 June 2023, 10:30 | Tobias Schlichtmeier
AI can produce pictures that humans cannot distinguish from real photographs.
© Michael Schwettmann

Humans often have no chance of distinguishing simulated images, audio, or video files from real ones. For this reason, researchers are working together on automated detection.


Artificial intelligence (AI) can produce, within a short time and on the basis of a text prompt, a picture that looks like a photograph. To the human eye, the result is indistinguishable from the real thing. This is fascinating, but it casts doubt on virtually every picture and photograph. In his doctoral thesis, Jonas Ricker of Ruhr University Bochum has concentrated on the technical detection of fake images. He is searching for ways to distinguish synthetically generated images and videos from real ones.

Step by step into noise and back again

The so-called diffusion model for image generation is very popular at the moment, mainly because of the application Stable Diffusion. »The basic principle sounds astonishing at first«, Ricker explains. »A real image is destroyed step by step by adding random noise; this gradual diffusion of the image information is where the name comes from. After many hundred steps, no image information is left and the picture is pure noise. The goal of the model is to invert this process and reconstruct the original image, and that is the difficult part.« The key is not to predict the image directly, but to proceed step by step, mirroring the way the noise was added. With enough training data, the model learns to make a noisy image slightly less noisy. Repeating this step many times turns pure random noise into entirely new images.
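To make the mechanism Ricker describes more concrete, the following is a minimal sketch in Python of the standard DDPM-style forward noising step and one reverse denoising step. It is illustrative only and not taken from Ricker's work; the noise schedule and step count are typical assumptions, and the trained neural network that predicts the noise is omitted.

```python
import numpy as np

# Illustrative sketch of the diffusion idea described above: a toy "image"
# is destroyed step by step with random noise, and the reverse step shows
# where a trained model would gradually remove that noise again.

T = 1000                                   # number of diffusion steps (typical choice)
betas = np.linspace(1e-4, 0.02, T)         # noise schedule (common assumption)
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)             # cumulative product, allows jumping to step t

def add_noise(x0, t, rng):
    """Forward process: jump directly to step t of the noising chain."""
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    return xt, eps                         # eps is what the network learns to predict

def reverse_step(xt, t, predicted_eps, rng):
    """One reverse step: remove a little of the predicted noise."""
    coef = betas[t] / np.sqrt(1.0 - alpha_bar[t])
    mean = (xt - coef * predicted_eps) / np.sqrt(alphas[t])
    if t > 0:
        mean += np.sqrt(betas[t]) * rng.standard_normal(xt.shape)
    return mean

rng = np.random.default_rng(0)
x0 = rng.uniform(-1, 1, size=(8, 8))       # stand-in for a real image
x_T, _ = add_noise(x0, T - 1, rng)         # after ~1000 steps: pure noise

# Generation would start from pure noise and call reverse_step T times,
# with predicted_eps coming from a trained neural network (omitted here).
```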

Uncovering fake profiles on social media

»Diffusion models are already very good at creating deceptively real pictures, and they will become even better«, Jonas Ricker is sure. That will make it even harder to distinguish real images from fake ones in the future. At the moment he is testing different methods of distinguishing images created by such models from real photographs. Telling real from fake is essential not only to uncover fake news, which may arrive in the form of a video, but also to uncover fake profiles on social media. These are used on a large scale to influence public opinion politically. »This is exactly the point of the excellence cluster CASA: to unmask large-scale attacks by states or intelligence services, which have powerful means for deepfake propaganda«, Ricker explains.
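The detection task mentioned here can be framed as binary classification: real versus generated. The following is a rough, hypothetical sketch of one such framing, a logistic-regression classifier on Fourier-spectrum features; it is not Ricker's method, and the data consists of placeholder arrays standing in for real photographs and generator outputs. Frequency-domain features are a common starting point in this research area because generative models often leave characteristic traces in the image spectrum.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hedged illustration of a real-vs-fake image detector as a binary
# classifier on frequency features. Placeholder data only; a real detector
# would be trained on photographs and images from a generative model.

def spectrum_features(img):
    """Log-magnitude of the 2D Fourier spectrum, flattened to a vector."""
    spec = np.abs(np.fft.fft2(img))
    return np.log1p(spec).ravel()

rng = np.random.default_rng(1)

# Placeholder "real" images and "fake" images with a toy periodic artifact.
real_imgs = [rng.normal(0.5, 0.2, (32, 32)) for _ in range(100)]
fake_imgs = [rng.normal(0.5, 0.2, (32, 32)) * np.cos(np.arange(32))
             for _ in range(100)]

X = np.array([spectrum_features(im) for im in real_imgs + fake_imgs])
y = np.array([0] * len(real_imgs) + [1] * len(fake_imgs))

clf = LogisticRegression(max_iter=1000).fit(X, y)
print("training accuracy:", clf.score(X, y))
```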
