A few days ago, Professor Hany Farid, head of a laboratory at Dartmouth College in the United States, said in an interview with CBS News that videos with fake faces of heads of state could trigger a future nuclear war. To illustrate the point, the scientist cited the possibility of a fabricated video in which a president announces the launch of a nuclear missile. The country threatened by the “first person” in such a video would have to respond immediately with a missile of its own aimed at the enemy. Such a scenario threatens humanity with a new nuclear war.
The program that makes it possible to create such videos is called FakeApp; it was released by the Reddit user Deepfakes on January 26, 2018. After that, the network was flooded with videos featuring swapped faces. Obscene videos “starring” famous Hollywood actresses proved especially popular.
Is it possible to recognize the falsification of a face in a video?
We asked experts to comment on the situation and found that most of them were unanimous: technologies based on neural networks cannot yet replace reality convincingly, and a fake is still visible to the naked eye.
A person unfamiliar with neural network technology can be deceived by face swapping. Ilya Yakidimi, founder and director of the “Protection Module” company, is sure of it: “If you bet on the ignorance of the population, then yes, you can blackmail someone, present so-called ‘evidence’, and ruin a reputation.”
Still, the experts agree that it is too early to fear overly realistic fakes.
Andrey Belyaev, chief computer vision expert at Neurodata Lab, comments: “Most algorithms today work in an ‘ideal’ model of the world, without interference, noise, or shadows. Give a face-swapping algorithm a photo with a face in glasses, a small face, or a photo taken in the rain, and it will fail; even if it does swap the faces, the synthetic nature of the result will be visible to the naked eye. Existing technologies are very far from working in real conditions.” He also believes that neural-network face replacement in photos and videos is no different from the “handmade work of a master”, so “there are exactly as many ways to recognize a neural fake as there are ways to recognize a manual one.”
Vladimir Tushkanov, junior data scientist at Kaspersky Lab, agrees. He believes that “recognizing a fake by blurred or jumping fragments of the frame is still not difficult.” Even so, the expert warns that using neural networks to swap faces is dangerous in everyday life: a fake spread among friends and acquaintances can have serious psychological consequences regardless of its quality.
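The “blurred fragments” cue Tushkanov mentions can be illustrated with a toy sketch. A classic blur measure is the variance of the Laplacian: smoothed regions contain little high-frequency detail, so the score drops. This is only an illustration of the cue on synthetic noise images, not a deepfake detector; the function name and thresholds here are our own, not from any of the quoted experts.

```python
import numpy as np

def laplacian_variance(gray: np.ndarray) -> float:
    # 3x3 Laplacian response at every interior pixel:
    # center * -4 plus its four direct neighbors.
    # Low variance = little high-frequency detail, i.e. blur.
    lap = (
        -4.0 * gray[1:-1, 1:-1]
        + gray[:-2, 1:-1] + gray[2:, 1:-1]
        + gray[1:-1, :-2] + gray[1:-1, 2:]
    )
    return float(lap.var())

# Synthetic demo: the same noise image, sharp vs. box-blurred.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
# 2x2 box blur via averaged shifted copies (stand-in for a blurry patch)
blurred = 0.25 * (
    sharp[:-1, :-1] + sharp[:-1, 1:] + sharp[1:, :-1] + sharp[1:, 1:]
)

print(laplacian_variance(sharp) > laplacian_variance(blurred))  # prints True
```

In a real check, one would compare the score of a suspicious face region against the rest of the frame: a pasted-in face that is noticeably smoother than its surroundings is a red flag, though by itself this proves nothing.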
What threatens the substitution of faces in life?
Speaking of the liability that can follow from replacing faces with neural networks, one must understand what goal the person is pursuing. If the goal is harmless, the likelihood of being punished is extremely small. But there are a number of cases for which the law of the Russian Federation provides penalties. We list some of them:
- If you find photos or videos of an offensive nature posted online — for example, your face was used to create obscene content — you have the right to go to court. Insult carries an administrative fine of 1,000 to 3,000 rubles (Article 5.61 of the Code of Administrative Offenses of the Russian Federation).
- If evidence is falsified against you in a court case, the smallest punishment the author of those materials can receive is a fine of 100,000 to 300,000 rubles; at most, arrest for up to four months. Falsifying evidence in a criminal case involving a grave or especially grave crime is punishable by imprisonment for up to seven years (Article 303 of the Criminal Code of the Russian Federation).
You should also know that if you or your child has been cyberbullied through neural-network face replacement, you should proceed as follows:
- Take a screenshot of a video or photo.
- Print the screenshot and have it certified by a notary to confirm the fact of bullying.
- File a lawsuit on the basis of this document to demand compensation from the offender for non-pecuniary damage.