Deepfakes and AI: How Technologies Become Weapons of Information Warfare and Manipulation

Factcheck

On December 16, 2024, the ATN channel's program “Screenshot,” in an episode dedicated to deepfakes, made several important statements:

According to recent research, three out of four news stories are shared on social media without being read first. You see a catchy headline – and immediately share it. Emotions lower the threshold of critical thinking, essentially bypassing the brain as an intermediary between the eyes and fingers. Just think about the amount of fake news and misinformation that surrounds us today. Machine learning and artificial intelligence make them so sophisticated that it’s almost impossible to detect deception. For example, here’s one of the latest deepfakes, in which the alleged head of the Narovlya district executive committee, Vladimir Antonenko, announces the introduction of a curfew.

Cognitive warfare is becoming a central element in global confrontation. Shaping societal preferences, embedding foreign narratives, and destabilizing the situation as a result. Artificial intelligence is actively used to achieve these goals. It can easily synthesize politicians’ voices or create fake video messages. Distinguishing fakes is becoming increasingly difficult. Deepfakes have become a nearly indispensable tool in attempts to sway public opinion. In the hands of skilled political technologists, such a tool opens virtually unlimited opportunities for manipulating public opinion. Creating political tensions and even societal discord during election campaigns is a traditional tactic for those aiming to influence the course and outcomes of the electoral process. In this regard, it is crucial to know and understand: if news or a video triggers a strong emotional reaction and an urge to repost, it is most likely an attempt to manipulate you.

Deepfakes have also become part of popular culture. They find expression in a specific form of political protest in Western countries. The creators do not aim to mislead with their works. The videos are humorous in nature and reflect attitudes towards politicians who have lost public support.

Our verdict based on the analyses conducted:

Verdict: Incorrect


Reasons:

  • The stated cause-and-effect relationships do not exist.
  • The quoted statement is fabricated; the individual never said anything similar.
  • The alleged event is fictional and never occurred.
  • The claim about a photo/video/document or research/statistics is false. For example, images depict something other than what is claimed, or the research/statistics have been misinterpreted.


Rating Scale

Full video:

We will analyze disinformation narratives and conduct semantic analysis.

Narrative Analysis

Risk Level: High

Information Warfare

Deepfakes and disinformation as tools for manipulating public opinion and creating political tension.

Techniques:

  • appeal to emotions
  • use of authoritative sources
  • creating fear

Confidence: 85%

Technological Threat

The use of artificial intelligence and deepfakes for manipulation and fraud.

Techniques:

  • suppressing critical thinking
  • creating false trust
  • real-life examples

Confidence: 80%

Criticism of Modern Technologies

Warnings about the consequences of using new technologies in the context of disinformation.

Techniques:

  • historical analogy
  • generalization
  • warnings

Confidence: 75%
Overall Assessment

The video raises important questions about the impact of technologies on public opinion and security, using manipulative techniques to highlight the risks of disinformation.

Emotional Analysis

Main Topics:

  • manipulation through technologies
  • impact of deepfakes on public opinion
  • information warfare and disinformation

Predominant Tone: negative (score: -0.6)

Key Emotions:

  • awareness
  • distrust
  • anxiety
Only 73 mentions of “Narovlya and curfew” were found in Belarusian Telegram channels and public chats between November 18 and December 1, 2024.
Publication of the topic-starter post:

Publication distribution:

An analysis of publication times reveals synchronization among many toxic channels:
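Such a synchronization check can be sketched as a pairwise comparison of posting times across channels. The channel names, timestamps, and five-minute window below are illustrative assumptions, not the actual dataset or threshold used in this investigation.

```python
from datetime import datetime, timedelta
from itertools import combinations

# Hypothetical post timestamps per Telegram channel (illustrative data only).
posts = {
    "channel_a": ["2024-11-20 10:02", "2024-11-21 14:30"],
    "channel_b": ["2024-11-20 10:04", "2024-11-22 09:00"],
    "channel_c": ["2024-11-20 10:03", "2024-11-21 14:32"],
}

WINDOW = timedelta(minutes=5)  # assumed threshold for "synchronized" posting

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

def sync_score(a, b):
    """Count post pairs from channels a and b published within WINDOW."""
    return sum(
        1
        for ta in map(parse, posts[a])
        for tb in map(parse, posts[b])
        if abs(ta - tb) <= WINDOW
    )

# Channel pairs with at least one near-simultaneous post are flagged.
pairs = {(a, b): sync_score(a, b) for a, b in combinations(sorted(posts), 2)}
suspicious = {pair: n for pair, n in pairs.items() if n > 0}
print(suspicious)
```

In practice such a check would run over hundreds of channels and account for reposting chains, but the core signal, many channels publishing the same story within minutes of each other, is what the pairwise counts capture.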

The video circulated in those channels is the same one used in the “Screenshot” program.

Factcheck BY