Events

Deepfakes and disinformation: Kazakhstan records a wave of fake videos calling for participation in military operations

Kazakhstan has recorded an increase in the circulation of deepfakes, fake videos generated with neural network technologies. The purpose of such materials is to discredit well-known public and political figures and to create the illusion of their involvement in propaganda campaigns promoting participation in armed conflicts.

The fake videos imitate the appearance and voices of well-known people with a high degree of realism. Among those whose likenesses have been used are Foreign Minister Murat Nurtleu, singer Alisher Karimov, journalist Gaziza Raimbek, and the heads of the administrations of the Kostanay and North Kazakhstan regions.

In one of the videos, Alisher Karimov appears to urge viewers to sign a contract to take part in military operations, promising high pay and "the opportunity to prove yourself a real man." The visuals and voice are produced with generative AI algorithms, which makes the material look convincing to most viewers.

Such deepfakes are distributed mainly through social networks and messaging apps. Their authors remain unknown, but Kazakhstan's legislation provides for criminal liability for producing and disseminating deliberately false information, especially when it may affect public security or international relations.

This is not the first use of generative AI in disinformation campaigns. Back in May, fake speeches attributed to Prime Minister Olzhas Bektenov and journalist Gulnara Bazhkenova circulated online. The practice has since spread to other countries in the region: deepfakes featuring bloggers, senators, and media figures from Uzbekistan have also become a regular element of the information environment.

Experts view the surge of deepfakes across the post-Soviet space as part of a broader strategy of information manipulation aimed at undermining trust in public figures, sowing chaos, and shaping a false agenda on social media. Given how quickly content spreads in the digital environment, such falsifications pose a growing threat to both information security and public trust in official sources.

Maili News

Maili.uz is a news portal of Uzbekistan.
