Uncover and debunk: Interview with Svitlana Slipchenko from VoxCheck

Svitlana Slipchenko, head of the fact-checking project VoxCheck, talks about strategies and tools for combating disinformation that have proven effective in Ukraine and around the world.


Countering disinformation cannot rely on a single method alone, such as debunking fakes, teaching media literacy theory, or compiling lists of reliable information sources. We need a fully fledged, comprehensive strategy that works at all levels: personal, journalistic, public, and governmental.

This strategy can include several components:

  • debunking recurring and current fakes;

  • identifying the main disinformation narratives and creating counter-narratives;

  • media literacy education, both in schools and universities, where it builds critical thinking skills in children and teenagers, and at the professional level through courses for journalists, IT specialists, civil servants, etc.;

  • expanding digital platforms' policies for countering disinformation.

Today, under conditions of full-scale war, we observe very strong coordination in countering disinformation. It is worth noting the significant involvement of state institutions, such as the Center for Strategic Communications and Information Security, established under the Ministry of Culture and Information Policy, and the Center for Countering Disinformation under the National Security and Defense Council of Ukraine, as well as individual offices and departments that actively work on information hygiene in their respective fields.

International support also deserves attention. For instance, the International Fact-Checking Network is currently working to debunk information about the war in Ukraine, while the European Digital Media Observatory publishes monthly reports that devote attention to fakes about Russian aggression against Ukraine. It is thus a top priority for everyone involved in shaping the world's information ecosystem.

So, from my point of view, a strategy is a set of actions coordinated at all levels, from the individual to international institutions.

Despite the solidarity and activity of various organizations and professionals, Internews' latest research shows that most Ukrainians know that false news exists, yet few can identify it. What do you think the problem is?

We see that the majority of Ukrainians encounter disinformation in their daily lives. At the same time, according to the research, only 14% of respondents could distinguish true news from fake; for comparison, in 2021 this figure was 24%. The sheer volume of information affects the population's capacity for critical thinking. One reason is the infodemic: the rapid spread of inaccurate and/or false information through a large number of sources. Another is the level of trust in official news sources. In 2022, Ukrainians trust national and regional online media and TV channels much more, while the press, for example, has lost ground because of a lack of facts. People are looking for alternative sources of information, and here social networks and messengers play a big role: 77% of Ukrainians get their news from them.

Social platforms do make it possible to receive information quickly, both about particular regions and about the whole country, but that speed often comes at the expense of quality. Thousands of Telegram channels attract huge audiences, and to keep them, they often publish breaking news without verifying its accuracy. Moreover, we never know who is behind those anonymous channels, which has made Telegram a convenient platform for propagandists and authors of disinformation.

State institutions can verify and track the connections between different resources. In February 2021, cyber experts from the Security Service of Ukraine exposed a massive spy network engaged in reconnaissance and subversive activities on behalf of the special services of the Russian Federation.

This year the list has expanded, but the audience has grown too. One of the latest studies by Detector Media shows a two- to threefold increase in follower numbers. This is a huge audience that continues to consume misinformation, often without even realizing it.

Here we return again to the question of comprehensive measures to raise the level of media literacy. For us fact-checkers, specialists who counter disinformation, it is important to explain how to recognize particular fakes and how to build trustworthy news sources. This should also be part of Ukraine's post-war reconstruction program. It will develop critical thinking skills that are useful in every area of life.
Svitlana Slipchenko, VoxCheck

Continuing the theme of social media, we should discuss international platforms' content moderation policies and their influence on the spread of false content.

In my opinion, Meta's social networks have the most effective tools for countering disinformation. Their cooperation program with fact-checking organizations allows them to reduce the reach of posts that contain fake information: such posts can be blurred or carry a warning about a possible fake. Some bloggers and news authors argue that this also limits the spread of genuine information. We suffered from this ourselves: at the beginning of the full-scale war, we published photos and videos that the platform normally does not allow. Our content circulated for one day and then began to be blurred, hidden, or deleted. That is a question of content moderation.

Meta has repeatedly explained, publicly and officially, that it has no Russian moderators who would intentionally limit Ukrainian content. Nevertheless, with the start of the full-scale war, moderation became softer. All those photos from Bucha, Irpin, and Izium that spread online would not have been accepted by the platform at all in 2021. The fact that they are now blurred but remain online is already a step toward Ukraine. It is worth remembering that Meta is a global corporation. Accordingly, when permission to publish such content appears for Ukraine, conflicts can arise with users in other countries. Here the platform has to find a balance.

Twitter and YouTube have similar moderation programs, but they do not cooperate with external fact-checkers. Google is also worth mentioning: before the war, it indexed fact-checkers' articles from all over the world very well. If you searched for information that had already been debunked, the first results on the page were the debunking articles.

The most acute content moderation problem, as I have already mentioned, is on Telegram. There are some restrictions connected to international sanctions, such as channels blocked in specific countries. But it is primarily the users themselves who have to filter the content they see.

Modern technologies can both help detect fake information and amplify disinformation. Which new trends, in your view, have the biggest influence on this development?

As for detecting fake information, machine learning and artificial intelligence greatly speed up the work and make it possible to process large amounts of data. But they are complex and expensive, and not every organization can afford to use them. Fact-checkers also need the relevant skills and technologies for data analytics. In our latest study of Telegram channels, we ourselves read more than 5,000 messages from 60 Russian and pro-Russian channels, which we grouped into 19 narratives. Had we used modern technologies, this would not have taken two months of grueling work for our staff, and it might have helped us detect more channels and messages. These tools certainly increase the effectiveness and speed of the work, as well as the speed of response to information threats.
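The kind of semi-automated grouping described above can be sketched in a few lines. The snippet below is a hypothetical illustration, not VoxCheck's actual pipeline: it tags messages with known narratives by keyword matching, a first-pass technique that a human fact-checker would then review. The narrative names and keywords are invented for the example.

```python
# Hypothetical sketch (not VoxCheck's real tooling): tagging channel
# messages with candidate disinformation narratives via keyword matching.
# Narrative labels and keyword sets below are invented for illustration.
NARRATIVES = {
    "sanctions_backfire": {"sanctions", "collapse", "blackout"},
    "external_control": {"puppet", "controlled", "orders"},
    "refugee_resentment": {"refugees", "housing", "benefits"},
}

def tag_message(text: str) -> list[str]:
    """Return the narratives whose keywords appear in the message."""
    words = set(text.lower().split())
    return [name for name, keys in NARRATIVES.items() if words & keys]

# Toy stand-ins for scraped messages; a real run would process thousands.
messages = [
    "europe faces blackout and collapse because of sanctions",
    "kyiv only follows orders from abroad",
    "refugees get free housing while locals wait",
]

for msg in messages:
    print(tag_message(msg), "-", msg)
```

In practice, a team might replace the keyword sets with text embeddings and clustering to surface narratives it had not anticipated; the keyword version simply shows how automation can pre-sort messages so analysts read clusters instead of raw feeds.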

As for bots and deepfakes, social platforms' modern technologies have learned to counter them. Meta has already conducted research showing that real users do not react to comments from bots. Deepfakes still appear rarely and in poor quality, so they can be spotted with the naked eye: they cannot yet recreate real facial expressions or natural movements of the head and neck. In theory, this defect could be masked by deliberately lowering the video quality, but it is easier to use another manipulative method: spreading photo and video content without context and misleading people that way.

For instance, in October a video spread that was allegedly from a supermarket in Kyiv, showing people fighting over food. We checked the information, found the original video, and discovered that it had been filmed in 2017 in the Russian city of Omsk. There are hundreds of similar cases.

Russian propaganda is very blunt; it is about quantity, not quality. There is so much of it that the truth often simply cannot break into the information space, or people cannot perceive the truth amid the mass of fakes. Yes, there is too much disinformation, but it has already been classified by schemes and narratives, and technology can provide an answer to such large volumes.

This blog post is a guest article by Yanina Shabanova, Head of Communications at SocialBoost and 1991 Accelerator.

REALIES - Strong civil society for a healthy information ecology - is funded by the German Federal Foreign Office.

The text represents the opinion of the author and does not necessarily reflect the views of the German Federal Foreign Office or the betterplace lab.

Translated from the Ukrainian by Mariia Pysmenna.
