We began working on TikTokcracy after it became clear that the pro-Russian candidate Călin Georgescu finished first in the first round of Romania's 2024 presidential election thanks to a network of TikTok accounts publishing almost identical videos. The model was already well documented: fire hose tactics, mass replication, paid influencers, Telegram coordination, fake comments, and artificially boosted reach.
Our question was straightforward — does anything similar exist in Bulgaria? And if so, who benefits from it?
Mapping the Network: Where We Started
We opened TikTok and began with key Bulgarian political hashtags — #меч, #рудигела, #ивелинмихайлов, #възраждане, #величие, #избори2024 — recording all accounts that appeared consistently under them.
Then we identified influencers who surfaced repeatedly. We looked at:
- who repeats the same phrases
- who posts in the same style
- who attaches themselves to the same cluster of hashtags
We also checked for reposts and stitches. If the same video circulated across many accounts, that was our first signal of coordination.
At this point, the initial map started forming: party profiles, supporting accounts, micro-influencers, pages of unclear origin, and profiles that existed solely to push specific clips.
In parallel, we reviewed bios—links to Facebook, YouTube, Telegram, websites. A full pre-election ecosystem emerged. From these observations, we concluded that Bulgaria hosts an “economically motivated system for algorithmic manipulation,” operating through a multi-platform infrastructure.
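The first-pass mapping described above can be sketched as a simple co-occurrence count: which accounts keep surfacing under the tracked hashtags. The account names and post records below are illustrative toy data, not our actual dataset; in practice the input would come from scraped post metadata.

```python
from collections import defaultdict

# Toy post metadata: (account, hashtags) pairs standing in for scraped posts.
posts = [
    ("acct_a", {"#меч", "#избори2024"}),
    ("acct_a", {"#меч", "#възраждане"}),
    ("acct_b", {"#величие", "#избори2024"}),
    ("acct_b", {"#величие"}),
    ("acct_c", {"#избори2024"}),
]

# The political hashtags we monitored.
TRACKED = {"#меч", "#рудигела", "#ивелинмихайлов",
           "#възраждане", "#величие", "#избори2024"}

# Collect which tracked hashtags each account appears under.
account_tags = defaultdict(set)
for account, tags in posts:
    account_tags[account] |= tags & TRACKED

# Accounts recurring under 2+ tracked hashtags are first-pass candidates
# for the network map (the threshold of 2 is an illustrative choice).
candidates = {a for a, tags in account_tags.items() if len(tags) >= 2}
print(sorted(candidates))  # acct_a and acct_b recur across hashtags
```

The same structure extends naturally to reposts and stitches: replace hashtags with video fingerprints and the recurring accounts become the coordination candidates.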
Data Collection
To measure behaviour, we downloaded all videos from the pre-election period through Sensika and pulled metadata via Exolyt. This gave us:
- views
- likes
- comments
- shares
- engagement rate (ER)
- timestamps
- comment datasets
We used Exolyt not only as a secondary source to validate metrics but also because the platform preserves comments in the state they were in when its crawler captured them. This mattered because many TikTok comments later disappear, change, or get filtered. In Exolyt we could see the original discussions, as well as geographic data for some of the commenting profiles.
This is where we discovered the first geographic anomalies — comments from countries and regions that have no logical connection to Bulgarian political discourse. Similar inconsistencies are described by NATO StratCom COE (2024) and Nevado-Catalán et al. (2022) as clear indicators of purchased or inauthentic engagement.
This allowed us to move from qualitative observation to quantitative analysis.
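The geographic check described above can be expressed as a simple "out-of-area" share per video. The commenter records, the set of expected countries, and the 30% threshold below are all illustrative assumptions for the sketch, not the exact parameters we used.

```python
from collections import Counter

# Toy commenter records for one video: (username, profile country code).
commenters = [
    ("user1", "BG"), ("user2", "BG"), ("user3", "PK"),
    ("user4", "ID"), ("user5", "BG"), ("user6", "AE"),
]

# Countries with a plausible link to Bulgarian political discourse
# (Bulgaria plus common diaspora destinations) -- an illustrative set.
EXPECTED = {"BG", "DE", "ES", "GB", "US"}

counts = Counter(country for _, country in commenters)
foreign = sum(n for c, n in counts.items() if c not in EXPECTED)
share = foreign / sum(counts.values())

# Flag the video when an unusually large share of commenters is out of area
# (the 0.3 cutoff is an assumption for this sketch).
flagged = share > 0.3
print(f"out-of-area share: {share:.0%}, flagged: {flagged}")
```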
Once we established the methodology, we shared it with our partners so they could extract equivalent data for Kosovo. This helped us determine whether the same models exist beyond Bulgaria.
The First Anomaly: Low ER with Very High Views
The first sign of manipulation is a low engagement rate (the ratio of interactions such as likes, comments, and shares to views) paired with unusually high reach.
- Normal content: ER above 3–4%
- Suspicious: 1–2%
- Likely manipulation: below 1%
These metrics align with TikTok benchmark data from Socialinsider, Brandwatch, and Emplicit.
In Bulgaria, we found dozens of videos with:
- 100–300K+ views
- ER below 2%
- sometimes 0.5% or 0.3%
This is not organic behaviour. It matches the model described by NATO StratCom COE (2024) and Nevado-Catalán et al. (2022): artificially generated views without corresponding reactions.
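The thresholds above translate into a simple classifier. The ER formula (interactions divided by views) follows the convention of the benchmark sources cited; the sample video numbers are illustrative, and the 2–3% band, which the thresholds leave ambiguous, is folded into "suspicious" here.

```python
def engagement_rate(views, likes, comments, shares):
    """ER as total interactions divided by views, the convention
    used by TikTok benchmark providers such as Socialinsider."""
    return (likes + comments + shares) / views if views else 0.0

def classify(er):
    # Bands from the thresholds above; the 2-3% gap is treated
    # as "suspicious" (a simplifying assumption).
    if er >= 0.03:
        return "normal"
    if er >= 0.01:
        return "suspicious"
    return "likely manipulation"

# Illustrative video: 250K views but only 750 interactions (ER = 0.3%).
er = engagement_rate(views=250_000, likes=600, comments=100, shares=50)
print(f"{er:.2%} -> {classify(er)}")  # 0.30% -> likely manipulation
```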
Fire Hose Model: Repetitions, Variations, and Information Noise
We observed entire clusters of profiles posting:
- the same videos
- with minimal changes (different music, text, emojis)
- within hours
This is the "firehose of falsehood" model described by RAND (Paul & Matthews, 2016), also visible in the Romanian campaign.
The goal: TikTok treats each variation as "new" content → increased reach → a higher chance of appearing in voters' For You feeds.
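One way to surface such clusters mechanically is to look for near-identical captions posted by different accounts within a short window. The captions, the similarity threshold, and the six-hour window below are illustrative choices for the sketch, not the exact method we applied.

```python
from difflib import SequenceMatcher
from datetime import datetime

# Toy posts: (account, caption, timestamp) standing in for scraped metadata.
posts = [
    ("a1", "Истината за изборите! #меч", datetime(2024, 10, 1, 9, 0)),
    ("a2", "Истината за изборите!! #меч", datetime(2024, 10, 1, 11, 30)),
    ("a3", "Истината за изборите #меч ⚡", datetime(2024, 10, 1, 12, 15)),
    ("a4", "Какво е времето днес", datetime(2024, 10, 1, 10, 0)),
]

WINDOW_H = 6     # posts within 6 hours of each other (assumed window)
SIM_MIN = 0.85   # caption similarity threshold (assumed cutoff)

def similar(a, b):
    return SequenceMatcher(None, a, b).ratio() >= SIM_MIN

# Pairs of different accounts posting near-identical captions close in time:
# minimal variations (extra punctuation, emojis) still match.
pairs = [
    (p[0], q[0])
    for i, p in enumerate(posts)
    for q in posts[i + 1:]
    if p[0] != q[0]
    and similar(p[1], q[1])
    and abs((p[2] - q[2]).total_seconds()) <= WINDOW_H * 3600
]
print(pairs)  # a1, a2, a3 form a cluster; a4 does not
```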
Imbalances: Suspicious Comments and Geographic Anomalies
One of the most telling anomalies appeared in the comment sections:
- monosyllabic, templated comments
- repeated comments under different videos from different profiles
- accounts with locations in other European countries, Asia, the Middle East
- accounts that comment actively on political content from specific parties but produce no content of their own
- comments unrelated to the video
- or a complete lack of comments on strongly political clips with very high view counts
This pattern aligns precisely with what Nevado-Catalán and NATO StratCom COE associate with:
- purchased comments
- bot comments
- comments from global engagement farms
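The templated-comment pattern above can be checked mechanically: normalize each comment's text and count how many distinct accounts and videos the same string appears under. The comment records and the 3-account/2-video thresholds are illustrative assumptions.

```python
from collections import defaultdict

# Toy comment records: (video_id, account, text) standing in for scraped data.
comments = [
    ("v1", "u1", "Браво!"),
    ("v1", "u2", "Само той може да спаси България"),
    ("v2", "u3", "Само той може да спаси България"),
    ("v3", "u4", "само той може да спаси българия!"),
    ("v2", "u5", "Интересно видео"),
]

def normalize(text):
    # Lowercase and strip punctuation so trivial variations collapse.
    return "".join(ch for ch in text.lower()
                   if ch.isalnum() or ch.isspace()).strip()

seen = defaultdict(lambda: (set(), set()))  # text -> (accounts, videos)
for video, account, text in comments:
    accounts, videos = seen[normalize(text)]
    accounts.add(account)
    videos.add(video)

# Templated: the same normalized text from 3+ accounts across 2+ videos
# (thresholds are illustrative).
templated = [t for t, (accs, vids) in seen.items()
             if len(accs) >= 3 and len(vids) >= 2]
print(templated)
```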
Suspicious Profiles and Cross-Platform Links
We also found profiles with clear signs of inauthenticity:
- AI-generated profile photos
- registration outside Bulgaria (e.g., UK, where the DSA does not apply)
- content directed exclusively toward supporting one party
- links to Telegram groups, Facebook pages, and websites connected to specific political actors
The BFMI & Sensika report describes a concrete example: @prodajnipoliticivpandiza — a profile with an AI-generated photo, unclear origin, and one-directional political content.
Parties and Networks: Who Runs the Largest Operations?
According to our analysis, the most active political networks on TikTok were:
- МЕЧ
- “Възраждане”
- “Величие”
These groups displayed:
- mass repetition of videos
- the lowest ER with the highest view counts
- the largest number of suspicious comments
- the strongest cross-platform activity
- the most frequent signs of paid boosts
What TikTok’s Own Data Shows
TikTok officially reported that in 2024 in Bulgaria it removed:
- 423,000 fake accounts
- 6.4 million fake likes
- 328 ads violating policies
These numbers alone show that the platform was a significant target for manipulation during the election period.
What We Concluded
After combining all indicators—low ER, burst views, fire hose clusters, anomalous comments, AI profiles, cross-platform coordination, repetitive party networks—a clear picture emerged:
Organized networks in Bulgarian TikTok exploit algorithmic vulnerabilities to amplify political messages and shape the visibility of specific parties.
The model is almost identical to the Romanian one, though smaller in scale. It is part of a dispersed, economically motivated, continuously active system of algorithmic manipulation.
Our partners’ analysis in Kosovo showed nearly the same dynamics: TikTok enables rapid creation of new political identities and networks that can shift public sentiment faster than institutions can respond. This new ecology of influence is described in detail in our report.



