11.5 Million Views and Counting: Odysee's Moderation Black Hole
Published April 20, 2026 · OdyseeWatchdog Investigative Team
We scanned Odysee's entire catalog using its own API and flagged 4,711 items that violate the platform's Community Guidelines. Then we counted the views. 11,459,855. Every single one of those items is still live.
The Scale of Inaction
Let's put 11.5 million views in context:
- That's roughly one view for every resident of Portugal, on content that promotes hate speech, extremism, terrorism propaganda, and dangerous conspiracy theories.
- 1,351 items (28.7%) have been viewed over 1,000 times each. These are not obscure uploads gathering dust — they are actively being watched.
- 279 items (5.9%) have over 10,000 views. At this scale, the content is reaching audiences larger than most local news broadcasts.
- 10 items have exceeded 100,000 views each — combined, these top 10 have been watched over 1.5 million times.
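The distribution figures above can be reproduced from a flat list of per-item view counts. A minimal sketch of that aggregation (the input format and field names are hypothetical; the underlying scan data is not published here):

```python
def view_stats(view_counts):
    """Summarize per-item view counts into the headline figures above."""
    total_items = len(view_counts)
    sorted_views = sorted(view_counts, reverse=True)
    over_1k = sum(1 for v in view_counts if v > 1_000)
    over_10k = sum(1 for v in view_counts if v > 10_000)
    return {
        "total_items": total_items,
        "total_views": sum(view_counts),
        "over_1k": over_1k,
        "over_1k_pct": round(100 * over_1k / total_items, 1),
        "over_10k": over_10k,
        "over_10k_pct": round(100 * over_10k / total_items, 1),
        "top_10_views": sum(sorted_views[:10]),
    }

# Hypothetical usage, assuming flagged_items.json is a list of
# objects shaped like {"claim_id": "...", "views": 1234}:
#
#   import json
#   items = json.load(open("flagged_items.json"))
#   print(view_stats([item["views"] for item in items]))
```

Running this over the 4,711 flagged items yields the 28.7% and 5.9% shares reported above (1,351/4,711 and 279/4,711 respectively).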
The Removal Rate: 0%
Here is the single most damning number in our entire analysis:
0%
Of 4,711 flagged items, Odysee has removed zero.
For comparison, YouTube's Q3 2024 transparency report showed the platform removed 8.1 million videos in a single quarter. TikTok removed 170 million videos in the same period. X/Twitter suspended 5.4 million accounts. Odysee removed zero items over the entire history of our monitoring.
The Content Behind 11.5 Million Views
The content accumulating these views is not ambiguous. It includes:
- Hate speech: 59% of flagged items, totaling millions of views on content promoting antisemitism, white supremacy, and racial hatred.
- Dangerous conspiracy theories: COVID vaccine misinformation, "plandemic" narratives, "great replacement" theory — the top-viewed flagged video (269,901 views) is a French anti-vaccine film.
- Extremism: Content from known extremist figures including Alex Jones (247 videos, 168K views), Mark Collett (101 videos, 295K views), and Red Ice TV (102 videos, 220K views).
- Active monetization: 1.13 million LBC in tips flowing to this content, with Odysee earning a 5% cut on every transaction.
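The platform's take from those tips is simple arithmetic. A sketch of the calculation (the 1.13 million LBC total comes from our scan; the 5% fee is Odysee's stated cut on tips):

```python
def platform_cut(total_tips_lbc: float, fee_rate: float = 0.05) -> float:
    """Platform revenue from tips at a flat percentage fee."""
    return total_tips_lbc * fee_rate

# 1.13 million LBC in tips at a 5% cut: roughly 56,500 LBC to the platform
print(platform_cut(1_130_000))
```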
The Algorithmic Boost
View counts this high don't happen by accident. Odysee's recommendation algorithm actively promotes content based on engagement metrics. When extremist content gets tipped and shared, it signals the algorithm to recommend it more widely. The platform doesn't just fail to moderate — it actively amplifies the worst content through its own recommendation system.
This is not a bug. It is the direct, predictable consequence of having no content moderation team, no automated detection, no transparency reporting, and a financial model that profits from engagement on extremist content.
The Case for Intervention
11.5 million views on content that includes terrorism propaganda, hate speech, and dangerous medical misinformation represents a public safety issue. The platform's 0% removal rate and 4-person staff demonstrate that self-regulation has completely failed.
The data supports three conclusions:
- Odysee cannot self-moderate. It lacks the staff, tools, and willingness.
- The content causes real harm. Even accounting for repeat viewers, 11.5 million views represents a vast audience exposed to radicalization, medical misinformation, and hate.
- Regulatory intervention is warranted. The EU DSA, UK Online Safety Act, and evolving US platform accountability legislation all provide frameworks for addressing platforms that systematically refuse to enforce their own rules.
Legal Disclaimer
This site only highlights publicly available content that violates Odysee's own Community Guidelines and/or applicable laws. We do not host, embed, or redistribute any Odysee content. All referenced material is linked in its original, publicly accessible location for accountability and reporting purposes only.