Team Vitality and moderation technology company Bodyguard have published a report on online hate and toxicity in esports, shedding light on the scale of abuse targeting players and teams.
Although the complete report has not yet been published online, the findings offer an important first glimpse into how one of Europe’s top esports organisations is tackling an issue affecting athletes worldwide.
The report coincides with Mental Health Awareness Week and World Mental Health Day, and it forms part of Team Vitality’s KARE programme, launched in 2023 with the support of Philips monitor brand EVNIA. KARE focuses on awareness, prevention and action, with the aim of making mental health a central priority in esports and gaming communities.
For this report, supported by EVNIA, Team Vitality used Bodyguard, a technology solution that enables brands and platforms to moderate text, comments, images and videos across online accounts in real time. The French company’s hybrid AI and human-review system detects and removes toxic content using moderation rules customised by Team Vitality.
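The report does not describe Bodyguard’s internals, which are proprietary. Purely as an illustration of what “moderation rules customised by Team Vitality” could mean in practice, a rule-based filter escalating to human review might look like the minimal sketch below; every name, rule and example phrase in it is hypothetical and not drawn from Bodyguard’s actual system.

```python
# Hypothetical sketch only: Bodyguard's system is proprietary, and none
# of the names or rules below come from its actual API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Verdict:
    blocked: bool
    reason: Optional[str] = None

# Illustrative customised rules mapping a toxicity label to trigger phrases.
# A production system would layer ML classifiers on top of rules like these.
CUSTOM_RULES = {
    "insult": ["you are trash", "uninstall the game"],
    "harassment": ["nobody wants you here"],
}

def moderate(message: str) -> Verdict:
    """Block a message if it matches any customised rule."""
    lowered = message.lower()
    for label, terms in CUSTOM_RULES.items():
        if any(term in lowered for term in terms):
            return Verdict(blocked=True, reason=label)
    # Borderline cases would be escalated to human reviewers at this point.
    return Verdict(blocked=False)

print(moderate("GG, well played!"))          # Verdict(blocked=False, reason=None)
print(moderate("You are trash, uninstall"))  # Verdict(blocked=True, reason='insult')
```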
Between August 1st and October 6th, Bodyguard monitored 19 of Team Vitality’s official accounts, including those of players, coaches and official team channels. More than 57,000 messages were analysed, and over 2,000 were blocked for violating moderation standards. The data showed that 3.6% of all messages were considered hateful, slightly below the esports average of 4.2%, while around 10% of total messages were classified as positive.
According to the report, these results align with industry trends and suggest that targeted moderation and awareness initiatives can reduce toxicity without stifling debate. However, additional data would be needed to fully understand these figures and the criteria used to classify and block messages.
Encouraging Results, but Clearer Data Is Needed
According to the report, the more than 2,000 blocked hateful messages appeared mainly on X and Instagram, where toxicity rates reached 4.6% and 2.5% respectively.
“Sport or esports, the challenge remains the same,” the report stated, mentioning that insults, hate speech, and online harassment can damage an athlete’s “well-being and performance.”
While the findings highlight progress, some definitions and comparisons remain unclear. The report does not specify the criteria behind the classification of “hateful” or “toxic” messages, but it cites examples including racist, homophobic, fatphobic and religiously motivated insults, as well as personal attacks on players and their families.
“Positive messages” are said to make up 10% of total interactions, but the shared report does not clarify what qualifies as “positive”: it is unclear whether this includes genuinely supportive comments or simply messages that are not hostile. Similarly, the mention that football receives more “unwanted content” (3.5%) than esports (1%) lacks an explanation of what “unwanted” entails, whether spam, off-topic remarks or inappropriate media.
Still, Team Vitality’s initiative marks an important step for the esports industry. Partnering with EVNIA and using Bodyguard’s moderation technology shows a tangible commitment to mental health and sets a precedent for how organisations can protect players and communities.
The report may not answer every question, but it reinforces one essential truth: success in esports must also mean safety, inclusion and respect online.