Rainbow Six Siege devs to make "toxicity management a priority"

Ubisoft has plans in place to continue tackling toxicity in Rainbow Six Siege.

Rainbow Six Siege has a toxicity problem, and the developers are taking various steps to manage the issue.

Ubisoft said the end goal is to "track negative player behavior," take action against those who act poorly, and implement features that encourage players to improve their behavior.

Short-term changes such as chat improvements and team kill tracking are currently in the works.

The Rainbow Six Siege team is currently "tracking the frequency" of racial or homophobic slurs by monitoring chat. Bans will be applied in "increasing severity" on a case-by-case basis.

The Code of Conduct clearly states such talk is very much frowned upon: threatening, abusive, racist, sexist, or other defamatory remarks will result in a ban.

Particularly "egregious offenders" can be permanently banned without warning, but a temporary ban may be applied beforehand.

As previously reported, the following ban durations will be applied in Rainbow Six Siege, depending on severity:

  • 2 Days
  • 7 Days
  • 15 Days
  • Permanent
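
As a rough illustration only, here's a minimal sketch of how escalating tiers like these might map onto repeat offences; the function and tier logic below are hypothetical and not Ubisoft's actual system, which is applied case by case:

```python
# Hypothetical sketch of escalating ban tiers; Ubisoft applies bans
# case by case, and this mapping is illustrative only.
BAN_TIERS = ["2 days", "7 days", "15 days", "permanent"]

def next_ban_duration(prior_offences: int) -> str:
    """Return the ban length for a player's next confirmed offence."""
    # Each confirmed offence moves the player up one tier,
    # capping at a permanent ban.
    tier = min(prior_offences, len(BAN_TIERS) - 1)
    return BAN_TIERS[tier]

print(next_ban_duration(0))  # "2 days" for a first offence
print(next_ban_duration(3))  # "permanent" for a fourth (or later) offence
```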

Team Killing and chat options

Various player options for Rainbow Six Siege will be made available during Year 3 Season 2 and Season 3.

Intentional team killing has become a major issue in Rainbow Six Siege, and the situation will continue to be addressed during Season 2.

Many offenders are "slipping through the cracks" of the current detection system, but the team plans to improve its ability to track long-term offenders "across multiple games and sessions."

Details on how the developers are tracking team killing were not provided as such information would "lead to exploiting."

During Season 3, Mute Chat and Chat Filtering options will be added to Rainbow Six Siege. The developers are working on a system which will allow players to mute individuals in text chat, voice chat, or both.

An automated system is also in the works which will censor text chat based on a chat filter list.

The goal with chat filtering is to replace words identified as offensive and notify players that their language was found to be unacceptable.

The number of times a player triggers this filter will be tracked. The team will then take action against those "intentionally having a negative impact on other player’s gaming experience."
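
For a rough sense of how word replacement, player notification, and per-player trigger counting could fit together, here's a minimal sketch; the filter list, replacement style, and notification text are assumptions rather than Ubisoft's implementation:

```python
from collections import defaultdict

# Illustrative filter list and per-player trigger counts; the real list
# and any thresholds used by Ubisoft have not been disclosed.
FILTER_LIST = {"badword", "slur"}
trigger_counts = defaultdict(int)

def filter_chat(player_id: str, message: str) -> str:
    """Replace filtered words, notify the sender, and count triggers."""
    words = message.split()
    triggered = False
    for i, word in enumerate(words):
        if word.lower().strip(".,!?") in FILTER_LIST:
            words[i] = "*" * len(word)      # replace the offensive word
            triggered = True
    if triggered:
        trigger_counts[player_id] += 1      # tracked so action can be taken later
        print(f"[notice to {player_id}] Your language was found to be unacceptable.")
    return " ".join(words)

# Example: the message is cleaned and the player's trigger count increments.
print(filter_chat("player123", "that was a badword play"))
```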

Ubisoft hopes these short-term changes will put a dent in toxicity, and says other plans to tackle the issue will be addressed ahead of implementation.

About the Author
Stephany Nunneley-Jackson

News Editor

Stephany is VG247’s News Editor, with 22 years' experience (15 of them at VG247). With a brain that lacks adhesive ducks, the ill-tempered, chaotic neutral fembot does her best to bring you the most interesting gaming news. She is also unofficially the site’s Lord of the Rings/Elder Scrolls Editor.