“For the past few years, Microsoft and Two Hat have worked together to implement proactive moderation technology into gaming and non-gaming experiences to detect and remove harmful content before it ever reaches members of our communities,” wrote Dave McCarthy, vice president of Xbox product services, in a blog post.
Two Hat makes “community management solutions” that use a mix of artificial intelligence and other technologies to monitor online interactions — things like online chats — in order to filter out or report harmful content, such as hate speech, violent threats and cyberbullying. Microsoft said Two Hat’s tools allow people to decide what they’re comfortable with and what they aren’t.
Over the past few years, Microsoft has begun setting new moderation rules for its online players. It has also publicly discussed new tools it’s working on, like content filters, to help people avoid toxic chats. Last September, Microsoft released a new Family Settings app to give parents and caregivers more control over their kids’ activity on Xbox consoles.
The terms of the deal weren’t disclosed.