Understanding Trust, Safety, and Community Health at Twitch



The Trust and Safety team at Twitch says it spends its time making sure users are protected and guidelines are enforced on the platform in accordance with its terms and conditions. During TwitchCon 2022 at the San Diego Convention Center, the panel discussing policy included: 

  • Angela Hession, VP Trust & Safety
  • Alison Huffman, VP of Product, Community Health
  • Connie Chung, Head of Global Policy, Trust, and Safety
  • Doug Scott, Chief Customer Officer

Each gave some insight into the tools and rules they are implementing to make Twitch a fair and equitable space for users. 

According to Hession, “I think because of the ever-evolving nature of life that response time is essential. I think it’s not as acute for static upload video. So our response time is 24 hours, 7 days a week, 365 days a year, and it’s ensuring this admission and how it escalates and prioritizes harm.” With 8 million unique communities going live every month and roughly two and a half million viewers watching live streams at any given moment, Twitch has an enormous user base to monitor.

For the team, accuracy and nuance come first in their job to protect users on the platform, and being quick and efficient is their approach to solving these problems. There are adversarial actors, as Hession notes, as well as people whose core beliefs are bigoted and harmful; Hession groups both kinds of users together when filtering abuse into buckets.

Twitch also relies heavily on its volunteer-based model of users known as mods to assist in this capacity. While many moderators may not have the knowledge base to understand safety and security, Twitch says it is addressing that gap through programs like the Safety Center, and it is also growing its moderator program. 

According to Huffman, the use of real people is important to the platform: “We can’t do this just through tech because it’s live and we don’t know what’s happening.” Huffman started as a community moderator in 2007; for her, much of this work has been a learning experience in building an operation.

What makes this social network unique is that, unlike platforms where static content can be preemptively reviewed and scanned before it goes live to the world, Twitch’s content is broadcast in real time. According to the policy team, the magic of Twitch happens when the streamer and chat are interacting, and any delay breaks that connection. Huffman went on to say that Twitch relies on a partnership with the community to help keep the platform and users’ homes safe. “It means we are having streamers and channel moderators who help keep their community safe to their standards.”

Twitch has also made it possible for moderators to report people who violate community guidelines. The process is designed to be intuitive and clear, so that users can file secure reports to the operations team and offending material can be reviewed and taken down as quickly as possible. 

“We really needed to find that right connection between the technology that helps scale and then the people who have the right nuance of judgment,” says Huffman.

The company is also thinking about education. “It’s not just about our rules, but it’s why we’re doing this,” says Hession. Twitch has kicked off programs that give more information about the underlying logic of its community guidelines; the livestream platform is doing that through music, articulating why it has its policies and how they prevent harm. The policy team wants to empower streamers to build the communities that are right for them, recognizing that Twitch can’t do it all and can’t do it all instantly. Hession says, “The best way to protect our creators is to give them powerful, easy-to-use nuanced tools that let them build the community. We want to be able to have communities that you can set up to tailor that own individual experience which I think is really important.”

Panelist Scott says, “God bless the mods. We love our mods.” When it comes to suspension on Twitch, panelist Chung says, “Suspension rates are the severity; you want to make sure it’s definitely proportional if it’s very low.”

Twitch believes its user community has been very clear about what it expects from the experience, and that users want control over that experience. The team is adamant that as the service grows and evolves, the safety experience needs to do so as well. “I think we’re very lucky that we have a very engaged community. They care about safety as much as we do,” says Hession. 

As far as operations go, Twitch is proud to share that it has resolved 80% of user reports within 10 minutes, which, according to the company, is industry-leading. Chung says she’s amazed at how much the community wants to follow Twitch’s goals. “I think that we’re going to continue to be really excited for that for next year. We’ve got a number of policy options that are going to come deepening our understanding of harassment, sexual harassment, and that is changing in the industry. Twitch is making sure that we stay abreast of the trends so that we can protect our community but also have good listening skills,” she says.

You can learn more about safety practices happening at Twitch here.
