Cathryn Weems on Content Moderation and Leveraging Transparency Reporting to Build Trust
E17

How do tech platforms develop clear content policies while balancing user freedom, regulatory requirements, and cultural contexts? What does it take to scale trust and safety efforts for billions of users in a rapidly changing digital landscape? Navigating these challenges requires foresight, transparency, and a deep understanding of user behavior.

In today’s episode of Click to Trust, we are joined by Cathryn Weems, Head of Content Policy at Character.AI, to explore the intricacies of building Trust and Safety policies. Cathryn shares her extensive experience shaping content policies at some of the world’s largest tech platforms, from crafting transparency reports to addressing complex government takedown requests. She offers unique insights into balancing global scalability with localized approaches and explains why clear, consistent transparency reports are key to fostering trust.

In this episode, you’ll learn:
  1. The Art of Content Policy: Cathryn explains the challenges of defining “gray area” content and how tech platforms can develop policies that are clear and enforceable at scale.
  2. Transparency in Action: Gain insights into the evolution of transparency reporting and how these reports build trust with users while navigating government regulations.
  3. Women in Tech Leadership: Cathryn shares advice for aspiring women leaders in Trust and Safety, including strategies for negotiating compensation and carving a path for yourself in a male-dominated field.

Jump into the conversation:
(00:00) Meet Cathryn Weems
(01:10) The evolution of Trust & Safety as a career path
(05:30) Tackling the complexities of content moderation at scale
(10:15) Crafting content policies for gray areas and new challenges
(14:40) Transparency reporting: Building trust through accountability
(20:05) Addressing government takedown requests and censorship concerns
(25:25) Balancing cultural context and global scalability in policy enforcement
(30:10) The impact of AI on content moderation and policy enforcement
(35:45) Cathryn’s journey as a female leader in Trust & Safety
(40:30) Fostering trust and improving safety on digital platforms