Cathryn Weems on Content Moderation and Leveraging Transparency Reporting to Build Trust
E17

Cathryn Weems [00:00:00]:
If you don't plan ahead of time. So if you have to report on January through June of whichever year, and you don't start figuring out the data that you need to collect on January 1st, if you don't start thinking about that transparency report till July, when that period has ended, you're screwed. You need to be thinking about it a few months before January starts so that you can make sure that you are tracking and logging the data in the right way, with all of the right fields that you're required to report on. Otherwise you won't have the data. I mean, you can't possibly go back and redo six months' worth of work. There's lots to think about, regardless of whether you're a small company or not.

Sabrina Pascoe [00:00:36]:
This is Click to Trust, a podcast that delves into the intricate challenges of protecting online communities. Hi Cathryn, welcome to Click to Trust. How are you doing today?

Cathryn Weems [00:00:46]:
Good, thank you. Nice to see you again.

Sabrina Pascoe [00:00:48]:
We connected a really long time ago because we both love nerding out over content policy. Anyways, you and I have known each other for a while, but do you mind introducing yourself to our listeners really quick?

Cathryn Weems [00:00:58]:
So I'm Cathryn Weems, and I've been working in Trust and Safety for, I think it's almost 25 years now, and I've been at a variety of different tech companies with names that you would have heard of and recognized. Currently I'm the Head of Content Policy at Character.AI, which is an AI startup. And I've been there for, I think I might be into my third month, but it's about two months, somewhere around there. Previous to this I've done a variety of different things in Trust and Safety, from, obviously, content policy and policy development, as you alluded to. I've also done some legal operations work and child safety, and handled some of the content regulations that we've seen come out from various governments over the last, I don't know, five, 10 years. And then randomly, just for, you know, full disclosure, I actually did do some consulting for Trust Lab, I think a couple of years ago now. So yeah, I'm somewhat familiar with the company, at least as it was at that stage.

Sabrina Pascoe [00:01:55]:
Absolutely awesome. I'm super excited to talk to you today because I think you genuinely have more Trust and Safety experience than anyone I know or have met. One question that I always like to ask folks, especially those who have been in Trust and Safety for a while, is: how the heck did you become a Trust and Safety professional? Because I know for me it was not something that was ever discussed on career day. It was like doctor or lawyer or, like, chef. Trust and Safety professional never really came up. So how did you kind of stumble into this field?

Cathryn Weems [00:02:26]:
Yeah, for sure. So it definitely wasn't shared on career day, even in college. You can see some gray hairs; they're either from Trust and Safety or maybe from the children, but I've been doing this a while, as I said and as you said, and the Internet was only really a fraction of what it is now at that stage. There were still people doing things that needed Trust and Safety, you know, even before I started working on the Internet, of course, but a career in Trust and Safety wasn't even a phrase that would have existed until probably about 2010, in all honesty. And actually, even then, probably not a career in Trust and Safety, more just, I work in Trust and Safety, or I have a Trust and Safety job.

Cathryn Weems [00:03:08]:
I absolutely fell into it. Which is true, I think, for everyone who's got more than about five, maybe eight years of Trust and Safety experience. If you've got more than five to eight years of experience in Trust and Safety, you almost definitely fell into it one way or another, from what I can tell. If you're the random person out there who planned a career in Trust and Safety 10 or so years ago, I would love to hear from you, because I'm interested. But the way I fell into it was that I had nannied after college. I didn't really know what I wanted to do, and I nannied for a family in Ohio, so I spent some time in the States and I really liked it, and I'm originally from the UK, if that isn't immediately obvious. So I was looking at different jobs, and there was a job advert for Yahoo at the time, and I answered that job advert and got hired at Yahoo in the UK. The first six weeks were out here in Silicon Valley, and then after a couple of years I was able to get a transfer out to the Bay Area, and I've been here ever since.

Cathryn Weems [00:04:10]:
But I definitely fell into it, because Trust and Safety just wasn't a concept back then; it has only really been a concept for the last 10, 15 years at most. But yeah, I've been doing Trust and Safety work for all of that time.

Sabrina Pascoe [00:04:23]:
Yeah, absolutely. I mean up until recently, I would say within the last couple years I've seen universities that are starting to offer courses and curriculums in Trust and Safety. And part of me is like, oh, that's so cool. That makes me so happy. And the other part of me is like, well, you guys are lucky because the rest of us, we were just winging it for a while.

Sabrina Pascoe [00:04:43]:
So you worked for a lot of different tech companies. You've worked for Yahoo, and then I know you transitioned into T&S policy for Flickr. You also worked on Trust and Safety policy and transparency reporting for YouTube, which I can imagine was just a behemoth of a project. I say this for our listeners who don't know, but anybody who, like, doesn't live under a rock knows that millions of videos are uploaded onto YouTube every single day. So I've always wanted to ask you, how do you even approach content moderation on a platform like that, where you've got billions of users and pieces of content uploaded every day and lots of user generated content? And then you also have advertising, and there's just so much content on YouTube. So I've always wondered, how did you approach that challenge?

Cathryn Weems [00:05:36]:
So, first of all, it was over 10 years ago, so the volume of content, while still ridiculously large, was not quite as large as it is today. I think at the time that I was there, the stat was 48 hours of content uploaded every second. I think that was right. And I think it might have gone up to 72 hours of content every second by the time I left; I was there about two years. I can't even fathom what it is now. It's probably a week or more of content, maybe even a month.

Cathryn Weems [00:06:07]:
I can't even fathom it when you've got a number that's so large. I just can't wrap my head around it. And so at some point...

Sabrina Pascoe [00:06:18]:
You said 72 hours of content per second? Like, even then?

Cathryn Weems [00:06:21]:
Yeah, it was definitely two days' worth of content, and it might have gone up to as much as three days, but it was somewhere between two and three days' worth of content uploaded every second. It's just unfathomable. So, first of all, the amount of content was lower than the amount of content on YouTube right now. But also, I wasn't really thinking about the volume of content. Had I been thinking about the billions of users and the volume of content, I think I would have been paralyzed to some extent, because it is just so unbelievable. I knew it was a global platform, of course I knew it was popular, and that alone was enough.

Cathryn Weems [00:07:00]:
The actual specifics of quite how popular and quite how global and quite how many hours of content or how many videos wasn't really something I focused on so much. But the way I handled it, I think, was this: you have so many people, however many people it is, and obviously the team there was bigger than at the very, very small companies, because it needed to be. It wasn't as big as some of the teams I've heard of at some companies, and probably even in other parts of Google the teams may have been larger at the time, but you can only do so many things in a day. And as much as we worked very, very hard, we had a really great operational team that we worked closely with, who were phenomenal, who helped handle a huge volume of the reported content and any of the proactive review that we were doing. We could only review so many individual pieces of content that were gray areas, or figure out, you know, some brand new thing has happened and then how do we possibly figure out policy for this? So my role specifically there was working on the policies for content moderation, so whether or not you can do certain things in your videos, or in the comments potentially. And a lot of those policies were already written, so a lot of my work was seeing if there were any gaps where we needed to further clarify and/or create new policies.

Cathryn Weems [00:08:21]:
So before there were any harmful challenges on YouTube, there wasn't necessarily a policy about harmful challenges on YouTube. Then there are harmful challenges on YouTube and you see the first one, and you're like, oh, well, maybe that's just an edge case, and so we'll figure out an answer or a way to handle that edge case. And then you see a second thing that's kind of similar but kind of different, and you're like, hmm. By the time you've seen the third one, or even once you've seen it twice, you know you're going to see it three to a million more times, right? Especially because of the scale, you know by the law of averages you're going to see it more. And so once you've seen something a few times, you're like, oh yeah, we actually need a policy specifically for this, and then it's figuring out the level of harm that could happen or is likely to happen, what audiences, and all of those different things to figure out the nuances of policy, which I know you are very familiar with, that kind of thought process and research and, you know, writing process. And so yes, there were maybe millions of pieces of content that fell into whichever category, the good stuff, the bad stuff, and then the in between, but I wasn't thinking about the scale, because I think I would not have been able to do my job.

Cathryn Weems [00:09:29]:
But the other side of what I was doing, to get back a little bit to your transparency question or mention: the other part of my role was overseeing the people that were handling the legal requests coming in from governments around the world. They wanted us to take down videos because the videos were against their local laws, or they were saying they were against their local laws. The team I had was reviewing those requests to make sure that they were, first of all, actually valid requests. And then once we verified that they were valid, we needed to review whether the videos did actually violate YouTube's global terms of service or our content policies. And if not, did they violate the law in those countries, and did we need to geo-block that content? And so that's where the transparency piece came in more, because of the transparency reporting. And we can talk about this in more detail if it's interesting...

Sabrina Pascoe [00:10:19]:
Let's take a step back, and for any of our listeners who aren't familiar, what is content policy? Why is it important? And why is it actually hard to write? Right? It's not just, I mean, obviously, disclaimer, don't eat those, like, cleaning pod things. But I remember when I started working at Google, the pandemic hit and we didn't really have a global pandemic policy, right? So I kind of want to level set for our listeners: what is content policy? Why does it matter? Why is it so challenging to nail down? And then I think we can get into that really interesting question of, you know, the relationship between tech companies and governments, and what is a legitimate, like you said, takedown request for violating local laws versus where does that start to get into censorship? How much power or authority do tech companies have relative to governments when it comes to requests that amount to censorship? I think those are both really interesting topics, but I kind of want to hit them one at a time.

Cathryn Weems [00:11:21]:
Content policy, for me, is the set of policies or guidelines or rules, whichever word you prefer, that dictate the content that users can share or upload or engage with on your platform, and what that looks like. Obviously for YouTube, it's: what can you make a video about, and how can you title that video? It also relates to the comments that you could make on other people's videos. On something like a podcasting app, it would be: what is the audio content you can do? Maybe they're less worried about video content because it's only audio. But it's just the set of guidelines or policies that dictate what is and isn't okay, what the rules of the road are for that site specifically. And obviously there is a lot of similarity across many of the big and small companies; every site I've ever looked at, every guideline page I've ever looked at, they all say that, you know, illegal content, child abuse, that kind of content is not okay. I don't think any site is allowing that content. But where people draw the line on, say, profanity: profanity is something that many of us hear in the shows we watch, the movies we watch, the music we listen to, and maybe we're okay with it, maybe we're not. Maybe we change how we speak around younger members of our family, and maybe we don't, who knows? But profanity, obviously, is one of those areas where if you are a site that's more targeted at a younger audience, or if you're a site exclusively targeted at 18 plus, then your views on where the line might be for profanity are going to be different.

Cathryn Weems [00:13:02]:
Whereas allowing, you know, very, very hateful content or threatening content or threats of conducting, like, a mass shooting or something, probably most sites are going to say that that's against the guidelines. So that's what content policy is for me, but...

Sabrina Pascoe [00:13:17]:
I agree with you 100%. That's always how I kind of talk about it: it's like the rules of the road, or even, if you're a philosophy nerd like me, I call it the social contract. But what you were highlighting there was really validating to hear, because I think a lot of the time people are like, oh, it's so easy, why is this difficult? And I think especially because people who... and I don't mean this in a negative way. I love my technical folks.

Sabrina Pascoe [00:13:41]:
We couldn't exist without them, but when you work in a different kind of part of the industry, everything is on a 0/1 binary. For example, if somebody says, is this policy violating? There's a million, and you know this, there's a million different questions that I have to ask in response to that. Like, what type of platform are we talking about? Is this media? Is it a digital marketplace that connects users online? What jurisdiction are we operating in? What are the local laws? What are the cultural norms around what's appropriate or inappropriate? Is your target audience minors? Are we talking about monetized or non-monetized? And so that's where meeting you was such a validating experience, because I was like, yes, this is actually really hard. Content policy is really hard work.

Sabrina Pascoe [00:14:30]:
And it's not as straightforward as don't be hateful.

Cathryn Weems [00:14:33]:
So when I was at Flickr, we had something in our community guidelines. I didn't write it, I wish I'd written it, but I can't take credit for it. The line was, and I may be getting it slightly wrong, something along the lines of: don't be creepy. You know that guy; don't be that guy. And I was like, yeah, kind of. That's it.

Cathryn Weems [00:14:52]:
In theory, if community guidelines or content policy just said, don't be awful, don't be awful to other people or to yourself, that kind of is what they all say. But that isn't sufficient, because what awful means for me isn't even what it means for you. And we've both been doing this kind of work for a while, so we might have a slightly more similar definition, but depending on where you grew up, depending on how you grew up, depending on your religious beliefs, depending on your worldview, you're going to have different perceptions and interpretations of this stuff. And for content policy, you can say, on this fictional site that we work at, you are not allowed to do A, B and C; however, you are encouraged to do D, E and F. It's obviously slightly more nuanced than that, but if you're writing policy, a lot of times...

Cathryn Weems [00:15:42]:
We'll come on to AI moderation maybe at some point, but historically, and by historically I mean before about two years ago, we were all writing for humans. And the people who were reading these policies and having to understand and apply them on a daily basis often didn't have English as a first language, on purpose, because we wanted to make sure we had global languages covered. Because even though I speak English, not everyone in the world speaks just English; lots of people speak English, but they don't only speak English. And so these companies hire moderators around the world who have English as well as something else.

Cathryn Weems [00:16:20]:
But you need to write in a very, very clear, easily understandable way. And the only goal of policy is the outcome; it's irrelevant whether it's the most beautifully written document ever. That might be nice, but if it doesn't get the intended outcome enough of the time, then it's kind of useless, right? And I said enough of the time, not all the time, because you're never going to get everyone. And even if that everyone is only two or three people, and even if it's just one person and you wrote the policy and you're applying the policy, you're still probably not going to be 100% consistent, because people...

Sabrina Pascoe [00:17:00]:
This is why I'm going to keep saying you're one of my favorite people in Trust and Safety: because you have so many golden nuggets, but if you're not careful, you'll miss them. I'm going to try and collect these golden nuggets. The first one was the ick factor, right? Whenever I'm training folks who are new to Trust and Safety and they're like, it's creepy, it makes me uncomfortable, I don't like it, I'm always like, no, no, no, no, no. We do not make decisions based on the ick factor, because it's such an opportunity to introduce our implicit biases and forms of discrimination, right?

Sabrina Pascoe [00:17:30]:
Like, saying that somebody is creepy does not necessarily mean that they're breaking the law or that they are dangerous to your community. It could just literally mean that you find that person unattractive. So that's one thing: I think, slowly over time... when they started out, a lot of platforms started with really poorly written policies, because Trust and Safety policy wasn't a career yet, right? And we had to slowly migrate these problematic policies and turn them into what are now considered best practices. The other thing is about working with vendors and BPOs, which I find super interesting, and this question about localization and the dynamics between platforms and vendors. We had Alice Hunsberger on the show the other day, who you know very well, and she...

Cathryn Weems [00:18:16]:
She's one of my favorite people in Trust and Safety.

Sabrina Pascoe [00:18:19]:
She's in the circle too, don't worry. But she has really informed, great views about the dynamics between platforms and vendors and how they can cooperate better together. One thing we were talking about was that you have to make it really straightforward. Well, there are kind of two things. One is to make sure that the policy is actually tied to, like, purpose and enforcement, right?

Sabrina Pascoe [00:18:42]:
Like, you can define all of these terms, but if you don't define them and also tie them back to something, what is the point of defining terms? You could say hate speech isn't allowed on our platform, and my follow-up question would be: cool, what's the next step? Right? Like, why? But also, does that mean we're banning users? Does that mean we're suppressing content? At what point are you negotiating, well, not negotiating but liaising, with law enforcement? So it has to tie to a concrete action. But also, when you're working with BPOs, you have to be really mindful of cultural context. I was talking to someone else about this the other day, but with all the elections coming up and people outsourcing election misinformation moderation to BPOs based in Honduras or the Philippines, you can't just say, no election misinformation. You have to give them the context of: here are the people who are running for president in this country, here are the folks that are considered political entities and parties, here's what they're advocating for, and then describe...

Sabrina Pascoe [00:19:42]:
Like, let's say somebody asked me to moderate content about elections in Japan. I would need to really educate myself about the Japanese political system first. Yeah, anyways, I feel like this is why I love talking to you, because we could talk about this forever. Okay, so we talked...
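
To make the point about tying policy definitions to concrete enforcement a bit more tangible, here is a minimal Python sketch of a policy-to-enforcement mapping. Every label, action, threshold, and URL in it is a hypothetical illustration, not any real platform's taxonomy or either speaker's actual system.

```python
# Hypothetical sketch: each policy label carries a concrete enforcement action,
# so "X isn't allowed" always answers the follow-up question "then what happens?"
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    REMOVE_CONTENT = auto()    # take the content down globally
    GEO_BLOCK = auto()         # restrict it in a specific jurisdiction only
    AGE_RESTRICT = auto()      # keep it up, but gate behind an age check
    SUSPEND_ACCOUNT = auto()   # action the account, not just the content
    ESCALATE_LEGAL = auto()    # route to the legal / law-enforcement liaison


@dataclass
class PolicyRule:
    label: str                 # e.g. "hate_speech", "election_misinfo"
    definition_url: str        # where moderators find the full definition
    default_action: Action     # the enforcement the label actually triggers
    requires_regional_context: bool = False  # e.g. local candidates, parties


RULES = {
    "hate_speech": PolicyRule(
        "hate_speech", "https://example.com/policy/hate", Action.REMOVE_CONTENT
    ),
    "election_misinfo": PolicyRule(
        "election_misinfo", "https://example.com/policy/elections",
        Action.REMOVE_CONTENT, requires_regional_context=True
    ),
}


def enforce(label: str, region: str) -> Action:
    """Map a policy verdict to a concrete action; fail loudly if the label
    needs regional context (the election example above) and none was given."""
    rule = RULES[label]
    if rule.requires_regional_context and not region:
        raise ValueError(f"{label} needs regional context to be enforceable")
    return rule.default_action
```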

Cathryn Weems [00:19:57]:
One thing, though, that you said before, that the content policies were poorly written: I agree that they are better now, they are more detailed, but if you think about any company you've been at, no specific company, sometimes those policy documents are really long. They're really long. And especially when we still have enough humans having to review them and apply them, even if they're not expected to remember them. Going forward, if we have machines reading them, obviously the length of the document is almost irrelevant. But I understand why we have so much complexity, because we've seen that we need to explain everything. We have to define every single thing so that we can have a consistent definition and then come back to it regularly throughout the policy document.

Cathryn Weems [00:20:45]:
That all makes sense to me. So I agree with you that they're better written now. I think when they were first written, we didn't have all of the use cases or all of the learnings that we've had from doing this for so long; we now know that we need to be really specific about this stuff. We thought that we could say, just don't upload any sexual content. But what does sexual content mean? I know what I think I mean by it, but it depends, and not everyone has the exact same definition of what sexual content is. And so, yeah, you need lists of what are sex acts and what aren't, and it can get really, really granular, so that you can all make the same decisions.

Cathryn Weems [00:21:27]:
And so I agree with you that they're better. I agree that they weren't as good as they could have been, but I think the reason is that we just hadn't gone through enough cycles of this, of realizing that we were probably being inconsistent. And it wasn't until probably 2014, 2015 that people started to be like, oh, we actually need to make sure we're being consistent. Because inconsistency is actually part of the problem that Trust and Safety teams, or companies, can have with getting users to trust them, right? Or getting the public to trust them. Because if you say we don't allow X, Y, Z, it doesn't matter what it is, then you need to be doing as good of a job as you can to remove X, Y, Z, right? So...

Sabrina Pascoe [00:22:09]:
So one thing that I've learned recently that's really helped me is having different documents for different audiences. I think you're right. We did move from a place where our policies were really vague to wanting them to be really specific, so that we could train people better and have better models, or have localized policies based on regions; there's a whole set of reasons. And then I think we've kind of swung back a little bit. Not to make them less specific, but to remember that there are people in queues moderating content who don't have the time to read a 48-page policy, right?

Sabrina Pascoe [00:22:43]:
And so you need to give them something that they can look at to understand key concepts and what kinds of things might be exceptions to the rule, without looking at a 48-slide deck.

Cathryn Weems [00:22:55]:
No, they can't read it, because...

Sabrina Pascoe [00:22:56]:
Yeah, and I think you touched on this earlier, but your policies are worth nothing if they can't be scaled and enforced, right.

Cathryn Weems [00:23:02]:
Exactly, exactly.

Sabrina Pascoe [00:23:04]:
You have to think about how you're setting your human moderators up for success, as well as how you're going to train models based off policies. One thing you touched on really quickly was transparency and trust, which is so, so important, obviously, to us as Trust and Safety professionals. And I know you've worked on transparency reporting in the past. So for any of our listeners who don't know what transparency reports are, can we talk a little bit about what they do and why they're important? And also some of the new regulations around transparency reporting that I think are really interesting.

Cathryn Weems [00:23:38]:
Yeah, for sure. So my take on this is that they started, I think, around 2011, 2012, the early 2010s, if that's how we're phrasing that decade. I think it was Google or Twitter that came out with the first two; I can't remember which order. The companies wanted to be transparent about some of the government requests they were receiving, because of the risk of censorship that came with those requests. A lot of the requests that come in from governments have non-disclosure orders attached to them, meaning you can't talk about them, and so you can't share with the public to say, oh my gosh, we got a request from whichever country about whichever piece of content to try and censor that content. Or maybe it really is violative, who knows, but you can't talk about it in some situations.

Cathryn Weems [00:24:33]:
However, adding that one request to a table of data along with all of the requests you get from that country, where you don't say what the content was, you don't say the specific action for that one piece of content, you just say in general what percentage of all the requests you received you took action on: that sort of compilation of all of the data related to those government orders is what became the initial transparency reports, as far as I'm aware from being around some of them in those relatively early days. So that's kind of what they are. They've transformed in some ways, in terms of not just companies voluntarily sharing the inside-of-the-black-box kind of stuff. They still do that.

Cathryn Weems [00:25:18]:
Most companies are still doing that piece, which is great. But transparency reports have also now been required by some regulations. I think the first one that I'm aware of was Germany's, from the NetzDG, the Network Enforcement Act, in 2018-ish, I think. And then France had one, Turkey had one, India, Korea, other places required specific transparency reports, with specific requirements for how to do them, how often to do them, where they needed to be posted, who they needed to be sent to, what they needed to include. And then obviously the most prominent example, that a lot of listeners may be familiar with based on previous podcasts you've done, is the EU's Digital Services Act, the DSA, where they've kind of incorporated, from what I can gather, at least the France and Germany requirements, expanded upon them, and required this transparency reporting by all the companies. That's my understanding of the history of transparency reporting. And I think they're an incredibly useful tool to share some of what's under the hood with the public or with journalists or with civil society, whoever's interested.

Cathryn Weems [00:26:29]:
They can get really unwieldy. Given some of the volume and scale that we talked about with YouTube, for example, the amount of government requests that they could be receiving could be very, very high, and so the volume of data that these reports cover can be a little hard to wrap your head around. A lot of them do also try and share the story. So: we actually saw a lot of requests from this country after a certain incident. Maybe there was a terrorist incident or something in a country, and maybe that country sent a lot of requests to try and remove content that was violative under their local law related to that terrorist incident, because maybe they were scared of future or follow-on violence happening in their country, on-the-ground violence. So there are some very legitimate reasons for these requests to come in, and I think it's a really, really good tool. As for the regulation that has required these transparency reports...

Cathryn Weems [00:27:24]:
I think the downside is that some of the requirements are quite extensive, and the process of actually compiling these reports can be quite arduous, especially if it's every six months or maybe more frequent. It just takes a lot of resources internally. Obviously, as companies get more familiar with the specifics of DSA transparency reporting, they will be able to automate more aspects of the data collection, and maybe the humans can then spend more time figuring out what the story is that they want to share. But they can be really, really time consuming to produce. I do think overall they're valuable, even though nowhere near enough people probably look at them.
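
As a rough illustration of the aggregation described above, individual government requests rolled up per country into counts and an action rate rather than disclosed one by one, here is a minimal Python sketch. The field names and the CSV layout are assumptions for illustration, not the DSA's required schema or any specific company's report format.

```python
# Hypothetical sketch: roll individual government requests up into
# per-country transparency-report rows (request counts and % actioned).
import csv
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class GovernmentRequest:
    country: str           # requesting government
    items_requested: int   # pieces of content named in the request
    items_actioned: int    # removed or geo-blocked after review


def aggregate(requests: list[GovernmentRequest]) -> list[dict]:
    # Per country: [number of requests, items requested, items actioned]
    by_country: dict[str, list[int]] = defaultdict(lambda: [0, 0, 0])
    for r in requests:
        row = by_country[r.country]
        row[0] += 1
        row[1] += r.items_requested
        row[2] += r.items_actioned
    return [
        {
            "country": country,
            "requests": n_requests,
            "items_requested": requested,
            "pct_actioned": round(100 * actioned / requested, 1) if requested else 0.0,
        }
        for country, (n_requests, requested, actioned) in sorted(by_country.items())
    ]


def write_report(rows: list[dict], path: str = "transparency_report.csv") -> None:
    # Write the aggregated rows; no individual request is ever disclosed.
    if not rows:
        return
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```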

Sabrina Pascoe [00:28:04]:
That was actually super informative, the history of how we got to where we are with transparency reports. I think a lot of people don't understand that tech companies are kind of put in this almost impossible position sometimes with governments that, like you said, legitimately will have requests for user data or takedown requests with gag orders attached to them. And that's where this interesting conversation about privacy and security takes place, right? Where privacy is being violated, quote unquote, in the name of public safety. But to your point, we needed to find a way to build trust with our users. And so I think transparency reports started with: these are the takedown requests we're getting, and these are the requests for user data that we're getting from various countries, to try and build some trust and also try and explain a little bit the awkward position that these tech companies are placed in. It's not easy telling a government no, right? If you disagree with a request, which they do; there are instances where they do do that, but it's not the most straightforward situation.

Sabrina Pascoe [00:29:05]:
But yeah, it's really grown. And I think if you've never taken the time to read a transparency report, I would highly recommend it. They're super interesting, and I've been really impressed to see how they've grown from those very basic early days to the transparency reports you read now, where there is a story and a narrative about: here are our policies, here are the different regulations that we're being asked to enforce globally, here are trends across abuse areas or jurisdictions. Or you can kind of play with the data and analyze it the way that you find most intriguing, which is nice too; you can cut and cross the data yourself. But I agree, I think they're really interesting, and they do a good job of showing, especially as a Trust and Safety professional... I think there's an assumption sometimes that tech companies don't care about this stuff. And I'm like, do you have any idea how many people work night and day coverage to make sure that Trust and Safety issues are taken seriously? And I think these transparency reports do a good job of showing people: hey, we really do care and we really are doing our best, and we might not be perfect, but here's everything we've done to the best of our ability this year to mitigate harm, and we will continue to improve and grow next year.

Sabrina Pascoe [00:30:17]:
And it's interesting too that you mentioned the Digital Services Act, because I was going to ask you, do you think the DSA is sufficient? One thing you brought up that I hadn't even really considered is that the DSA is one thing when you're a VLOP, right? When you're a very large online platform and you have a Trust and Safety team of 2,000 people. What happens when you're a Trust and Safety team of 10 people and you're still required to issue those transparency reports? And it sounds like even if there is some leniency, it's still quite an operational burden to try and meet those reporting standards.

Cathryn Weems [00:30:51]:
So one thing you said about transparency reports that I want to just touch on is that it's actually the one time, or maybe twice a year, because they're often every six months, where you get to actually show some of the things you've worked on as a Trust and Safety person. Because you often can't talk about the specifics, almost ever. You can almost never talk about the specifics unless you're in the comms department, where you're having some level of conversation about some of the work. And so it's an area where lots of teams can actually feel that level of pride in their work, and there is actually a public aspect to their work. So that's one other thing about transparency reporting that I think people probably don't think about as much. Anyway, related to the DSA, whether it's sufficient or not, I don't know if I am in a position to judge. But I do think that handling the DSA requirements as a smaller platform, obviously, yes, if you're required to do any amount of the same things as a big company, that's going to be harder when you have fewer people and fewer resources, of course.

Cathryn Weems [00:31:48]:
However, sometimes at some of the bigger companies there are so many people that you're coordinating across so many different departments. If you think about something like Google, where both you and I have worked, if you're trying to put one out, and I don't know if they do one Google DSA report or separate ones for the different products they have, but you can imagine there might be just one. And if you're trying to coordinate with the Drive team and the YouTube team and the App Store team and whatever else, I can't think of all the other Google products, that's its own challenge. So I think the VLOPs, yes, they have more resources, and there is maybe some more automation for some of the data collection, because they've been doing transparency reporting for longer, potentially because they were doing it voluntarily before it was required. But there are challenges, regardless of the size of the team, if you don't plan ahead of time. So if you have to report on January through June of whichever year, and you don't start figuring out the data that you need to collect on January 1st, if you don't start thinking about that transparency report till July, when that period has ended, you're screwed. You need to be thinking about it a few months before January starts, so that you can make sure that you are tracking and logging the data in the right way, with all of the right fields that you're required to report on. Otherwise you won't have the data. I mean, you can't possibly go back and redo six months' worth of work. There's lots to think about, regardless of whether you're a small company or a large company.
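
A minimal sketch of the "decide your fields before the period starts" point above: if every moderation action is logged with report-ready fields from January 1st, compiling the report later becomes a query over the log rather than six months of reconstruction. All field names here are hypothetical illustrations, not a required or recommended schema.

```python
# Hypothetical sketch: an append-only moderation log whose fields are chosen
# up front to match what the eventual transparency report must break down by.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class ModerationEvent:
    timestamp: str       # ISO 8601, UTC
    content_id: str
    policy_label: str    # which policy the decision was made under
    action: str          # e.g. "removed", "geo_blocked", "no_action"
    source: str          # e.g. "user_report", "government_order", "proactive"
    country: str         # jurisdiction, needed for per-country breakdowns
    automated: bool      # human versus automated decision, a common report field


def log_event(event: ModerationEvent, log_path: str = "moderation_log.jsonl") -> None:
    # JSON Lines append: each line already carries every field the report needs.
    with open(log_path, "a") as f:
        f.write(json.dumps(asdict(event)) + "\n")


# Example usage with made-up values.
log_event(ModerationEvent(
    timestamp=datetime.now(timezone.utc).isoformat(),
    content_id="video_123",
    policy_label="harassment",
    action="removed",
    source="user_report",
    country="DE",
    automated=False,
))
```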

Sabrina Pascoe [00:33:12]:
Yeah, absolutely. And I think for our listeners, definitely go and check out the transparency reports of companies like Google and Meta and TikTok, and X, if they're still doing transparency...

Cathryn Weems [00:33:24]:
They did do one somewhat recently, surprisingly.

Sabrina Pascoe [00:33:26]:
But yeah, it's really interesting. One thing that, I'm trying to remember, I think it was TikTok, I mean, I'm sure a lot of them do it, but TikTok was the one that I looked at most recently, where you can see how they've changed their transparency reporting over time to improve it. And it was really interesting to see, like you were saying, the historical progression of transparency reporting, and how you can see companies acknowledging their own shortcomings every year to try and improve transparency. Do you think that's because, now, because people...

Cathryn Weems [00:33:55]:
Because people at these companies care. You and I have worked there, not everywhere, but at these companies, people care. I don't think I've ever met anyone at any of these companies that doesn't care or is trying to do a bad job. Maybe there are some things that get released that are insufficient, maybe some decisions happen that are incorrect, but nobody's trying to do a bad job. People in Trust and Safety specifically, and also in other parts of the companies, care about what they're doing; they want to do a good job. And yes, bad things could potentially have happened, but that's not the intention. And I think that people just assume bad intent in a way that is kind of surprising.

Sabrina Pascoe [00:34:32]:
I think being a Trust and Safety professional in particular is kind of like being an unsung hero, a little bit. And so I think transparency reports are very validating. I want to move on to another topic really quick, because I want to make sure we have time. We've talked about content moderation and transparency reports and privacy and safety. One thing I wanted to ask you about is that you're one of few women that I know in a leadership position in tech. And in addition to transparency reports, you can go check out the DEI reporting at a lot of these major tech companies and see that there are not a lot of women in executive positions or leadership roles at some of these companies. So I wanted to ask you, kind of on behalf of our viewers, what that journey has been like for you, how you've managed to carve out a space for yourself, and maybe what advice you would give to more junior women in the field who are hoping to take on leadership roles in tech companies, or in Trust and Safety more specifically.

Cathryn Weems [00:35:34]:
Yeah, for sure. This is something I care deeply about, for a variety of reasons, many of which I'm sure are obvious. I feel incredibly fortunate to have had a number of female managers. The majority of my managers have been male, for sure, but I have had some great female leaders and managers, and leaders up through my reporting chain have been female. I've had some amazing leaders that I've worked for. So I feel very, very fortunate to have had that, and some of them have been incredibly inspirational. I try and give back where I can and try and be that.

Cathryn Weems [00:36:10]:
Because I am a female leader, I'm also a gay woman, I'm also older in tech than some, I wasn't always, but now I am. And so I try and be cognizant of that in the situations I'm in. And if I am able to speak up, because I have had more senior roles or I am older or whatever it might be, whatever advantage I might have over somebody more junior, I try and be aware of that and see if I can help in those situations, because I've benefited from that myself.

Cathryn Weems [00:36:42]:
Yeah, I've definitely been in male-centric environments, of course I have, but I don't think about it so actively. Even when I was at Twitter, before the takeover happened, I think the larger department that I was sitting within was almost 50/50 male and female. And I think we might have even hit over 50% for leadership positions; I think more than 50% were actually female. So that was a remarkable situation. And there was also diversity on a bunch of other spectrums within that group of people.

Cathryn Weems [00:37:13]:
It was wonderful and really, really admirable that they had been able to pull such a phenomenal group of people together. And it looked a little different, or in fact quite different, from a lot of other teams even within Twitter, let alone the rest of tech. So I feel like I've definitely benefited from some wonderful environments. As far as advice, I don't know if I did anything specific related to my gender that got me into the roles that I got into. Anyone who's worked for me has probably heard me say this already, but if you're looking for a promotion, regardless of your gender, regardless of the promotion you're looking for, the most important thing that I'm looking for is: are you doing your current job well, or potentially better than average? Because some people fall into the trap of wanting the next thing or wanting a slightly different role, and focusing on that and talking about that, to the detriment of their current role.

Cathryn Weems [00:38:11]:
It's like, we've hired you to do this current thing and we need you to do that. And if you knock that out of the park, we'll have the understanding and we'll be able to see that you have the ability to do more and take on more, because you're going to knock that out of the park probably as well. So that would be the first thing I would say to anyone of any gender at any level. And also, talk to your manager to make sure they know what direction you want to go in in your career, so that they are more likely to be able to provide you opportunities, again, regardless of your gender or their gender. Make sure that you have those conversations proactively. And maybe that does come back a little to gender; women may not think to have those conversations as proactively, potentially. So I think that's something that could be related to gender specifically. And then the thing that is definitely related to gender that I do tell people, I mean, everyone should do this, but definitely women should do this:

Cathryn Weems [00:39:06]:
Please negotiate your salary when you're starting a new job. Ask for more money, ask for a starting bonus, ask for a sign-on bonus, just negotiate in whatever way you can. Not all companies will allow you to negotiate, and they'll come back saying that this is the final offer and that is what it is. But even if they say that and you haven't yet negotiated, try to negotiate, and they can come back and say no. I've never heard of a situation where they've turned around and pulled an offer because you tried to negotiate, because you don't have to do it in an aggressive way; you can do it in a respectful way. But men in general, and we're obviously stereotyping here, will ask for that extra money; they will negotiate. Even when it's promotions within a company, it doesn't have to be a new company.

Cathryn Weems [00:39:49]:
When it's promotion time or annual review time, men in general will negotiate more than women in general. And so you're losing out on money, and the difference compounds over time; it really adds up. So that would be the most practical thing I would say, because I think all of us are doing these jobs because we're not yet independently wealthy. We like them and they're fun, but also we need to be paid, right? So that would be the most important thing that I think is about gender.

Sabrina Pascoe [00:40:17]:
No, I couldn't agree more. I was very fortunate. When I first joined Google, I actually came from a nonprofit background and didn't really have any of these skills, because obviously, when you're working in the nonprofit world, you're not really in it for, like, the seniority or the compensation. But it is important, right? And because I'm Mexican American, when I joined Google I had the opportunity to join a mentorship program that pairs more junior Latina employees with more senior Latina employees. I did it twice, and I was lucky enough to get paired with the most amazing women, who taught me a few things that, to this day, have changed my life. The first one was: stop being so shy and, like, brag about yourself, and track your accomplishments and share them with your manager on an ongoing basis.

Sabrina Pascoe [00:41:04]:
Otherwise, it's not necessarily intentional, but you'll just slip through the cracks. And so I have kept an accomplishment tracker ever since, so that when it comes time for performance reviews, I know exactly how I've contributed to the overall mission of the company. The other one that I think they told me was literally sitting down with your manager. And it's not rude or abrasive; it's table stakes to be like, hey, I'm really interested in moving to the next level. Do you think I'm ready? And if the answer is no, why aren't I ready? And what would you need from me to get to that next level, and by XYZ date, by, like, Q2 of 2025 or whatever? And then, speaking of social contracts...

Sabrina Pascoe [00:41:45]:
You're basically making a social contract with your manager that says, okay, I'm not ready yet, but here are all the things that I need to work on to get there. Then you go back to that dialogue and say, okay, I did, you know, I did all the things.

Cathryn Weems [00:41:58]:
These are great tips, for sure. The brag doc, or accomplishments doc: if you feel weird about bragging, then call it an accomplishments doc. And talking to your manager and asking. Like, some companies, especially some of the bigger companies, have career ladders, and they have specific skill sets and competencies that they want to see for each of the different levels. So you can actually go through and do your assessment yourself and then talk to your manager. Most people will tell you that they're doing better than they are; very few people will tell you all the ways they are not meeting the requirements. But it is interesting how the salary question is the one where I notice gender most.

Sabrina Pascoe [00:42:37]:
Well, that's the other one that I didn't know anything about then, but I do now. And I agree with you 100%: negotiate, negotiate. Because women especially, and I talk about women, but if you talk about intersectionality, women of color, queer women, women are consistently, from what I've seen, underleveled and underpaid still, even today, compared to their male counterparts. And a lot of that is trying to navigate the delicate dance of, hey, I'm super... and like you said, it doesn't have to be abrasive.

Sabrina Pascoe [00:43:11]:
I'm super excited about this opportunity, but my compensation expectations are actually closer to XYZ. Can you meet that, or are there other benefits that you're willing to negotiate, like return to office or an education stipend or subsidized care?

Cathryn Weems [00:43:26]:
Covering your cell phone, or just anything, really. Yeah, 100%.

Sabrina Pascoe [00:43:31]:
So, yeah, I feel like we could do a whole discussion on negotiation, but anyways, I know we're getting close to time, so I just wanted to say thank you so much. This is why you're one of my favorite people to talk to. I think we've learned a lot today about content moderation, transparency reporting, privacy and security, and also how to advance as a woman in tech and in Trust and Safety. So I really appreciate you coming on.

Cathryn Weems [00:43:56]:
Of course. Thank you for having me. It was great to talk to you. Awesome.

Sabrina Pascoe [00:43:59]:
Thanks, Cathryn. Thanks for listening to Click to Trust. Don't forget to subscribe to never miss an episode and we'll see you next time on Click to Trust.