Inside Patreon’s creator-friendly content moderation tools
When you’re hanging out online, chances are the website you’re using has a process in place to make sure everyone’s playing by the rules, so problematic material gets removed and the space remains safe. Easy, breezy, right? Not always… When policies are murky and your content is suddenly removed without a clear reason, you might be caught off guard and feel like you’re being punished, or that you don’t have agency over your own creative work. But at Patreon, with updated content moderation tools, we take a more transparent, creator-first approach to upholding our community guidelines.
In this episode of Backstage with Patreon, Senior Product Manager Chris Price offers a glimpse behind the scenes of how Patreon’s trust and safety tools and content moderation processes work, as well as the philosophies that guide them.
Hello creators! You're listening to Backstage with Patreon, where we open the curtain on how to build a thriving business on Patreon. I'm Brian Keller from the Creator Success Team, and today's guest is Chris Price, Patreon's product manager for Trust & Safety. He specializes in the connection between product tools and the operational teams that help keep users safe online, and he spent time in similar roles at Instagram and Facebook before coming to Patreon.
Now, Patreon's philosophy for content moderation and these tools is really about putting more control into the hands of creators as part of building your membership business on Patreon. So whether you've interacted with content moderation on Patreon or on other platforms, or you've just heard about those kinds of situations from other creators, we want you to have a solid understanding of how this process works for and with creators. So let's get started with Chris Price on Backstage with Patreon, and some simple definitions to start with. What is Trust & Safety from the product perspective that creators should be paying attention to?
Thanks, Brian. Thank you for having me. To start, I think you can break apart the term trust and safety and take it one piece at a time. When we're talking about trust and safety, we really try to organize the thinking from the product world into two larger buckets. First, you have trust. When we're talking about trust, we mean the trust between the users and the company. In this case, that's the trust between Patreon and the patrons and creators who use our platform. What we focus on, in terms of what makes trust and what breaks it, ranges from account security to the transactions that go through our platform: making sure that patrons' money successfully goes to the creator they're trying to support, that we're keeping your private information safe, and that we're keeping the creator's information safe.
When we talk about safety, we're really focusing on what it looks like to have an experience on Patreon. What we keep an eye on in the safety space is more around what content and creations are shared on Patreon or through Patreon as a benefit, who is interacting on the platform, who we allow to use the platform and leverage those services, and whether the community feels like this is a safe and comfortable place to engage not only with a creator but with one another. That's really the starting point for anything that we consider to be trust and safety within the company, but also within the tech industry more broadly.
Yeah, it's definitely one of those teams where if everything is going well and we're doing the right things for our users, you may never hear about it, you may not need to worry about it, but it's always there in the background. We're always trying to support our users there. So maybe let me ask you that question. What attracts you to this kind of role where you are in the background, you're really supporting people trying to keep them safe?
That's a great question. I sort of stumbled into this industry when I first got into tech. The reason I always share how that happened is the interviewing process: I was looking to get into an operations role or a sales role when I was first starting my career. The story I would tell in interviews, when people asked, "What was a really meaningful experience you had?", goes back to my days of working retail, just being behind the counter at a store in the mall in Florida. A family came in with an issue, and I just happened to know exactly what technical problem they were having and was able to walk them through what to do. But it required me to actually go to my manager, get off shift, and say, "Hey, this person's having this really specific issue. I can run across the hall and go help them with it. Is it okay if I clock out for five minutes and take care of that?" And they were like, "Yes, okay. As a one-off, we'll let you do it."
I think that kind of built the basis for me that ultimately Trust & Safety is very similar to being in the service industry. We are, at the end of the day, the people you go to first when something bad happens to you on the site, for whatever reason. But ideally, as you said Brian, you don't have to encounter us if nothing has gone wrong, right? Ideally, everything goes great and you never get into this space. But the thing I'll zoom in on specifically, for why Patreon and why Trust & Safety online motivates me: the number one thing is hands down the reach of these problems and the difficulty of tackling them in a way that balances care with the need for a timely resolution and transparency around the problem being solved.
I think if you just look at a lot of the narrative that gets shared around social media platforms in this day and age, there's a lot of blowback against censorship in the sense of you can't say anything online anymore and these very negative narratives. But I think that that is what we intentionally try and avoid at Patreon and what I think any good Trust & Safety team is always looking to solve through a combination of high quality product experiences, best in class policies and guidelines around what your community permits and doesn't permit, and being very transparent and open about those and having timely enforcement and moderation and making sure that when something bad does happen, that you respond to it quickly and ideally you're preventing it from happening in the first place.
You were starting to get into this a little bit. When there is content that is flagged as potentially violating Patreon's policies, what is our philosophy? What's our approach for how Patreon reacts?
Yeah, great question. So Jack has spoken about this previously in many of his interviews. I think that if you were to sum up Patreon's position on how it responds to questions of moderation, or questions that touch any part of the Trust & Safety ecosystem, it really is rooted in his experience as a creator on the internet over the last 15 to 20 years.
The one case that I know he goes back to again and again, and that lives with me, is the experience of posting something to a social media platform, coming back a few hours later, and finding that the content has been removed for reasons that are often unclear, with no autonomy or agency in that decision-making process to easily appeal or prevent the decision. Often it was a mistake, or there was a minor nuance involved in why the action was taken that could have been resolved very quickly if some notice had been given to him directly. So when we think about our philosophy and our principles for how we build Trust & Safety products at Patreon, it's about living the values of the company and making sure that creators feel like they have control over their content as much as possible. 100% of the time would be our target.
Separate from that, it goes to our values as a company: putting creators first and this concept of winning together. We want this to feel like a partnership. We are not the police; we're not slapping your hands, and you should not feel like you have done something bad and are therefore being punished. That is the exact opposite of what we want these experiences to be. It should not be a punishment. It should be a moment for education on both sides and an opportunity to build a stronger relationship as a group, by encouraging creators to go in, look at our guidelines, and where they have questions, ask them. Be proactive in asking our policy team and our moderation team, "Is this content that I want to share going to raise any problems for you all?" And if so, let's work through that together.
So part of why we wanted to bring you onto the show now is that Patreon has actually been working on a lot of new features related to content moderation and how we work with our creators. Over the last six to 12 months, the way we approach it has come to look a lot different than it did before. So I was hoping you could walk creators through some of those changes and the reasons for taking that kind of approach.
Yeah. So I'll zoom out and just talk about what moderation looked like on Patreon prior to last November. That was a world in which we effectively had only two real tools at our disposal. We had a process we refer to internally as the reform process, where, as I mentioned, this is a partnership. We reach out to the creator and let them know that we have found an issue with a post, a part of their campaign, or a benefit they're offering that goes against either our community guidelines or our benefit guidelines. We start that conversation with them to address the issue and resolve it in partnership with them. The goal is to make sure that their campaign and their presence on Patreon are not disrupted in any visible way for their community, as often as possible. That reform process, with very intense email-based communication, was basically the bulk of the moderation we were doing.
The other option we had was to suspend a campaign. When we say suspend, what we mean is we would take that campaign offline so that your fans wouldn't be able to see it, work with the creator on the backend to resolve those issues, and then restore the campaign's visibility once they were resolved. You can think of it this way: we would start a reform in situations where there was a lightweight issue that could be tackled pretty quickly, or where we weren't concerned about content remaining visible on the site, and we would use the more aggressive option of suspending the campaign if either the creator was non-responsive to our outreach or the issue we found was more graphic in nature and we had concerns about it remaining on the site.
Now, with post-level suspension, we've basically taken the ideas of the suspended state and the reform state and merged them into a product experience that we could build natively into Patreon as an app and as a website. That experience now operates at the level of the posts you make on the site, rather than affecting your entire account. So if there was one post you made and we have a small thing we would like you to change about it, that post can stay visible. There's now a flag added to it, so you know the post requires your attention. You understand which policies or guidelines that post does not meet and what you will need to change, and then you have the option to contact and work with us directly around those changes, while we're still preserving, again, the visibility of that content as often as we can. We're also not taking any sort of larger action against your campaign to deal with what is a much more isolated issue.
So that was a huge step for us as a company: being able to go from acting on literally your entire presence on Patreon to, "No, just this one specific thing that you posted is what we have a concern about." That adds a lot more context and clarity for creators, but it also makes it feel less scary to be told that there is something that requires your attention and review when you're contacted by our moderation team.
Now, I imagine there are some creators listening to this and thinking, "I'm never going to post anything that could fall afoul of that, or never would be involved." What's your message to a creator who thinks or knows they're not going to be involved with that, and why is this still relevant for them on Patreon?
Yeah, first, that's a great point. I think it's really important to acknowledge that the majority of creators have never and will never encounter a moderation action on Patreon. And that's fantastic. The more we can say that, the happier I think we all are. But at the same time, we have to be willing to acknowledge that there are some people who may attempt to set up a presence on Patreon and aren't as familiar with our policies, or just need some minor steering here and there in order to stay within the guidelines we have for what content and what benefits can be offered via Patreon.
That said, for anybody who feels that this doesn't or likely won't apply to them: if we look at the data on how moderation occurs on Patreon, for many of the creators we engage with in content moderation or put into a reform process, it's their first time going through that process. They don't have a history or a pattern of hitting a guideline or posting something that didn't quite comply with what Patreon allows. The reason for that is that typically this is not intentional. And after we've reached out and gone through this process with somebody the first time, they're not likely to repeat the same mistake, right? I call it a mistake because it specifically comes from a lack of knowledge or understanding of exactly what we allow or don't allow, and there was no harm intended behind what the creator had done.
So generally, the majority of people are probably not going to encounter this, and those who do are likely not to encounter it multiple times. But as a starting principle, what we want to make sure is that, should you see this experience, you are not immediately surprised, frustrated, or angry that you are getting a notice that one of your posts has been flagged.
We've solved for that in several ways here: by ensuring that you get timely email communication and in-product notifications (if you use our mobile apps, you'll get a push notification), by preserving the visibility of that content as much as possible, and by giving you a contact option directly on your post to reach out and say, "Hey, I don't understand," or "I think this may have been a mistake," or "I'm thinking I can make the following change. Would that be good enough to address the issue?" And that's really, again, tied to the core principles of what this product is doing.
The last thing I'll say on this is that it's in Patreon's future to be using the post-level suspension product a lot more. We really want to move away from threatening a creator's entire page with the previous suspension and reform methods we had. We really want this to be the starting point for any interaction you have with our moderation team. It should be based on a single issue, not zoomed out into a combination of multiple things that may require an extensive back-and-forth or be really confusing to work through. So we're trying to make sure that when we're engaging with creators, we're starting from the smaller building blocks: where can we provide guidance, give them nudges and instruction, and encourage them to read our policies more closely and reach out to us proactively, in some cases even before they post, rather than go through something that feels, again, more punitive and more like a disruption to your day-in, day-out usage of Patreon.
I think that might be something surprising: interactions with content moderation and Trust & Safety aren't necessarily about a major violation or a major issue. In many cases, they can actually be quite simple and really easy to address. What are the kinds of simpler, easy-to-address situations that often come up?
Yeah. So, case in point, one of the most common issues we see on Patreon is that a creator has posted something and that post is made public to all audiences. So you don't actually have to be a paying fan, a pledged patron to that creator's page, in order to see that content. We have some basic rules around certain types of content, especially if it contains things like nudity in an artistic capacity or otherwise may not be appropriate for all audiences, where we ask you to make sure that content is restricted to a patron-only visibility level.
So one of the main things we're working on with this product right now is supporting a larger number of use cases for quickly notifying people: "Hey, this post is actually fine. We're okay with the content you're sharing. We just ask that you only make it available to your fans who are paying for access, because we have limitations on what we allow to be fully public on the site." That's a really common, easy thing that takes a couple of seconds to resolve. All you have to do is change the visibility of the post and you're done. That's the kind of process we're really hoping to use this product for more and more, where previously it would've required back-and-forth email with the creator to resolve the exact same issue.
That's a really interesting aspect. Because of the paid membership, it's a more intimate community of folks who are opting in and really committed to it. So let's talk a little bit about the differences between Patreon's approach and that of other content platforms and social platforms. What's your sense of, or what could we say about, how other companies are approaching the same kind of idea of moderation?
Yeah, there's a variety of issues that go into why content moderation can feel like such a poor experience on other platforms. I'll try to work through them piece by piece. The first, I would say, starts with the policies themselves: what are the community guidelines of that platform? Patreon is creator-first and artist-led, and that enables us to have policies that lean into creative and artistic freedom as much as we really can. There are, of course, certain things we can't allow on the site, either because legally, around the world, those things are not permitted, or because they fall outside what we feel has artistic merit or is artistically sound, which is something that Jack and our policy team work together to define. We've created a very broad and liberal set of guidelines for what we allow, but there has to be a red line somewhere on what you, as a business, say you are not comfortable permitting your product to be used for.
And so if we look at Patreon versus other platforms, we really probably have some of the most generous guidelines in terms of what types of content you can share on Patreon, and we're really proud of that. I think that's a key starting point.
Why that is the case I'll come back to in a second, but the next detail actually comes from the operations side. Operationally, most businesses view moderation as a cost center. It's a place where they want to put the fewest dollars possible while maintaining a relatively low risk profile for their business, so that, one, they are in fact preventing most people using the product from having a harmful experience, and two, the company itself is protected from any sort of risk, which could come from anyone from the app store holders at Google and Apple, to government regulation, to what have you. It could also lead to outright community revolt. If you're not doing a good enough job in this space, people who use your product are going to get up and move. They're going to say, "It's not fun being here, and we're going to go elsewhere."
So the fact that most businesses treat this as a cost center means that you are typically solving moderation issues in terms of scale: how quickly you can process the number of reports you're receiving and the number of things you're proactively finding on the site, and work to take them down as fast as possible. That's why, from a product standpoint, these experiences tend to be deletion-first. Anybody who wants to challenge that decision has to go through a lengthy appeals process, which can often take days or weeks to get a response and may not actually result in the decision that you feel is fair, or that most people would agree is correct. So those are two key factors.
But bringing it back to why these policies and this operational approach exist, there are also the dynamics of the platform itself. In the case of most social media platforms, there are actually four players: the company that runs the platform, the creator of the content, the audience for that content, and the advertisers who are putting ads near that content.
And so, one of the reasons why Patreon is able to have a much more generous and liberal set of community guidelines that are creator-first is because at the end of the day we are not factoring in the concerns of the advertiser as these other platforms have to based on their business model. We can say, "People are coming to Patreon to give you money specifically for this content. And as long as we are okay with the type of content that you are creating, what your creations look like, what they entail, that is okay with us. We can invest more in these higher quality moderation experiences because we want you to continue to find value in our platform. And we also know that that's the primary reason why you are here in the first place."
"The fact that we can focus on just creators and fans... That's a tremendous game changer for us."
And so I think that when you simplify the dynamics of who your audience is at the end of the day, thinking about it from the business's standpoint, the fact that we can focus on just creators and fans, and we don't have to worry about a third party coming in and saying, "We really don't want this content to show up near anything that we're posting or advertising," that's a tremendous game changer for us.
Yeah, I love that way to wrap it up and connect back to creators and their paid memberships with their audience. Some other things we covered: we want our moderation approach to be about education and dialogue. We're trying to be as precise as possible, so we don't have to take broader action that would be more disruptive to creators. And many of the interactions between our team and creators are actually very simple resolutions, like just changing the visibility of a post to patron-only. So Chris, thanks so much for taking this complicated, sensitive, scary topic and helping more creators understand it on Backstage with Patreon.
Thanks for having me, Brian. It was a pleasure.
Tune in next week to Backstage with Patreon for an interview with one half of the Some Work, All Play podcast about competitive running, David Roche. He co-hosts the show with his wife Megan, so we get into the benefits and challenges of the podcast as a family endeavor, as well as how to combine membership and creating with a coaching practice.
To catch every episode of Backstage with Patreon, follow or subscribe in your podcast app and leave us a review. We also have transcripts available at patreon.com/backstage. You're growing as a creator by listening to the show, so why not share the insights from this episode with another creator on Patreon or who is running a creative business? We'd love to have you as an active collaborator with Backstage with Patreon. Come join the discussion in the Patreon Creator Discord. Follow the link in the episode notes and you can get answers to your follow-up questions directly from the guests and weigh in on what topics we'll be covering next. Editing by Tyler Morrisette. I'm Brian Keller. See you next time, Backstage.