Building a Safe Passage: The Passes Trust & Safety Approach

At Passes, maintaining a safe, respectful, and inclusive environment for our creators and fans is central to our mission. To maximize transparency with our community, our Trust & Safety team will publish a series of posts on how we are building an industry-leading platform focused on protecting creators and fans. 

This introductory post offers transparent context about our safety practices relative to the broader industry landscape, along with a clear and comprehensive overview of our moderation framework. Stay tuned for more of these posts in 2025 - a year of exponential growth for the Passes community! 

Our Philosophy

Passes’ Trust & Safety philosophy is grounded in three main objectives:

  1. Be the safest and most secure creator and fan platform on the planet. 
  2. Deploy best-in-class moderation and fraud prevention, detection, and elimination programs.
  3. Implement iron-clad and crystal clear policies that promote compliance and fairness within the Passes community.

Knowing Creators and Fans

With safety top of mind, we have instituted security checkpoints at various stages of our platform to ensure we never sacrifice safety for growth. All of our creators are authenticated through a rigorous identity verification and online background check process, consisting of three layers:

  1. A thorough identity verification check.
  2. A manual social media platform presence review: our team reviews an applicant’s social media platforms, evaluating the authenticity of the creator.
  3. A comprehensive online presence review: this mitigates potential risks and ToS violations before a creator is granted access to the platform. We also scan for publicly available age information, potential involvement in fraudulent or criminal activity, and press coverage related to the applicant.

Only when an applicant successfully passes all three layers of review are they granted access to a creator account. 
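
The all-three-layers gate described above can be sketched as a simple check. This is an illustrative sketch only; the layer names and the `ReviewResult` type are assumptions for this example, not Passes' internal code.

```python
from dataclasses import dataclass

@dataclass
class ReviewResult:
    layer: str   # e.g. "identity_verification" (illustrative layer name)
    passed: bool

# The three review layers an applicant must clear (hypothetical identifiers).
REQUIRED_LAYERS = {
    "identity_verification",
    "social_media_review",
    "online_presence_review",
}

def grant_creator_account(results: list[ReviewResult]) -> bool:
    """Grant a creator account only when every required layer has passed."""
    passed_layers = {r.layer for r in results if r.passed}
    return REQUIRED_LAYERS.issubset(passed_layers)
```

A single failed or missing layer is enough to deny access, mirroring the policy that all three reviews must succeed.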

Rinse and Repeat: Our Ongoing Compliance Checks

Once a creator becomes fully active, our compliance checks continue throughout their lifetime on the platform. Through our KYC provider, we run monthly Politically Exposed Persons (PEP) and Watchlist Reports (financial, crime, and international) and update their KYC information when a match appears. 
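
The recurring screening cycle might look roughly like the sketch below. `run_watchlist_report` and `update_kyc_record` are illustrative stand-ins for calls to a KYC provider, not a real vendor API.

```python
def monthly_screening(creators, run_watchlist_report, update_kyc_record):
    """Re-screen every active creator; refresh KYC data whenever a match appears.

    `run_watchlist_report` and `update_kyc_record` are hypothetical hooks into
    a KYC provider, used here only to illustrate the flow.
    """
    matched = []
    for creator in creators:
        # Run the monthly PEP and watchlist checks for this creator.
        report = run_watchlist_report(
            creator, checks=("PEP", "financial", "crime", "international")
        )
        if report["matches"]:
            # A match triggers an update of the creator's KYC information.
            update_kyc_record(creator, report)
            matched.append(creator)
    return matched
```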

We maintain a community-wide review of compliance requirements across all creators, which includes the following:

  • Report-initiated compliance checks: our team of safety and support associates conducts ad-hoc, comprehensive investigations into user reports of non-compliance by Passes fans and creators. They validate the veracity of these reports and swiftly remediate any non-compliance among the involved parties. 
  • Weekly policy reviews: we conduct weekly evaluation sessions focused on strengthening, clarifying, and updating our community guidelines. These sessions include consulting with counsel and internal teams, communicating policy changes to creators and fans in advance, deploying operational resources to enact them, and monitoring compliance with these policies.
  • Bi-annual compliance review: the focus is on ensuring creators have signed the latest versions of the ToS agreement.

All In Moderation: How We Review Content At Passes

Passes has strict policies prohibiting explicit content on the platform. As a first layer of review, our content moderation program proactively scans all content posted by creators using our automated moderation tools (detailed below).

Our Trust & Safety team, the second layer of moderation, reviews flagged content according to our classification standards. They evaluate its egregiousness against our guidelines and decide whether to ban the content or reclassify it as compliant so fans may continue to access and purchase it from creators. 

Additionally, we encourage creators and fans to report any content they believe violates our content guidelines. Reported content is immediately escalated and hidden by our system, preventing fans from accessing it until it has been fully reviewed for re-classification. 

Below is an overview of our content moderation review process:

Table B: Step-by-step process of how content is processed in the Passes platform
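
The two-layer flow, plus the user-report path, can be sketched as follows. `auto_scan` and `human_review` are stand-ins for the automated tools and the Trust & Safety team; the state names are assumptions for illustration, not Passes' actual implementation.

```python
def moderate_upload(content, auto_scan, human_review):
    """Layer 1: automated scan; layer 2: human review of flagged content.

    `auto_scan` returns True when content is flagged; `human_review` returns
    "ban" or "compliant". Both are hypothetical hooks for this sketch.
    """
    if not auto_scan(content):
        return "live"          # Not flagged: content is posted normally.
    # Flagged content is withheld until the Trust & Safety team reviews it.
    decision = human_review(content)
    return "banned" if decision == "ban" else "live"

def handle_report(content_state):
    """A user report immediately hides live content pending a full re-review."""
    return "hidden_pending_review" if content_state == "live" else content_state
```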

Our Best-in-class Automated Moderation Tools

To prevent undesirable content on the site and to protect our users, we utilize four tools:

  1. Amazon Rekognition Content Moderation: automatically runs on all images and videos to determine if they contain adult content. Flagged content cannot be posted or viewed.
  2. Hive Moderation: automatically runs on all audio files to determine if they contain adult content. The same policy that applies to images and videos applies to audio as well.
  3. Microsoft PhotoDNA: automatically runs on all images to detect CSAM (Child Sexual Abuse Material). All positive matches are manually reviewed, and true positives (none have occurred to date) are reported to the NCMEC (National Center for Missing & Exploited Children).
  4. Thorn: we use Thorn to detect CSAM across all content on the platform.
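
As a minimal sketch of how the first-layer decision logic can work, the function below filters the response shape returned by Amazon Rekognition's `DetectModerationLabels` API. In production the response would come from `boto3.client("rekognition").detect_moderation_labels(...)`; here we show only the thresholding step, and the sample payload is illustrative.

```python
def is_blocked(response: dict, min_confidence: float = 80.0) -> bool:
    """Return True if any Rekognition moderation label meets the threshold.

    `response` follows the documented DetectModerationLabels response shape:
    a "ModerationLabels" list of {"Name", "ParentName", "Confidence"} dicts.
    """
    return any(
        label["Confidence"] >= min_confidence
        for label in response.get("ModerationLabels", [])
    )

# Illustrative response payload (not real moderation output).
sample = {
    "ModerationLabels": [
        {"Name": "Explicit Nudity", "ParentName": "", "Confidence": 97.2},
    ]
}
```

Content whose response clears the threshold would be withheld from posting and routed to the human review queue.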

Committing Ourselves to Continuous Growth In 2025 And Beyond 

Going into 2025, we will continue to uphold our values to the Passes community as we elevate the standard of what safety means for the creator industry:

  1. Moderation Review Guidelines: as our content categories grow, we will continue to optimize moderation review standards for each category. This will promote a fair and applicable moderation policy across different types of content. 
  2. Audit program: we will regularly audit our quality assurance process to ensure it remains consistent with our rubric guidelines. This will help maintain the accuracy of standardized reviews across all teams moderating our content. 
  3. Category / Label / Rating Calibration: as content continues to evolve and expand within Passes, we will set and launch new categories that require calibrated moderation reviews and content ratings, which will differ across content categories and geo-locations.
  4. Self-service mechanisms: we will continue to evolve our self-service features within the platform intended to help a creator or fan resolve their trust & safety incidents. This will allow for a more seamless experience while also granting an easily scalable approach as we expand globally.  
  5. Broader reporting access for urgent incidents: as we expand our support channel availability, we will enable new ways for fans and creators to flag a trust & safety issue and get round-the-clock help in the event of emergencies. 

Safety Is Essential To The Success Of Our Platform

Earning, maintaining, and respecting the trust of creators and fans is of the utmost importance to us, which is why we take the safety and well-being of our community so seriously. The North Star for our team is to be the safest, most secure creator and fan platform in the world. We are proud of the work that has been done to set the standard of safety and security within the industry, and remain committed to continuously adapting to and collaborating with our community to uphold and elevate these standards as the industry evolves. 

If you have questions on our trust & safety policies, feel free to reach out to trust@passes.com.

- Passes Trust & Safety Team