Building a Safe Passage: The Passes Trust & Safety Approach
At Passes, maintaining a safe, respectful, and inclusive environment for our creators and fans is central to our mission. To maximize transparency with our community, our Trust & Safety team will publish a series of posts about how we're building toward an industry-leading platform focused on protecting creators and fans.
Our introductory post aims to provide helpful, transparent context about our safety practices relative to the broader industry landscape, along with a clear and comprehensive overview of our moderation framework. Stay tuned for more of these posts in 2025, a year of exponential growth for the Passes community!
Our Philosophy
Passes’ Trust & Safety philosophy is grounded in three main objectives:
- Be the safest and most secure creator and fan platform on the planet.
- Deploy best-in-class moderation and fraud prevention, detection, and elimination programs.
- Implement ironclad, crystal-clear policies that promote compliance and fairness within the Passes community.
To review the progress we’ve made towards these objectives, it’s important to evaluate how Passes stacks up against other major platforms. Below is a benchmarking exercise comparing the Trust & Safety components Passes has implemented to date against those of other major platforms, as they stood at the time of this comparison and based on their published policies:
With respect to the safety of underaged users (under 18), the table above illustrates how Passes compares to social media titans and to competitors in the creator monetization space. We believe our minimum age requirement (15) and KYC process are industry-leading; our requirement for official parental consent is at parity with other major social media platforms and ahead of our direct competition; our restrictions on minors’ access to age-inappropriate content and direct messaging mirror major industry players’ moderation review tools; and we are well ahead of others in our approach to discoverability and in adding an automated moderation layer.
Our Focus on Minor Safety
At Passes, protecting underaged creators is a hallmark of our Trust & Safety program. To access our platform and apply independently to be a Passes creator, one must:
- Be at least 18 years of age.
- Be permitted to use the platform’s services under their country’s applicable laws.
- Not have violated our Terms of Service using a previous account.
However, applicants between the ages of 15 and 17 can still be granted creator access, provided they submit a consent form signed by a parent or legal guardian as part of our Know Your Customer (KYC) process. The parent or legal guardian is bound by our Terms of Service and is responsible for any actions the underage creator may take on the platform. Our identity verification process also sets the standard for authentication, compared to other platforms that apply this level of verification only to paid or business accounts.
Our minimum age requirement of 15 complies with the FTC’s COPPA (Children's Online Privacy Protection Act) regulations and is higher than Meta’s and TikTok’s, which allow users as young as 13. Any actions that violate our policy and/or cause harm to minors are immediately addressed, including reporting to local law enforcement and the NCMEC (National Center for Missing & Exploited Children). For more information about our online protections for teen creators, review section 13 of our Community and Content Guidelines.
Creators who are under 18 years of age have what we consider “strict” censorship automatically toggled on in their settings, preventing them from publishing or accessing any content or messages considered explicit or inappropriate for minors. Our Community and Content Guidelines clearly state that we do not allow any explicit nudity and/or pornography – this is why our automated and manual moderation review mechanisms preemptively and swiftly remove any such content from the platform.
Safeguarding Creators: Trailblazing in Underage Protection
We have set several rules in place to protect the welfare of underage creators (and fans) on the platform (see section 13 of our Community and Content Guidelines). Several of these mechanisms are highlighted below:
Creator Safety Rails:
- Creators who are under 18 years of age can only follow creators that are deemed safe for minors by our algorithm.
- Creators and fans under 18 years of age are excluded from any recommendation algorithm, ensuring they are not surfaced to, or randomly discovered by, other fans and creators.
- Our Community and Content Guidelines limit any underage creator’s access to direct messages and group chats with fans, ensuring their safety and preventing potentially predatory or harmful interactions with fans.
Fan Safety Rails:
- Fans under 18 years of age can only follow creators whose content does not contain anything explicit in nature (as verified by our automated and manual moderation reviews).
- Fans cannot search for creators who are under 18 years of age.
- Fans under 18 years of age are prevented from seeing any content that our moderation team has flagged as unsafe for underage consumers, screened against our content rating policies, and rated as available only to users ages 18 and above.
Knowing Creators and Fans
With safety being top of mind, we have instituted security checkpoints at various stages in our platform to ensure we don’t sacrifice safety for growth. All of our creators are authenticated by a rigorous identity verification and online background check process, elaborated on below:
- A thorough identity verification check process.
- A manual social media platform presence review: our team reviews an applicant’s social media platforms, evaluating the authenticity of the creator.
- A comprehensive online presence review: this mitigates potential risks and ToS violations around allowing a creator to access the platform. We also scan for publicized age information, potential involvement around fraudulent and criminal activity, and press releases related to the applicant.
Only when an applicant successfully passes all three layers of review are they granted access to a creator account.
Rinse and Repeat: Our Ongoing Compliance Checks
Once a creator becomes fully active, our compliance checks continue throughout their lifetime on the platform. Through our KYC provider, we run monthly Politically Exposed Persons (PEP) and Watchlist Reports (financial, crime, and international) and update their KYC information when a match appears.
We maintain a community-wide review of compliance requirements across all creators, which includes the following:
- Report-initiated compliance checks: our team of safety and support associates conducts ad-hoc, comprehensive investigations of user reports of non-compliance by Passes fans and creators. They validate these reports and swiftly remediate any non-compliance among the involved parties.
- Weekly policy reviews: we conduct weekly evaluation sessions focused on strengthening, clarifying, and updating our community guidelines. These sessions include consulting with both counsel and internal teams, communicating policy changes in advance to creators and fans, deploying operational resources to enact them, and monitoring compliance with these policies.
- Bi-annual compliance review: we ensure creators have signed the latest version of the ToS agreement and request updates to all active parental consent forms for creators under 18 years of age.
All In Moderation: How We Review Content At Passes
Passes has strict policies prohibiting explicit content on the platform. Our content moderation program is focused on proactively scanning content posted by creators through our automated content moderation tools (see our automated moderation tools below) as a first layer of moderation review.
Our Trust & Safety team, our second layer of moderation, reviews flagged content according to our classification standards. They evaluate its egregiousness and compliance against our guidelines, then decide whether to ban the content or reclassify it as compliant so fans may continue to access and purchase it from creators.
Additionally, we encourage creators and fans to report any content on the platform that they believe violates our content guidelines. Such reports are immediately escalated, and the content in question is hidden by our system to prevent fans from accessing it until it has been fully reviewed for reclassification.
Below is an overview of our content moderation review process:
Our Best-in-class Automated Moderation Tools
To prevent undesirable content on the site and to protect our users, we utilize three tools:
- Amazon Rekognition Content Moderation: automatically runs on all images and videos to determine if they contain adult content. Any flagged content cannot be posted or viewed.
- Hive Moderation: automatically runs on all audio files to determine if they contain adult content. The same policies that apply to images and videos apply to audio as well.
- Microsoft PhotoDNA: automatically runs on all images to detect CSAM (Child Sexual Abuse Material). All positive matches are manually reviewed, and true positives (none have occurred to date) are reported to the NCMEC (National Center for Missing & Exploited Children).
Committing Ourselves to Continuous Growth In 2025 And Beyond
Going into 2025, we will continue to uphold our values to the Passes community as we elevate the standard of what safety means for the creator industry:
- Moderation Review Guidelines: as our content categories grow, we will continue to optimize moderation review standards per category. This will promote fair and applicable moderation policies across different types of content.
- Audit program: we will regularly audit our quality assurance process to ensure it is consistent with our rubric guidelines. This will help maintain the accuracy of standardized reviews across all teams moderating our content.
- Category / Label / Rating Calibration: as content continues to evolve and expand within Passes, we will set and launch new categories that require calibrated moderation reviews and content ratings, which will differ across content categories and geo-locations.
- Self-service mechanisms: we will continue to evolve our self-service features within the platform intended to help a creator or fan resolve their trust & safety incidents. This will allow for a more seamless experience while also granting an easily scalable approach as we expand globally.
- Broader reporting access for urgent incidents: as we expand our support channel availability, we will enable new ways for fans and creators to flag a trust & safety issue and get round-the-clock help in the event of emergencies.
Safety Is Essential To The Success Of Our Platform
Earning, maintaining, and respecting the trust of creators and fans is of the utmost importance to us, which is why we take the safety and well-being of our community as seriously as we do. The North Star for our team is to be the safest, most secure creator and fan platform in the world. We are proud of the work that has been done to set the standard of safety and security within the industry, and remain committed to continuously adapting to and collaborating with our community to uphold and elevate these standards as the industry evolves.
If you have questions on our trust & safety policies, feel free to reach out to help@passes.com.
- Passes Trust & Safety Team