December 12, 2022

Inside Snap’s Approach to Brand Safety

Last week, Snap launched our latest Transparency Report on our new Privacy and Safety Hub. Alongside it, we want to share more about our long-standing commitment to providing a healthy and brand-safe experience for our community and advertising partners.

Safety by Design

Snapchat was intentionally designed to be different from traditional social media: to help people express themselves with friends without the pressure to grow a following or compete for likes, and to strengthen their relationships with their friends, family, and the world.

That’s why Snapchat opens to the camera rather than to a feed of unmoderated content. Across Snapchat, we have built safety into the fundamental architecture of our platform, and we don’t allow unmoderated content to reach a large audience. As a result, Snapchat ranks as one of the friendliest social environments compared with other platforms.1

Instead of developing an open news feed that optimizes for unchecked, sensational, or toxic content, we built Discover, a content platform for vetted publishers and creators with proven audiences. Like a television network, our editorial teams choose which publishers and shows run on Discover. Snapchatters always know where content on Discover comes from, and it is not interspersed with posts from friends. We feature high-profile, respected partners like Disney, NBCU, ViacomCBS, The Washington Post, Bloomberg, the NFL, and the NBA; mainstream celebrities like Meghan Trainor, Jason Derulo, and Jack Harlow; and popular creators like David Dobrik.

To participate on Discover, our editorial partners must comply with our content guidelines, which prohibit the spread of misinformation, hate speech, conspiracy theories, violence, and many other categories of harmful content.

Machine Learning + Human Review = Better Content Moderation

We have always taken a forward-leaning approach to content moderation and policy enforcement. We combine machine learning with a dedicated team of real people to detect and review potentially inappropriate content in public posts.

For example, on Spotlight, where creators can submit creative and entertaining videos to share with the broader Snapchat community, all content is first reviewed automatically by artificial intelligence before it receives any distribution. Once a piece of content gains more viewership, it is then reviewed by human moderators before it can reach a large audience. This tiered approach to moderation on Spotlight reduces the risk of spreading misinformation, hate speech, or other potentially harmful content.
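To make the tiered flow above concrete, here is a minimal sketch of a two-stage moderation pipeline. Everything in it, from the class and function names to the viewership threshold, is hypothetical and for illustration only; it is not Snap’s actual implementation.

```python
# Hypothetical sketch of a two-stage moderation pipeline like the one
# described above; names and thresholds are illustrative, not Snap's system.
from dataclasses import dataclass

HUMAN_REVIEW_THRESHOLD = 1_000  # hypothetical view count that triggers human review


@dataclass
class Post:
    post_id: str
    views: int = 0
    passed_auto_review: bool = False
    human_review_done: bool = False
    passed_human_review: bool = False


def classify_is_safe(post: Post) -> bool:
    # Stub standing in for an ML classifier that would score the video
    # for policy violations (misinformation, hate speech, etc.).
    return True


def auto_review(post: Post) -> bool:
    """Stage 1: every submission is screened automatically before it
    receives any distribution at all."""
    post.passed_auto_review = classify_is_safe(post)
    return post.passed_auto_review


def may_reach_large_audience(post: Post) -> bool:
    """Stage 2: once a post gains viewership, it must also clear human
    review before it is allowed to reach a large audience."""
    if not post.passed_auto_review:
        return False  # never distributed in the first place
    if post.views < HUMAN_REVIEW_THRESHOLD:
        return False  # still in limited distribution; no human review yet
    return post.human_review_done and post.passed_human_review
```

The design point the sketch captures is that automated screening gates all distribution, while the more expensive human review is reserved for content that is about to scale.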

We also make it easy for our community to report inappropriate behavior and content through in-app reporting or our website.

Clear Guidelines, Consistent Enforcement

We avoid “legalese” and take great care to write our policies, terms, and guidelines so that they are easy to understand. We believe everyone should play by the same rules on Snapchat, which is why our Community Guidelines apply equally to all, whether the account holder is a political figure, a publisher, an advertiser, or a Snapchatter. These guidelines prohibit harmful or offensive content and underscore a mutual commitment between Snap, Snapchatters, and our partners to protect the communities we serve.

Brand Safety Tools

We provide an array of product features and tools to help ensure advertisers have full control over how and where their brand appears on Snapchat. For example, we offer:

  • Content Adjacency Tools. A core concern for our partners is ensuring that their ads appear next to content appropriate for their brand. For this reason, advertisers can control where their ads run on Snapchat: in curated Discover content from publishers, in select publisher content, or across all of Snapchat. (A simplified sketch of these controls follows this list.)

  • Full-Screen Ads. Video ads on Snapchat are full-screen, so they don’t share the screen with any other content. Before running a campaign, advertisers may explicitly exclude full-screen ad inventory from Public accounts that have been deemed brand-unfriendly.

  • Optimization. We want the ads we show Snapchatters to be fun, interesting, and relevant to them. To help accomplish that goal, we allow advertisers to show ads based on information related to Snapchatters’ interests. If Snapchatters prefer not to have ads shown to them based on this information, they can easily update their preferences in the app’s settings under Ad Preferences.

  • Protected Classes. Snap has never provided the ability to target ads by race, ethnicity, or "ethnic affinity." Our Snapchat Lifestyle Categories are carefully curated and avoid tracking protected classes such as race, religion, health status, or sexual identity, and we avoid proxies for ethnicity such as "ethnic affinity" categories.

  • Age Gating. Snapchat is a central communications tool for young people, and we know parents and caregivers want additional ways to help keep their teens safe. Our publisher guidelines require content to be appropriate for a 13+ audience; if it isn’t, publishers must age-gate it to an 18+ audience using our internal tools. We also restrict ads based on the Snapchatter’s age, and certain ad categories may not be advertised to those under 18.

  • Fraud Protection. We have zero tolerance for fraudulent ad activity on Snapchat. In conjunction with our user-account fraud-protection policies and self-reporting tools, we review ads for compliance with our policies before approving them to go live. If we learn of advertisers who violate our terms, we act immediately to remove their ads and take any other appropriate action. Read more about our methods for preventing and protecting against ad fraud in the Infringing Content section of our Advertising Policies.
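As a rough illustration of the placement and age controls described above (see the Content Adjacency and Age Gating items), here is a hypothetical campaign-configuration sketch. The field names and the `can_serve` check are invented for this post and do not reflect Snap’s actual Ads API.

```python
# Hypothetical sketch of campaign-level brand-safety controls; the field
# names are invented and do not reflect Snap's actual Ads API.
from dataclasses import dataclass, field
from enum import Enum


class Placement(Enum):
    CURATED_DISCOVER = "curated_discover"    # curated publisher content only
    SELECT_PUBLISHERS = "select_publishers"  # an advertiser-chosen subset
    ALL_OF_SNAPCHAT = "all_of_snapchat"      # broadest available inventory


@dataclass
class CampaignConfig:
    placement: Placement
    excluded_public_accounts: set = field(default_factory=set)  # brand-unfriendly accounts
    min_audience_age: int = 13  # raise to 18 for restricted ad categories


def can_serve(config: CampaignConfig, account_id: str, viewer_age: int) -> bool:
    """Check one potential impression against the campaign's controls."""
    if account_id in config.excluded_public_accounts:
        return False  # advertiser excluded this account's inventory
    return viewer_age >= config.min_audience_age


# Example: an age-restricted campaign limited to curated Discover content.
config = CampaignConfig(Placement.CURATED_DISCOVER, min_audience_age=18)
assert not can_serve(config, "creator_123", viewer_age=16)  # blocked: under 18
assert can_serve(config, "creator_123", viewer_age=21)      # allowed
```

In a real system the placement choice would constrain which inventory the ad is even eligible for; here it is carried as plain data only to keep the sketch short.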

In addition to these tools for advertisers and Snapchatters, Snap has internal tools and practices to protect our community and give them control over their content and data, including spacing out sensitive content and letting Snapchatters report or opt out of content they prefer not to see. As people use the app, the Discover feed also adjusts to reflect each Snapchatter’s interests, serving new programming based on content they’ve previously engaged with.

We are focused on keeping our community safe, which is why we’re always exploring new ways to create a more positive experience for our community, partners, and stakeholders who rely on us to ensure a brand-safe experience.

1 2022 global KR&I research commissioned by Snap Inc. Question: Which of the following are true for you when it comes to [social app]? International Total Sample N = 13,519 daily social/communication app users.
