At TikTok, we strive to be transparent and accountable to our users and to you, the brands, agencies, and partners we work with. As part of our mission to continuously improve our safety efforts, we started the Transparency Forum – an event series where we discuss updates and our tangible progress in making TikTok the safest place for creative self-expression.
Read on for a recap of the important trust and safety issues we've tackled so far.
Our top priority as a platform is the safety and security of our community. For that reason, we're setting a new standard when it comes to data security with our industry-leading approach to data governance, Project Clover. In this Transparency Forum, we discussed the following aspects of our Project Clover strategy:
Data access and controls
Privacy enhancing technologies
Local data storage
External oversight
Over the past several years, we have regularly expanded upon our transparency and accountability efforts, and our commitment to complying with the new Digital Services Act (DSA) reflects the engagement and transparency we have pioneered from the outset. This Transparency Forum event covered how we comply with the DSA, including:
Updates on content moderation and recommendation systems
Changes to targeted advertising for users aged 13-17 and the commercial content library
An overview of how we are meeting the DSA's transparency requirements
A discussion on what this all means for our community and brands
We are deeply committed to protecting the safety of minors on our platform. We've designed our app with safety in mind, taking a 'Safety by Design' approach that ensures we build protections for our users, including teens and their parents. This Transparency Forum featured an in-depth discussion on:
The importance of focusing on minors' needs
How we build for minor safety
Our 'Safety by Design' approach
To stay informed about our upcoming Transparency Forum events and for regular updates on our work to foster a safe environment for our users, sign up for our newsletter.