Every day, tens of millions of people from around the world come to Roblox to play, learn, work, and socialize in immersive digital experiences created by the community. Our vision is to build a platform that enables shared experiences among billions of users. This is what’s known as the metaverse: a persistent space where anyone can do just about anything they can imagine, from anywhere in the world and on any device. Join us and you’ll usher in a new category of human interaction while solving exceptional challenges that you won’t find anywhere else.
As a Program Manager on our Trust & Safety team, you'll create a moderation incubation team to handle new moderation service requests for our product launches. You will work with Product Management, Engineering, Data Science, and Design to optimize safety experiences, with an emphasis on implementing scalable moderation best practices and workflows and on introducing and fine-tuning automation efforts. You will set the standard for moderation design principles on behalf of the entire moderation department. You're passionate about structure, organization, and well-run processes.
In addition to overseeing the technical requirements and operationalization of new products, you will also improve the internal moderation tooling used by more than 2,000 agents around the world.
You will report to the Senior Manager of New Moderation Initiatives.
You Will:
You’ll Love: