Game Testing Annotator
// ROLE SUMMARY
Work through scripted scenarios in various game engines, logging your actions and annotating key decision points, to produce human gameplay data for training AI agents in interactive 3D environments.
// DESCRIPTION
We are collecting human gameplay data to train AI agents that operate in interactive 3D environments. You will work through scripted scenarios in various game engines, logging your actions and annotating key decision points. Some sessions require you to play optimally; others ask you to intentionally explore suboptimal strategies so the AI can learn from a broader range of behaviors. Each session produces a replay file and a structured annotation log.
// SKILLS & REQUIREMENTS
We are looking for people who can play systematically and document their reasoning. Gaming experience matters, but so does the ability to follow protocols and produce clean annotation logs. Backgrounds in game QA, speedrunning, or competitive gaming translate well.
Sessions run 2-4 hours and are scheduled in advance. Most contributors complete 3-4 sessions per week. You will need a PC that meets the minimum specs for the game engines we use (we will provide these during onboarding) and a stable internet connection for real-time data upload.
// READY TO GET STARTED?
Apply in minutes
Create your profile, select your areas of expertise, and start working on frontier AI projects.
Apply Now