Senior Quality Reviewer
// DESCRIPTION
Quality review sits between the annotation team and the client. You will sample completed batches, apply the scoring rubric, and produce accuracy reports that the project lead uses to give feedback to annotators. When you spot systematic errors -- a whole batch misapplying a label, for example -- you escalate to the team lead so the guidelines can be updated. The goal is to keep delivered data above a 95% accuracy threshold across all active projects.
You should have a track record of careful, systematic work. Backgrounds in copy editing, test engineering, clinical data review, or research assistance translate well. We will train you on our specific tools and rubrics, but we cannot teach the underlying mindset: you either notice when something is slightly off, or you do not.
Reviewers work on the same weekly cadence as annotators but with a one-day offset -- you start reviewing Monday batches on Tuesday. Expect 20-30 hours per week. Weekly calibration sessions keep the review team aligned, and you will have a dedicated Slack channel for real-time questions.
// READY TO GET STARTED?
Apply in minutes
Create your profile, select your areas of expertise, and start working on frontier AI projects.
Apply Now