
Annotation Audit Specialist

Quality Review · $35-50/hr · Remote · Posted January 20, 2026

// DESCRIPTION

Quality review sits between the annotation team and the client. You will sample completed batches, apply the scoring rubric, and produce accuracy reports that the project lead uses to give feedback to annotators. When you spot systematic errors -- a whole batch misapplying a label, for example -- you escalate to the team lead so the guidelines can be updated. The goal is to keep delivered data above a 95% accuracy threshold across all active projects.
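To make the sampling-and-scoring loop concrete, here is a minimal sketch of what a batch audit could look like in Python. The `rubric_check` callable, the sample size, and the 95% threshold default are illustrative assumptions, not the project's actual tooling.

```python
import random


def audit_batch(items, rubric_check, sample_size=50, threshold=0.95, seed=0):
    """Estimate a batch's accuracy from a random sample.

    `items` is a list of completed annotations; `rubric_check(item)` is a
    hypothetical callable returning True when the annotation passes the
    scoring rubric. Returns (estimated accuracy, meets-threshold flag).
    """
    rng = random.Random(seed)  # fixed seed so an audit is reproducible
    sample = rng.sample(items, min(sample_size, len(items)))
    passed = sum(1 for item in sample if rubric_check(item))
    accuracy = passed / len(sample)
    return accuracy, accuracy >= threshold
```

A batch whose sampled accuracy falls below the threshold would then be flagged in the accuracy report rather than delivered.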

Prior experience in QA, editing, proofreading, or annotation review is strongly preferred. You need to be comfortable giving direct, constructive feedback -- annotators see your rejection notes, so clarity matters. Familiarity with inter-annotator agreement metrics (Cohen's kappa, Krippendorff's alpha) is a plus. Most important is an eye for detail and the patience to review item after item without letting accuracy slip.
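For readers unfamiliar with the agreement metrics mentioned above: Cohen's kappa compares the observed agreement between two annotators against the agreement expected by chance given each annotator's label distribution. A minimal sketch (label names and inputs are illustrative):

```python
from collections import Counter


def cohens_kappa(a, b):
    """Cohen's kappa for two annotators labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is chance agreement from each annotator's label frequencies.
    """
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    counts_a, counts_b = Counter(a), Counter(b)
    expected = sum(counts_a[label] * counts_b[label]
                   for label in set(counts_a) | set(counts_b)) / (n * n)
    return (observed - expected) / (1 - expected)
```

A kappa of 1.0 means perfect agreement; values near 0 mean the annotators agree no more often than chance, which usually signals an ambiguous guideline rather than careless work.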

This is remote, asynchronous work. You set your own schedule within project timelines. We pay hourly, with a performance bonus tied to your own review accuracy (audited by a third reviewer on a random sample). Typical reviewers work 15-25 hours per week.

// SKILLS & REQUIREMENTS

- Ability to internalize complex rubrics quickly
- Clear, constructive written feedback
- Comfortable providing direct feedback to peers
- Experience with quality assurance tooling
- Systematic approach to error documentation
- Background in copy editing, research, or data analysis

// READY TO GET STARTED?

Apply in minutes

Create your profile, select your areas of expertise, and start working on frontier AI projects.

Apply Now