// ROLE SUMMARY

You will audit completed annotations produced by other team members, checking each one against the project rubric and flagging errors. This is second-pass work: someone else has already labeled the data, and your job is to verify that the labels are correct, consistent, and complete.

Annotation Quality Reviewer

Quality Review | $40-45/hr | Remote (EU) | Posted February 10, 2026

// DESCRIPTION

On a typical day you will review 200-400 items, writing short justifications for every rejection. The work requires sharp attention and the ability to hold a full annotation schema in your head while scanning at speed.

You should have a track record of careful, systematic work. Backgrounds in copy editing, test engineering, clinical data review, or research assistance translate well. We will train you on our specific tools and rubrics, but we cannot teach the underlying mindset: you either notice when something is slightly off, or you do not.

Reviewers work on the same weekly cadence as annotators but with a one-day offset -- you start reviewing Monday batches on Tuesday. Expect 20-30 hours per week. Weekly calibration sessions keep the review team aligned, and you will have a dedicated Slack channel for real-time questions.

// SKILLS & REQUIREMENTS

- Proficient in written English
- Ability to internalize complex rubrics quickly
- Background in copy editing, research, or data analysis
- Comfortable providing direct feedback to peers
- Familiarity with inter-annotator agreement metrics
- Experience with quality assurance tooling
- Clear, constructive written feedback
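The requirements mention familiarity with inter-annotator agreement metrics. As a hedged illustration only (the posting does not specify which metric the team uses), Cohen's kappa is a common choice: it measures how often two annotators agree, corrected for the agreement expected by chance.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labeling the same items."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items where both annotators agree.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: derived from each annotator's label distribution.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(counts_a[label] * counts_b[label] for label in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two annotators labeling the same six items.
a = ["yes", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "no", "yes", "no"]
print(round(cohens_kappa(a, b), 2))  # -> 0.67
```

A kappa near 1.0 indicates near-perfect agreement, while values near 0 indicate agreement no better than chance; calibration sessions like the ones described above are typically aimed at keeping this number high across the review team.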


// READY TO GET STARTED?

Apply in minutes

Create your profile, select your areas of expertise, and start working on frontier AI projects.

Apply Now