Data Quality Analyst
// ROLE SUMMARY
You will audit completed annotations produced by other team members, checking each one against the project rubric and flagging errors. This is second-pass work: someone else has already labeled the data, and your job is to verify that the labels are correct, consistent, and complete.
// DESCRIPTION
On a typical day you will review 200-400 items, writing a short justification for every rejection. The work requires sharp attention and the ability to hold a full annotation schema in your head while scanning at speed.
Ideal reviewers combine speed with precision. You need to internalize the annotation guidelines deeply enough to spot subtle deviations, not just obvious errors. Strong written communication is essential: your rejection notes must be clear enough that annotators can fix issues without a back-and-forth conversation.
Reviewers work on the same weekly cadence as annotators but with a one-day offset -- you start reviewing Monday batches on Tuesday. Expect 20-30 hours per week. Weekly calibration sessions keep the review team aligned, and you will have a dedicated Slack channel for real-time questions.
// SKILLS & REQUIREMENTS
// FREQUENTLY ASKED QUESTIONS
// READY TO GET STARTED?
Apply in minutes
Create your profile, select your areas of expertise, and start working on frontier AI projects.
Apply Now