Annotation Tasks

Feature Overview

Annotation task management is the core workflow module of the IO data platform, providing complete lifecycle management for annotation tasks. Through status grouping, progress tracking, quality control, and team collaboration, it keeps annotation work efficient and orderly.


Main Features

Task Status Management

Status Grouping

The system groups tasks into five statuses: pending start (created but not yet begun), in progress (currently being annotated), pending review (annotation finished, awaiting review), review passed (approved by a reviewer), and data submitted (data handed off for training). This grouping makes the current state of every task immediately clear.

Status Flow

A task moves through the following lifecycle: it enters pending start when created, changes to in progress once annotation begins, moves to pending review when annotation is finished, and becomes review passed if the review succeeds. If the review fails, the task returns to in progress for re-annotation. Finally, once the data is submitted, the task changes to data submitted. This flow keeps annotation work orderly.
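As a sketch, the lifecycle above can be modeled as a small state machine. The status names and the `advance` helper are illustrative, not the platform's actual API; note that a failed review is the one transition that moves a task backward.

```python
from enum import Enum

class TaskStatus(Enum):
    PENDING_START = "pending_start"
    IN_PROGRESS = "in_progress"
    PENDING_REVIEW = "pending_review"
    REVIEW_PASSED = "review_passed"
    DATA_SUBMITTED = "data_submitted"

# Allowed transitions, mirroring the lifecycle described above.
# A failed review sends the task back to IN_PROGRESS.
TRANSITIONS = {
    TaskStatus.PENDING_START: {TaskStatus.IN_PROGRESS},
    TaskStatus.IN_PROGRESS: {TaskStatus.PENDING_REVIEW},
    TaskStatus.PENDING_REVIEW: {TaskStatus.REVIEW_PASSED, TaskStatus.IN_PROGRESS},
    TaskStatus.REVIEW_PASSED: {TaskStatus.DATA_SUBMITTED},
    TaskStatus.DATA_SUBMITTED: set(),  # terminal state
}

def advance(current: TaskStatus, new: TaskStatus) -> TaskStatus:
    """Move a task to a new status, rejecting illegal jumps."""
    if new not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition: {current.value} -> {new.value}")
    return new
```

Encoding the transitions as data makes it easy to audit the workflow at a glance and to reject, say, submitting data before a review has passed.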

Task Creation and Assignment

Task Creation

When creating a task, you select the data to annotate on the data page, set the task name, description, and priority, specify the annotators and reviewers, choose the project the task belongs to, and set a completion deadline. These settings ensure the task can be executed as planned.
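The configuration fields listed above could be gathered into a single record, sketched here as a hypothetical dataclass (the field names and `validate` checks are assumptions for illustration, not the platform's schema):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AnnotationTask:
    name: str
    description: str
    priority: str            # e.g. "high" / "medium" / "low"
    data_ids: list           # data items selected on the data page
    annotators: list
    reviewers: list
    project: str
    due_date: datetime       # task completion deadline

    def validate(self) -> None:
        """Reject tasks that could never be executed as expected."""
        if not self.data_ids:
            raise ValueError("a task must reference at least one data item")
        if not self.annotators:
            raise ValueError("a task needs at least one annotator")
```

Validating at creation time catches misconfigured tasks before they reach an annotator's queue.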

Batch Operations

The system supports creating multiple annotation tasks in batch, assigning tasks to annotators in batch, modifying task attributes in batch, and deleting unneeded tasks in batch. These operations greatly improve efficiency when managing large numbers of tasks.

Progress Tracking and Monitoring

Real-time Progress

The system displays each task's completion percentage, estimated remaining time, annotation quality statistics, and efficiency trends in real time. This information helps you stay on top of task progress and adjust plans promptly.
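The completion percentage and remaining-time estimate above can be computed with a simple pace-based formula; this is a minimal sketch (the platform's actual estimator may weigh recent pace or per-annotator speed differently):

```python
from datetime import timedelta

def progress_report(total_items: int, done_items: int, elapsed: timedelta):
    """Return (completion %, naive ETA) from the average pace so far."""
    percent = 100.0 * done_items / total_items
    if done_items == 0:
        return percent, None          # no pace data yet
    pace = elapsed / done_items       # average time per annotated item
    remaining = pace * (total_items - done_items)
    return percent, remaining

pct, eta = progress_report(200, 50, timedelta(hours=2))
# pct == 25.0, eta == timedelta(hours=6) at the current pace
```

The estimate degrades gracefully: with zero completed items there is no pace to extrapolate, so no ETA is reported.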

Progress Reports

Several reports are available: personal progress (each annotator's work progress), team progress (the team's overall progress), project progress (project-level statistics), and quality reports (annotation quality analysis), meeting management needs at different levels.

Quality Control System

Review Mechanisms

The system provides several review mechanisms: rule-based automatic quality checks, manual checks by reviewers, random sampling checks, and full review of every annotation. Together, these mechanisms ensure annotation quality meets requirements.
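Random sampling review, mentioned above, amounts to drawing a fraction of completed annotations for manual inspection. A minimal sketch, assuming a configurable sampling rate (the rate and seed parameters are illustrative):

```python
import random

def sample_for_review(annotation_ids: list, rate: float = 0.1, seed=None) -> list:
    """Pick a random subset of completed annotations for manual review.

    At least one annotation is always sampled so small batches
    still get a quality check.
    """
    rng = random.Random(seed)  # seeded RNG makes sampling reproducible
    k = max(1, round(len(annotation_ids) * rate))
    return rng.sample(annotation_ids, k)
```

A fixed seed makes the sample reproducible, which is useful when a reviewer wants to revisit exactly the same audit set.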

Quality Indicators

Annotation quality is evaluated comprehensively through indicators such as pass rate (the proportion of annotations passing review on the first attempt), accuracy (how often annotations are correct), consistency (agreement between different annotators), and completeness (whether annotations are complete).
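Three of these indicators can be computed directly from per-task review records. The record fields below are hypothetical; consistency is omitted from the sketch because inter-annotator agreement requires overlapping assignments and a separate agreement metric:

```python
def quality_indicators(reviews: list) -> dict:
    """Aggregate pass rate, accuracy, and completeness from review records.

    Each record is assumed to carry: 'first_pass' (bool),
    'correct_labels'/'total_labels', and 'fields_filled'/'fields_required'.
    """
    n = len(reviews)
    pass_rate = sum(r["first_pass"] for r in reviews) / n
    accuracy = (sum(r["correct_labels"] for r in reviews)
                / sum(r["total_labels"] for r in reviews))
    completeness = (sum(r["fields_filled"] for r in reviews)
                    / sum(r["fields_required"] for r in reviews))
    return {"pass_rate": pass_rate, "accuracy": accuracy,
            "completeness": completeness}
```

Accuracy and completeness are computed as pooled ratios rather than averages of per-task ratios, so tasks with more labels weigh proportionally more.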

Team Collaboration Functions

Task Assignment

Several assignment methods are supported: intelligent assignment (based on annotator capability and current workload), manual assignment by an administrator, reassignment of tasks to other annotators, and task transfer between annotators. These options ensure tasks are distributed sensibly.
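The workload half of intelligent assignment can be sketched as greedy load balancing: each incoming task goes to whichever annotator currently has the fewest tasks. This is a simplified stand-in; the platform's actual scheduler also weighs annotator capability, which is not modeled here:

```python
import heapq

def assign_tasks(tasks: list, annotators: list) -> dict:
    """Greedy load balancing: each task goes to the least-loaded annotator."""
    # Min-heap of (current load, annotator name); ties break alphabetically.
    heap = [(0, name) for name in annotators]
    heapq.heapify(heap)
    assignment = {name: [] for name in annotators}
    for task in tasks:
        load, name = heapq.heappop(heap)
        assignment[name].append(task)
        heapq.heappush(heap, (load + 1, name))
    return assignment
```

With equal-sized tasks this spreads work evenly; a capability-aware version would replace the integer load with an estimated completion time per annotator.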

Communication and Collaboration

Collaboration features include task comments (adding comments and feedback within a task), problem reporting (annotators flagging issues they encounter), solutions (reviewers proposing fixes), and experience sharing (team members exchanging annotation tips), promoting team communication and knowledge sharing.

Data Export and Integration

Annotation Result Export

Annotation results can be exported in standard formats such as LeRobot and HDF5, or in a custom format as needed. The system also supports batch export across multiple tasks and incremental export of only new or modified annotations. These options cover the export needs of different scenarios.
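Incremental export, mentioned above, reduces to filtering annotations by a modification timestamp. A minimal sketch, assuming each annotation record carries a hypothetical `modified_at` field:

```python
from datetime import datetime

def incremental_export(annotations: list, last_export_time: datetime) -> list:
    """Keep only annotations created or modified since the last export."""
    return [a for a in annotations if a["modified_at"] > last_export_time]
```

Recording the timestamp of each export run lets repeated exports ship only the delta instead of the full result set.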

Training Data Preparation

The platform prepares training data through data cleaning (automatic cleaning and preprocessing of annotation data), format conversion (converting to the format required for model training), quality validation (verifying the exported data), and version management (tracking different versions of training data), ensuring exported data can be used directly for model training.
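The quality validation step above can be sketched as a structural check over exported records before they are handed to training. The required keys here are illustrative placeholders, not the platform's actual schema:

```python
def validate_export(records: list,
                    required_keys=("id", "label", "source")) -> list:
    """Return a list of (index, reason) pairs for records that fail
    basic pre-training checks: all required keys present, label non-empty."""
    errors = []
    for i, rec in enumerate(records):
        missing = [k for k in required_keys if k not in rec]
        if missing:
            errors.append((i, f"missing keys: {missing}"))
        elif not rec["label"]:
            errors.append((i, "empty label"))
    return errors
```

An empty error list means the batch is structurally sound; a non-empty one pinpoints exactly which records need cleaning before the data is versioned and released.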

Applicable Roles

Administrator

As a platform administrator, you can view the overall status of all tasks, manage annotator and reviewer resources, monitor overall annotation quality, and configure task workflows and rules. These functions keep the platform's task management stable and efficient.

Project Manager

Project managers can create annotation tasks for their projects, track annotation progress, monitor quality, and coordinate the work of annotators and reviewers. Through the task management module, project managers maintain effective control over a project's annotation work.

Annotator

Annotators can receive the tasks assigned to them, carry out the annotation work, update their completion progress, and report problems encountered while annotating. These functions help annotators complete their tasks efficiently.

Reviewer

Reviewers can review tasks completed by annotators, evaluate annotation quality, give feedback to annotators, and establish and update annotation standards. This role is essential to ensuring annotation quality.