
Expert Review with Workflows

December 5, 2023
5 mins

Introduction

Expert review workflows are crucial for accurate and successful annotation projects, ensuring high data quality, efficient task allocation, and time savings. In this walkthrough, you’ll learn how to customize workflows to facilitate expert review and improve collaboration. As the AI and computer vision landscapes evolve, expert review workflows help you maintain data integrity, ensure optimal model performance, and stay flexible for labeling demands you can’t yet anticipate.

Understanding Workflows

Workflows are systematic processes (or graphs) that define how tasks are organized, assigned, routed, and automated within an annotation project. They provide a structured way of handling various stages of a project, ensuring that each step is completed efficiently and in the correct order while tracking performance at each step.
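To make the graph idea concrete, here is a minimal, hypothetical sketch in Python. The stage names and structure are illustrative only and do not represent Encord’s internal data model.

```python
# A workflow is a directed graph of stages. This hypothetical adjacency list
# mirrors the stages discussed later in this post.
WORKFLOW_GRAPH = {
    "Start": ["Annotate"],
    "Annotate": ["Review"],
    "Review": ["Router"],
    "Router": ["Expert Review", "Complete"],    # a router splits the pathway
    "Expert Review": ["Complete", "Annotate"],  # rejections loop back to annotation
    "Complete": [],
}

def downstream_stages(stage: str) -> list[str]:
    """Return the stages a task can move to from the given stage."""
    return WORKFLOW_GRAPH[stage]

print(downstream_stages("Router"))  # -> ['Expert Review', 'Complete']
```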

Expert Review

With the importance of training data ever-increasing, expert review workflows ensure the highest quality of annotations, in turn leading to improved model performance. Expert review ensures data meets the required standard by having subject matter experts thoroughly check and validate a subset of the annotations created.

Benefits of Expert Review Workflows

Expert review workflows offer a range of benefits that contribute to the success of data-centric projects:

  • Improved Data Quality: Expert review ensures that data is accurate and error-free, leading to more reliable models and results.
  • Efficient Task Allocation: Workflows help allocate tasks to the right experts, ensuring that each annotation or review is handled by the most qualified individuals.
  • Error Detection and Correction: Issues can be identified and addressed promptly during the review process, preventing them from propagating further in the project.
  • Time and Resource Savings: Automation within workflows streamlines the process, reducing the time and effort required for manual coordination and ensuring experts aren’t wasting their time on menial tasks.

Setting up Expert Review Workflows with Encord

Create a New Workflow Template

First, navigate to the "Workflow Templates" section and click the "+ New workflow template" button. For this walkthrough, we will create a workflow for an object detection model.

Configuring the Workflow Template

In the center of the screen, you will find the edit button. Clicking it opens the workflow library on the right-hand side. This library contains the components you use to build your workflow.

Let’s look at each of these components as we add them to our insect detection project.

Start Stage

The Start stage is where your project begins, offering a clear overview of the project's foundation and helping team members understand the data they’ll be working with.

Annotate Stage

This stage is the heart of the workflow, where data is annotated. The stage includes all annotators by default. To choose specific annotators, click the Annotate component, go to the Selective tab, enter the user's email, and select from the list. Only collaborators added via the project-level Manage collaborators setting will be available. The optional Webhook feature adds a layer of real-time notifications, enhancing project monitoring.
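If you enable the Webhook feature, your endpoint is notified as tasks move through the stage. The sketch below is illustrative only: the route name and payload fields (task_id, stage) are assumptions, so check the Encord webhook documentation for the actual schema.

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/encord-webhook", methods=["POST"])
def handle_workflow_event():
    # Field names here are assumptions for illustration only.
    event = request.get_json(force=True)
    print(f"Task {event.get('task_id')} reached stage {event.get('stage')}")
    # React to the event, e.g. notify a reviewer or update a dashboard.
    return "", 204

if __name__ == "__main__":
    app.run(port=8000)
```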

Review Stage

Multiple review stages can be included within a project, each with its own set of reviewers and routing conditions, helping to establish a structured process in which subject matter experts validate annotations and detect errors.

Strict Review

With strict review, tasks stay put after label approval or rejection, giving reviewers time for adjustments and the ability to add comments for missing annotations. This provides reviewers with an additional opportunity to evaluate and, potentially, revise their judgments. This added layer of scrutiny helps to maintain accuracy and quality.

Router

A Router divides the pathway that annotation and review tasks follow within the workflow. You can choose between two router types for your project:

Percentage Router

Allocates annotations based on defined percentages, which is useful for distributing tasks predictably and controlling the workload split between different stages or groups of reviewers.
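Conceptually (this is not Encord's implementation), a percentage router behaves like a weighted choice over its outgoing branches. A minimal sketch, assuming hypothetical branch names:

```python
import random

def percentage_route(branches: dict[str, float]) -> str:
    """Pick an outgoing branch with probability proportional to its percentage.

    `branches` maps branch name -> percentage share,
    e.g. {"expert_review": 20, "complete": 80}.
    """
    names = list(branches)
    weights = list(branches.values())
    return random.choices(names, weights=weights, k=1)[0]

# Send roughly 20% of tasks to expert review and 80% straight to completion.
print(percentage_route({"expert_review": 20, "complete": 80}))
```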

Collaborator Router

Customize annotation workflows based on collaborators to assign tasks strategically, ensuring alignment with expertise and responsibilities, and providing flexibility for diverse collaborators. For instance, a new annotator, Chris, may have his tasks automatically routed to an expert review queue, with pathology annotations assigned to Dr. Smith and radiology annotations to Dr. Johnson. This approach optimizes the workflow, maintains quality through expert review, and allows flexibility for exceptions, enhancing collaboration in diverse teams.
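In essence, a collaborator router is a lookup from a task's annotator (and, in the example above, the annotation's specialty) to the next review queue. The emails and queue names below are hypothetical; the sketch only illustrates the routing logic:

```python
# Hypothetical routing: a new annotator's tasks always go to expert review,
# and the expert queue is chosen by the annotation's specialty.
NEW_ANNOTATORS = {"chris@example.com"}
EXPERT_REVIEW_QUEUES = {
    "pathology": "pathology_expert_review",   # reviewed by Dr. Smith
    "radiology": "radiology_expert_review",   # reviewed by Dr. Johnson
}

def route_task(annotator: str, specialty: str) -> str:
    """Route a new annotator's task to the matching expert review queue."""
    if annotator in NEW_ANNOTATORS:
        return EXPERT_REVIEW_QUEUES.get(specialty, "standard_review")
    return "standard_review"

print(route_task("chris@example.com", "radiology"))  # -> "radiology_expert_review"
```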

Now that we've covered each element of the workflow, let's explore an instance of a workflow designed for object detection.

Using Workflows in Annotation Projects

To understand the integration of workflows in annotation projects, let's create an annotation project for an insect detection model with the following steps:

  1. Select and name the annotation project.

  2. Add the insect dataset. You can also create a new dataset here.

  3. Add the ontology for the annotation project.

  4. For quality assurance, add a workflow, either by creating a new one or using an existing setup.

  5. You are ready to start annotating! Select your annotation project and open the Summary.

The Summary provides an overview of your workflow project, displaying the status of tasks in each workflow stage, and offering a high-level visual representation of project progress.

  6. Navigate to the Queue for task management and labeling initiation, with options tailored to user permissions.

The Queue encompasses the Annotator's Queue, the Reviewer's Queue, and the Annotator & Reviewer, Admin, and Team Manager Queues. Users can filter, initiate, and assign tasks as needed, and this functionality varies based on the user's role. Admins and Team Managers can assign and release tasks, ensuring efficient task management within the project.

  7. Select the Start Labeling button to annotate your dataset.

Label your dataset!
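If you prefer to script the project setup above instead of clicking through the UI, the Encord Python SDK can create projects programmatically. The snippet below is only a sketch: it assumes your dataset, ontology, and SSH key already exist, the hashes are placeholders, and the exact parameters (including how a workflow template is attached) should be confirmed against the current SDK documentation.

```python
from encord import EncordUserClient

# Authenticate with the SSH private key registered with your Encord account.
with open("/path/to/encord-ssh-key", "r") as key_file:
    user_client = EncordUserClient.create_with_ssh_private_key(key_file.read())

# Create the annotation project from an existing dataset and ontology.
# "<dataset-hash>" and "<ontology-hash>" are placeholders for your resource IDs.
project_hash = user_client.create_project(
    project_title="Insect detection",
    dataset_hashes=["<dataset-hash>"],
    ontology_hash="<ontology-hash>",
)
print(f"Created project {project_hash}")
```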

Once the data has been annotated, reviewers also find the labeled data awaiting review in the Queue.

Reviewers can perform bulk actions on multiple reviews at once.

Once the review is complete, any rejected images can again be found in the Annotator’s queue. The reason for rejection can also be specified, and the annotator must resolve the issue before submitting the re-annotated data.

The approved images are found in the expert review queues.

Once all the reviews are accepted, the annotation is complete!
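The loop described above (annotate, review, rework on rejection, expert review, complete) can be summarized as a small state machine. The sketch below is a simplified illustration, not Encord's actual data model:

```python
# Simplified task lifecycle: annotate -> review -> (rework | expert review) -> complete.
TRANSITIONS = {
    ("annotate", "submit"): "review",
    ("review", "reject"): "annotate",        # rejected tasks return to the annotator's queue
    ("review", "approve"): "expert_review",  # approved tasks move on to expert review
    ("expert_review", "approve"): "complete",
    ("expert_review", "reject"): "annotate",
}

def advance(state: str, action: str) -> str:
    """Move a task to its next state, raising on an invalid transition."""
    try:
        return TRANSITIONS[(state, action)]
    except KeyError:
        raise ValueError(f"Cannot {action!r} a task in state {state!r}")

state = "annotate"
for action in ("submit", "reject", "submit", "approve", "approve"):
    state = advance(state, action)
print(state)  # -> "complete"
```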

Multiple review stages in the annotation project contribute to refining the dataset, aligning it with the desired standards and objectives of the project. The flexibility to perform bulk actions on multiple reviews simultaneously streamlines the review workflow, and the ability to specify reasons for rejection provides valuable feedback to annotators.

Wrapping Up

In conclusion, expert review workflows play a pivotal role in ensuring the accuracy and success of data-centric projects like annotating an insect detection model. These workflows offer benefits such as improved data quality, efficient task allocation, and time savings. As technology advances, the importance of expert review workflows in maintaining data integrity becomes increasingly evident. They are an essential component in the evolving landscape of data-driven projects, ensuring optimal model performance.

Written by Akruti Acharya