Human annotators with the right profile and experience reduce cost while increasing quality and data throughput.
Sigma does not crowdsource; instead, it maintains a large database of vetted, trained annotators.
Sigma builds a dedicated team for each project.
Sigma relies on highly trained human annotators instead of crowdsourcing the data annotation work. Crowdsourcing often yields varying levels of quality, which requires extensive quality assessment (QA). While QA helps increase accuracy, it also raises the cost and lengthens project turnaround time.
Human annotators with the right profile and experience reduce the overall cost of the project and make annotated data available sooner, thanks to higher throughput and the time saved in QA.
High-quality data annotation requires annotators who have attention to detail and patience and, very importantly, who strictly follow the annotation guidelines. Our more than 30 years of experience in data annotation shows that not everyone has the particular skills and mindset to produce consistent, high-quality annotations.
However, selecting candidates with the right profile takes time. This is why Sigma has established a continuous candidate selection process and maintains an extensive database of vetted data collection and annotation professionals. Thanks to this process, Sigma can scale quickly with the most appropriate professionals for each project.
Having the right professionals does not have to increase the cost. Choosing suitable candidates is the best way to obtain the right data, with the required quality, at the right time, and in a cost-efficient manner.