
Spinoff Blog Project
――Annotation that supports AI in the DX era: the analog reality of the field
Why Human Science does not use crowdsourcing
Until now, our company has published a variety of blogs on annotation and AI, focusing on general knowledge and know-how. Although annotation may sound simple when described in words, it is work that cannot be done without people, and its inherent ambiguity means it inevitably involves a great deal of human interaction. As a result, it is often messier than the tidy theories commonly found elsewhere would suggest, and ensuring quality and productivity actually requires a wide range of experience and know-how.
Understanding the specific problems that arise in the actual annotation field, and how to address them, can therefore provide useful hints for making annotation succeed.
In this series, we want to convey what actually happens on-site and the specific responses and measures we take. Unlike our regular blogs and columns, we aim to share the real conditions of our workplace, including our unique characteristics and commitments, under the title of the Spinoff Blog Project.
>> Past Published Blogs (Selected)
1. Human Science's Annotation System That Does Not Use Crowdsourcing
Needless to say, if you understand the merits and demerits of crowdsourcing and use it effectively, you can develop AI more efficiently; previous blogs have offered tips on how to do so. One significant advantage of crowdsourcing is the ability to mobilize many workers on a short timeframe and budget. Securing the right talent from a pool of tens of thousands of crowd workers lets you move quickly and collect diverse data.
So, do we utilize crowdsourcing at our company? The truth is, we do not.
While crowdsourcing certainly offers significant benefits, our company would gain little from them, and it does not fit the values we hold dear in our work or our craftsmanship-oriented (monozukuri) corporate culture. We therefore believe that, in the end, annotation is better done by directly contracted workers. To that end, we conduct trials during hiring, maintain a system for assigning annotators according to the characteristics of each job, and train workers at the start of every project.
2. Why Human Science Does Not Use Crowdsourcing
This is not universally true, but when you exploit crowdsourcing's strength of completing work quickly with many workers, it becomes difficult to maintain the close two-way communication between the project manager and the workers needed for training and for confirming how well the workers understand the tasks. Annotation is labor-intensive work, so the workers' loyalty and motivation toward the company strongly influence quality and productivity. Because the loyalty and motivation that grow out of close communication and familiarity ultimately stabilize quality and improve productivity, we believe the work is better done by annotators contracted directly by our company.
Some crowdsourcing platforms do make it possible to secure talent at low cost, but when the effective hourly wage falls below the minimum wage, workers lose the motivation to keep working; this has even become a social issue in some Western countries. In business, forcing one partner to absorb dissatisfaction or simply put up with poor conditions raises questions about the sustainability of the business itself.
Our information security management system has been highly evaluated, so we often take on work with strict security requirements, which in turn means more tasks must be performed in our internal secure room. Crowdsourcing, which is premised on remote work, has difficulty meeting such requirements. On the training side, tool-based measures against information leakage and NDAs alone are not enough, and there are limits to how far rules and awareness campaigns can go. Ongoing education for workers and individual guidance suited to each situation are therefore necessary, and in this respect, too, we believe our directly contracted annotators are the best way to carry out the work.
3. Our Commitment to Communication for Human Resource Development
To balance the conflicting goals of ensuring quality while raising productivity, our company is committed not only to improving work processes and automating annotation, but also to "annotation by elite teams" and "emphasizing annotator education." Specifications and work content differ substantially from project to project, yet there are common "key points of annotation" that run through all annotation work. Cultivating them requires close communication with annotators, continuous education, and accumulated experience. Once these qualities are developed, the time needed to understand and learn each new project drops significantly. This tendency is even stronger in high-difficulty annotation tasks, where such qualities are essential for achieving high quality and high productivity.
Such an educational process is hard to implement through crowdsourcing, where ongoing engagements are generally difficult. With direct contracts and close relationships with annotators, by contrast, workers offer more proactive suggestions, and their experience and know-how accumulate not only in the workers themselves but also, through the PM, as organizational knowledge, ultimately leading to efficient, high-quality, highly productive annotation.
4. Summary
So far, we have described the benefits of our direct-contract annotator system. We value connections between people, not only with our clients but also with our partner annotators. This is not just a characteristic of our department but part of our corporate culture (something that greatly surprised me when I joined Human Science). Honestly, it is genuinely enjoyable to deepen relationships and move work forward while communicating with people, and nothing pleases me more than hearing an annotator say, "I want to keep working at Human Science; it is easy to work here."

That said, there are unfortunately times when neither party ends up satisfied, and we can spend a great deal of time on training and on sudden staffing needs. Purely in terms of efficient staffing, short delivery times, and data collection, crowdsourcing might be the better choice. And because we take things seriously, we sometimes feel we spend too much time on follow-ups, communication, and quality assurance.

Of course, the pursuit of efficiency is a corporate imperative that must not be forgotten, and we are constantly exploring possibilities and honing our skills. But we strive to pursue efficiency without lapsing into self-satisfaction, never neglecting the "quality" and "people" that can be called the conscience of a company.
Author:
Kazuhiro Sugimoto
Annotation Department Group Manager
・At my previous job at a Tier 1 automotive parts manufacturer, I worked on quality design and quality improvement guidance for manufacturing lines, and I have experience as a project manager for model line construction and in cross-departmental projects such as business efficiency (lean improvement) consulting teams.
・In my current position, after working on management systems such as ISO and promoting knowledge management, I have been involved in establishing and expanding the annotation business and in building and improving the management system for annotation projects.
QC Level 1, Member of the Japan Society for Quality Control