As artificial intelligence (AI) development continues to boom and the technology grows more capable, people increasingly fear for the security of their jobs. However, a new Massachusetts Institute of Technology (MIT) study shows that the cost of deploying the technology likely makes it more economical for your employer to keep you, at least for now.
When people think of jobs being replaced by AI, they tend to consider only how good the technology is at performing human tasks. For that reason, many of the studies and headlines you see focus on which tasks AI could automate, only exacerbating feelings of job insecurity.
Also: Why Agile doesn’t work for most IT pros: The bigger you are, the harder you fall
However, the MIT study, Beyond AI Exposure, shifts from that typical approach and accounts for an overlooked factor — the cost.
In the study, the five researchers first surveyed workers to understand what level of performance would be required of an automated system. They then modeled the cost of building an AI system capable of that performance and, finally, assessed whether adopting it would be economically attractive. It’s worth noting that the study looks explicitly at tasks that can be automated with computer vision, meaning visual tasks such as inspecting goods for quality.
An economically grounded estimate of task automation is a better indicator of how roles will evolve since, ultimately, implementing these elaborate AI systems is expensive, and companies only invest in technology that offers a return on investment.
To explain why considering cost is so essential, the study uses the example of a small bakery that might be considering using computer vision to automate visually checking its ingredients for quality.
Also: How tech professionals can survive and thrive at work in the time of AI
Since this task makes up only 6% of a baker’s duties, a small bakery with five bakers each earning $48,000 a year could save roughly $14,000 annually by automating it. The cost of deploying a computer vision system is far higher than that saving, so automating the task would not be financially sound even though the technology is available.
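The bakery arithmetic can be sketched in a few lines of Python. The worker numbers come from the article; the deployment cost is a hypothetical placeholder, since the study does not publish a single system price:

```python
def automation_savings(num_workers, annual_wage, task_share):
    """Annual wages freed up by automating a task that occupies
    `task_share` of each worker's time."""
    return num_workers * annual_wage * task_share

# Figures from the bakery example: five bakers at $48,000/year,
# with the inspection task taking 6% of their time.
savings = automation_savings(num_workers=5, annual_wage=48_000, task_share=0.06)
print(f"Annual savings: ${savings:,.0f}")  # ~$14,400, which the article rounds to $14,000

# A vision system that costs more than this per year to build, deploy,
# and maintain would not pay for itself. $50,000 is a hypothetical cost.
deployment_cost = 50_000
print("Worth automating:", savings > deployment_cost)
```

The comparison is deliberately simple: the study's framework weighs wage savings against system cost in essentially this way, just with far more careful cost modeling.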
The 45-page paper goes through the framework and methods leveraged to conduct the study, as well as the results, which should help put your job concerns at ease.
“We find that only 23% of worker compensation ‘exposed’ to AI computer vision would be cost-effective for firms to automate because of the large upfront costs of AI systems,” the study states.
Put another way, 77% of vision tasks are not worth automating, because a system deployed at the level of a single firm costs more than it would save.
Also: Despite all the AI hype, success depends on just one thing
The study acknowledges that as the technology matures, the cost of deploying it will fall. However, even with cost decreases of 20% per year, it would still take decades before using computer vision becomes economical for most firms.
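To see why a 20% annual cost decline still implies a long wait, here is a minimal sketch of the compounding. Both dollar figures are hypothetical illustrations, not numbers from the study:

```python
def years_until_economical(initial_cost, annual_savings, annual_decline=0.20):
    """Years until a system's cost, falling by `annual_decline` per year,
    drops below the annual savings it would generate."""
    cost, years = initial_cost, 0
    while cost > annual_savings:
        cost *= (1 - annual_decline)
        years += 1
    return years

# Hypothetical example: a $165,000 system vs. $14,400/year in savings.
print(years_until_economical(165_000, 14_400))  # 11 years at a 20% annual decline
```

Because the decline is multiplicative, each year shaves off less in absolute terms, which is why even steep percentage drops take many years to close a large cost gap.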
Although the study doesn’t account for other AI uses, such as text generation, the researchers envision their framework being used to investigate other areas beyond computer vision.