
If you’re not an AI behaviour forensic expert, you should think about becoming one

There are currently myriad reports and studies into the jobs of the future, with data, artificial intelligence and machine learning now forcing organisations and people alike to rethink which professions will hold value in the near future.

One of those coveted jobs could be AI behaviour forensics expert, according to the latest study by research firm Gartner. More specifically, the firm predicts that as many as 75 percent of large organisations will look to hire an AI behaviour forensics expert by 2023.

This comes as user trust in AI and machine learning-based solutions is expected to plummet in the coming years, especially as incidents of privacy breaches and data misuse keep occurring, the firm notes.

Bias is also a contributing factor here, Gartner explains.

Inherent problems

“Bias based on race, gender, age or location, and bias based on a specific structure of data, have been long-standing risks in training AI models. In addition, opaque algorithms such as deep learning can incorporate many implicit, highly variable interactions into their predictions that can be difficult to interpret,” the firm points out.

It is these kinds of factors that will prompt companies to hire experts who can ensure that, while AI and machine learning deliver the required business outcomes, they aren't causing undue negative side effects along the way.
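To make the bias risk Gartner describes more concrete, below is a minimal sketch of the kind of check an AI behaviour forensics expert might run: comparing a model's positive-prediction rates across demographic groups and flagging a low disparate-impact ratio. The data, group labels and the 0.8 threshold (the so-called "four-fifths rule") are illustrative assumptions, not part of the Gartner study.

# Minimal, illustrative bias check: compare positive-prediction rates per group.

def selection_rates(predictions, groups):
    """Return the share of positive predictions for each group."""
    counts = {}
    for pred, group in zip(predictions, groups):
        total, positives = counts.get(group, (0, 0))
        counts[group] = (total + 1, positives + (1 if pred == 1 else 0))
    return {g: positives / total for g, (total, positives) in counts.items()}

def disparate_impact(rates):
    """Ratio of the lowest to the highest selection rate (the 'four-fifths rule' heuristic)."""
    return min(rates.values()) / max(rates.values())

# Example: a model that approves 70% of group A but only 40% of group B.
preds  = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
groups = ["A"] * 10 + ["B"] * 10

rates = selection_rates(preds, groups)
print(rates)                    # {'A': 0.7, 'B': 0.4}
print(disparate_impact(rates))  # ~0.57, below the 0.8 heuristic, so worth investigating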

“New tools and skills are needed to help organisations identify these and other potential sources of bias, build more trust in using AI models, and reduce corporate brand and reputation risk,” says Jim Hare, research vice president at Gartner. 

“More and more data and analytics leaders and chief data officers (CDOs) are hiring ML forensic and ethics investigators,” he adds.

It’s also a developing situation that the industry is acutely aware of, according to Gartner.

“Some organisations have launched dedicated AI explainability tools to help their customers identify and fix bias in AI algorithms,” the firm says. “Commercial AI and machine learning platform vendors are also adding capabilities to automatically generate model explanations in natural language.”
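As an illustration of what such explainability capabilities can look like in practice, the sketch below uses scikit-learn's permutation importance to rank a model's features and print a simple plain-language summary. This is an assumed, generic workflow rather than a description of any specific vendor's tool; the dataset and wording are illustrative.

# Illustrative explainability workflow: rank features by permutation importance
# and report the top drivers of the model's predictions in plain language.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature in turn and measure how much the test accuracy drops.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

top = sorted(zip(X.columns, result.importances_mean), key=lambda p: p[1], reverse=True)[:3]
for name, score in top:
    print(f"Prediction quality drops by {score:.3f} when '{name}' is shuffled.")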

What to consider

Hare says that the task now falls to data and analytics leaders, as well as CDOs, to monitor how artificial intelligence and machine learning are implemented in their businesses, and to avoid any potentially fatal missteps in this regard.

“They must make ethics and governance part of AI initiatives and build a culture of responsible use, trust and transparency. Promoting diversity in AI teams, data and algorithms, and promoting people skills is a great start,” stresses Hare.

“Data and analytics leaders must also establish accountability for determining and implementing the levels of trust and transparency of data, algorithms and output for each use case. It is necessary that they include an assessment of AI explainability features when assessing analytics, business intelligence, data science and ML (machine learning) platforms,” he concludes.

It therefore stands to reason that if you're interested in a career involving artificial intelligence, and the role of AI behaviour forensics expert was not on your radar before, it definitely should be now.

[Image – Photo by Franck V. on Unsplash]
