Gartner predicts that by 2023, 75% of large organizations will hire AI behavior forensic, privacy and customer trust specialists to reduce brand and reputation risk. Bias based on race, gender, age or location, and bias rooted in a specific structure of data, have been long-standing risks in training AI models.
“New tools and skills are needed to help organizations identify these and other potential sources of bias, build more trust in using AI models, and reduce corporate brand and reputation risk,” said Jim Hare, research vice president at Gartner. “More and more data and analytics leaders and chief data officers (CDOs) are hiring ML forensic and ethics investigators.”
Increasingly, sectors like finance and technology are deploying combinations of AI governance and risk management tools and techniques to manage reputation and security risks. In addition, organizations such as Facebook, Google, Bank of America, MassMutual and NASA are hiring or have already appointed AI behavior forensic specialists who primarily focus on uncovering undesired bias in AI models before they are deployed.
“While the number of organizations hiring ML forensic and ethics investigators remains small today, that number will accelerate in the next five years,” added Mr. Hare.
Some organizations have launched dedicated AI explainability tools to help their customers identify and fix bias in AI algorithms. Commercial AI and ML platform vendors are adding capabilities to automatically generate model explanations in natural language. There are also open-source technologies such as Local Interpretable Model-Agnostic Explanations (LIME) that can look for unintended discrimination before it gets baked into models.
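LIME's core idea is to probe a black-box model around a single prediction: perturb the input, watch how the prediction changes, and fit a simple, proximity-weighted local model whose coefficients reveal which features drove the outcome. The sketch below illustrates only that underlying idea in plain Python (the actual LIME library provides classes such as a tabular explainer with more sophisticated sampling and fitting); the toy `black_box` scoring function and all numbers are invented for illustration, with feature 1 standing in for an unwanted, bias-like dependence.

```python
import random

# Toy "black-box" classifier: scores an applicant from two features,
# with a deliberately unwanted negative weight on feature 1
# (a stand-in for a biased dependence we want to uncover).
def black_box(x):
    return 1.0 if 0.8 * x[0] - 0.5 * x[1] > 0.3 else 0.0

def local_feature_weights(model, instance, n_samples=2000, scale=0.3):
    """Approximate each feature's local influence on one prediction:
    perturb the instance, weight samples by proximity, and compute a
    weighted slope per feature -- the core idea behind LIME, not the
    library's exact algorithm."""
    random.seed(0)
    rows = []
    for _ in range(n_samples):
        x = [v + random.gauss(0.0, scale) for v in instance]
        d2 = sum((a - b) ** 2 for a, b in zip(x, instance))
        w = 2.718281828 ** (-d2 / (2 * scale * scale))  # proximity kernel
        rows.append((x, model(x), w))
    sw = sum(w for _, _, w in rows)
    weights = []
    for j in range(len(instance)):
        mx = sum(w * x[j] for x, _, w in rows) / sw
        my = sum(w * y for _, y, w in rows) / sw
        cov = sum(w * (x[j] - mx) * (y - my) for x, y, w in rows) / sw
        var = sum(w * (x[j] - mx) ** 2 for x, _, w in rows) / sw
        weights.append(cov / var if var > 0 else 0.0)
    return weights

# Explain one borderline prediction: the fitted local weights should be
# positive for feature 0 and negative for feature 1, exposing the
# model's unwanted dependence on the second feature.
w = local_feature_weights(black_box, [0.5, 0.2])
print(w)
```

An AI behavior forensic specialist would run a check like this on instances near the decision boundary: a large local weight on a sensitive attribute (or a proxy for one) is exactly the kind of unintended discrimination the article describes catching before a model is deployed.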
“Data and analytics leaders must also establish accountability for determining and implementing the levels of trust and transparency of data, algorithms and output for each use case. They must include an assessment of AI explainability features when assessing analytics, business intelligence, data science and ML platforms,” said Mr. Hare.