February 06, 2024

The opportunities afforded by AI are many, but explainability may well be the key to uptake

Roger Portnoy

CHIEF STRATEGY OFFICER AT OBJECTWAY

Reading time: 5 min

OWINTALK | BEHIND BUSINESS, BEYOND NEWS

Objectway features in The Wealth Mosaic WealthTech Landscape Report 2024, diving deep into the industry’s AI journey: the report emphasises both AI’s potential benefits and its challenges, runs through practical use cases, and explains how explainability works.

Banks, wealth managers, and asset managers alike are rapidly recognising AI as one of the most strategically essential technologies for the wealth management industry to leverage. Indeed, the potential for AI to enhance profitability and improve client experiences is undeniable, driving around 80% of banks to invest in AI, with plans for increased spending over the next two years.

While the willingness to embrace AI is high, operationalising and monetising its benefits remains a challenge for many players. Middle and back-office operations are key areas where AI can drive efficiencies, with robotic process automation and intelligent process design already making significant strides. Advanced OCR, computer vision, and NLP techniques are further unlocking opportunities for hyper-automation, particularly in data processing and document analysis.

The suitability process, a critical aspect of client interaction, is gaining prominence, requiring efficient data management and processing. AI, specifically computer vision and NLP, proves invaluable in improving alignment with product governance, ensuring compliance, and driving business gains.

Generative AI is at the forefront of enhancing adviser-client relationships, supporting decision-making processes, and personalising offerings. The digital transformation facilitated by AI allows wealth managers to turn disparate bits of data into reusable knowledge assets, providing a holistic picture for better decision-making.

AI’s role extends to portfolio management, where it can help maximise returns and increase customer satisfaction through the construction of better portfolios and the processing of unstructured data. However, as interest in AI grows, the need for explainability becomes paramount.

Wealth managers need to articulate and document how AI algorithms make decisions, addressing concerns related to data privacy, embedded bias, and transparency. Explainability is not just about being right or wrong but understanding the rationale behind decisions, making it crucial for compliance and client trust.

Two approaches to explainability are highlighted: the first unpacks the process by which a decision is reached, while the second examines the attribution, sensitivity, and contribution of factor inputs. The latter proves vital in predicting client behaviour, illustrated through an innovative customer churn prediction system based on machine learning that applies Shapley Additive Explanations (SHAP), a cutting-edge, model-independent method that provides explanations for the output of any ML model.
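To make the attribution idea concrete, here is a minimal sketch of how Shapley values assign each feature a share of a model's output. The toy churn-scoring function and feature names below are hypothetical illustrations, not Objectway's actual system; a production setup would apply the `shap` library's approximations to a trained ML model rather than enumerating subsets exactly.

```python
from itertools import combinations
from math import factorial

# Hypothetical churn-risk features for illustration only.
FEATURES = ["inactivity_days", "fee_complaints", "portfolio_drawdown"]

def churn_score(present):
    """Toy additive model: score a client using only the features in
    `present`; absent features contribute a baseline of 0 (a deliberate
    simplification of SHAP's background-distribution averaging)."""
    weights = {"inactivity_days": 0.4,
               "fee_complaints": 0.35,
               "portfolio_drawdown": 0.15}
    return sum(w for name, w in weights.items() if name in present)

def shapley_values(model, features):
    """Exact Shapley values: each feature's average marginal
    contribution over all subsets of the other features. Exponential
    in feature count, so feasible only for small feature sets."""
    n = len(features)
    values = {}
    for f in features:
        others = [g for g in features if g != f]
        total = 0.0
        for k in range(n):
            for subset in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (model(set(subset) | {f}) - model(set(subset)))
        values[f] = total
    return values

phi = shapley_values(churn_score, FEATURES)
# Efficiency property: the contributions sum to the full-model score.
assert abs(sum(phi.values()) - churn_score(set(FEATURES))) < 1e-9
```

Because this toy model is additive, each feature's Shapley value simply equals its weight; the value of the method shows in non-additive models (gradient boosting, neural networks), where SHAP tooling approximates these averages efficiently instead of enumerating every subset.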

Continuous testing and optimisation of explanatory models are emphasised as necessary to overcome the limitations of current AI systems and to ensure effective interaction with users. Ultimately, adopting explainable AI systems is crucial for mitigating risks, maintaining positive business outcomes, enhancing operational performance, and improving the overall client experience in the rapidly evolving wealth management landscape.

This is just a taste of the insightful considerations shared by Roger. Access the full report HERE!

Delve deeper into the transformative power of AI in wealth management and stay ahead in this ever-evolving landscape.
