
Explainable AI (XAI) spotlights factors influencing algorithms

Explainable AI (XAI) emphasises not just what output an algorithm produces, but also how it works with the user and how that output or conclusion is reached. XAI approaches shine a light on the algorithm’s inner workings to show the factors that influenced its output. The idea is for this information to be available in a human-readable way, rather than hidden within code.
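
To make this concrete, the sketch below is purely illustrative and is not drawn from the ACCA report: it assumes the scikit-learn library and made-up factor names, and shows one common way of surfacing the factors behind a model’s output in human-readable form, by ranking each input factor according to how much the model’s accuracy drops when that factor is scrambled.

    # Illustrative sketch only, not drawn from the ACCA report.
    # Assumes scikit-learn is available; the data and factor names are hypothetical.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance

    # Hypothetical lending-decision data: four input factors, one yes/no outcome.
    X, y = make_classification(n_samples=500, n_features=4, n_informative=4,
                               n_redundant=0, random_state=0)
    factor_names = ["income", "debt_ratio", "payment_history", "account_age"]

    model = RandomForestClassifier(random_state=0).fit(X, y)

    # Permutation importance: how much does shuffling each factor hurt accuracy?
    result = permutation_importance(model, X, y, n_repeats=10, random_state=0)

    # Report the factors in plain language, ranked by their influence on the output.
    for name, score in sorted(zip(factor_names, result.importances_mean),
                              key=lambda pair: -pair[1]):
        print(f"{name}: importance {score:.3f}")

Output of this form, a ranked list of factor names with scores, is closer to something a practitioner can question than the raw parameters of the model itself.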

ACCA’s latest report, Explainable AI, addresses explainability from the perspective of accountancy and finance practitioners. Head of Business Insights Narayanan Vaidyanathan said, ‘It is in the public interest to improve understanding of XAI, which helps to balance the protection of the consumer with innovation in the marketplace.’

The complexity, speed and volume of AI decision-making often obscure what is going on in the background (the black box), making the model difficult to interrogate. Explainability, or the lack of it, affects the ability of professional accountants to understand outputs and to exercise scepticism. In a recent ACCA survey, 54% of respondents agreed with this statement, more than double the proportion who disagreed.

Vaidyanathan continued, ‘It’s an area that’s relevant to being able to trust technology and to be confident that it’s used ethically, and XAI can help in this scenario. It’s helpful to think of it as a design principle as much as a set of tools. Moreover, this is AI decoded, and designed to augment the human ability to understand and interrogate the results returned by the model.’

Key messages for practitioners:

  • Maintain awareness of evolving trends in AI: 51% of respondents were unaware of XAI, which impairs their ability to engage with it. The report sets out some of the key developments in this emerging area to help raise awareness.
  • Beware of oversimplified narratives: In accountancy, AI isn’t fully autonomous, but nor is it a complete fantasy. The middle path of augmenting, rather than replacing, the human works best when the human understands what the AI is doing, which requires explainability.
  • Embed explainability into enterprise adoption: Consider the level of explainability needed, and how it can help with model performance, ethical use and legal compliance.

Policy makers, for instance in government or at regulators, frequently hear the developer/supplier perspective from the AI industry. This report can complement that with a view from the user/demand side, so that policy can incorporate consumer needs. 

The report’s key messages for policy makers are:

  • Explainability empowers consumers and regulators: improved explainability reduces the deep information asymmetry between experts who understand AI and the wider public. For regulators, it can help reduce systemic risk by improving understanding of the factors influencing algorithms that are increasingly being deployed across the marketplace.
  • Emphasise explainability as a design principle: An environment that balances innovation and regulation can be achieved by supporting industry to continue, indeed redouble, its efforts to include explainability as a core feature in product development.

Narayanan Vaidyanathan added, ‘XAI can be polarising, with some having unrealistic expectations for it to be like magic and answer all questions, while others are deeply suspicious of what the algorithm is doing in the background. XAI seeks to bridge this gap by improving understanding, to manage unrealistic expectations and to give a level of comfort and clarity to the doubters.’
