Are the stars aligning? Rethinking reporting for the age of AI

Post # 92

August 6, 2025

Claire Bodanis

With thanks to the IR Society for commissioning a version of this blog for the Summer edition of their magazine, Informed.

Awash with reporting consultations and discussions, and with AI marching swiftly on, Claire believes the time is right to rethink reporting from first principles – to ensure that it achieves its purpose for investors and other stakeholders, while making it much simpler and easier to produce. Claire’s gathering support for her proposal, so if you’re interested in supporting it or hearing more, please contact her – claire@falconwindsor.com.

Like many who work in reporting, in the last year I seem to have spent more time talking and writing about it than I have actually doing it. There was the UK Financial Reporting Council’s digital reporting consultation in November. The EU’s Omnibus in February. The UK government’s early work to resurrect their Non-Financial Reporting Review, which kicked off in May. And in late June (finally, hurray!), the UK government’s consultation on adopting the ISSB standards, IFRS S1 and S2 (now known as UK SRS) – along with consultations on assurance and transition planning. I’m pleased to say that they’re encouraging us to ‘consider responses to these three consultations together’ and that our feedback ‘will help to develop an enhanced sustainable finance framework, including sustainability-related financial disclosures, that is fit for the future and maintains the UK’s position as a global leader in this area.’

But what we really need is not just a consultation on these three things together, but a consultation on everything to do with reporting together. While I’m delighted that all these bodies involved in determining how companies should report are alive to its challenges, and seem disposed to address them, the fact that they all come from different angles at the same time points to the real problem at the heart of it all: reporting requirements do not derive from one place, and no single body owns or governs them. Reporting has got into the lengthy mess it’s in precisely because so many organisations are involved, piling requirements onto companies with good intent but often misguidedly, because they start by asking ‘What do we want companies to do?’ instead of ‘How does this serve the purpose of reporting?’

This problem is not new. But it’s become critical because the sheer volume of information that companies must report today, on so many subjects, threatens to render the exercise meaningless – swamping the truthful (we hope) story told by management and the Board that sits at the heart of reporting. It’s therefore no surprise that many are hoping that generative AI will ride to the rescue, assuming we can mitigate its considerable risks.

And perhaps it can. But only if reporting itself is rethought first, starting from its purpose; and only if generative AI is introduced carefully, with proper training and guidelines in support of that purpose. We’ve already done the legwork on the introduction of AI with ‘Your Precocious Intern’, our research and recommendations on the responsible use of generative AI in reporting, in partnership with Insig AI.  

Here’s my view on what rethinking reporting itself could look like – a view I’m working on developing into a more detailed proposal to put to the government. But first, a caveat: it assumes a world in which humans still run companies, and are answerable to other human stakeholders.

Let’s start with the purpose of reporting. What is it? Many will say, ‘To meet regulatory requirements’. And yes, it must. But the more important question is, what is the purpose of those requirements? Why are companies required to report at all? My definition, which seems to resonate with everyone I discuss this with, is: ‘To build a relationship of trust with investors and other stakeholders through truthful, accurate, clear reporting that people believe because it tells an honest, engaging story.’

Reporting serves this purpose by providing two types of information:

  1. Accurate data and disclosures in accordance with reporting requirements

  2. A truthful story; namely, the opinion of management and the Board as to the meaning of those disclosures for the company and its future prospects.

When I started in reporting nearly 25 years ago, reports were so much shorter that when it came to writing them, this distinction didn’t really matter. The ‘front half’ – everything except the financial statements – could be written pretty much as a single story that incorporated all the other necessary disclosures, without clarity or meaning being obscured.

Now that reports are so long, this has become impossible. A single narrative cannot contain everything required without being bent unreadably out of shape. So we must make the distinction much clearer – not just between financial disclosures and the story, but between all disclosures and the story. A growing number of companies, particularly FTSE 100s, are already doing their best by creating a ‘disclosure statements’ section at the end of the strategic report, or an ‘additional information’ section right at the back.

So what should generative AI be doing in this disclosures-plus-story approach to reporting? The biggest benefit it can offer reporters is in the heavy lifting of reams of information; while its biggest risk, highlighted by investors in our research, is that companies start using Copilot or chatbots (the types most likely to be used in report writing) to produce opinion. If used in this way, reporting would no longer give real insight into the minds of management and the Board. Signing off an opinion written by AI would not make it their own opinion, even though it would still make them accountable for it.

My proposal for rethinking reporting for the age of AI, which the keen-eyed amongst you may remember from my response to the FRC’s digital reporting consultation, is simple:

  • Codify the purpose of reporting: It’s essential that we do codify the purpose, because without it, there is no means of judging how effective any changes are likely to be.

  • Mandate two parts to the annual report:

    ‘Our disclosures’: A set of disclosures covering everything that is material to the business, provided in a single, structured statement, subdivided by type (financial statements, governance statements, environmental statements, remuneration statements and so on). Let generative AI do its best with that, as long as the resulting statement is checked and signed by humans.

    ‘Our opinion’: A fair, balanced and understandable narrative, authored by each of management and the Board, which gives their truthful opinion of what those disclosures mean for the company and its prospects. Companies should be allowed to produce this however they think best, provided it exists in a written form that can stand as a document of record against which they can be held to account. Generative AI has no place in creating this – opinion must be the preserve of humans. If companies do use AI in writing opinion, they should be required to disclose it.

  • Non-material disclosures to be published on the corporate website: Any other disclosures that the government wants companies to make should be published in a dedicated section of the corporate website, rather than in the annual report.

With generative AI offering considerable benefits while also threatening to undermine reporting altogether, and with so many reviews in train – particularly the Non-Financial Reporting Review – we have the perfect opportunity to rethink the whole endeavour for the benefit of companies and stakeholders alike. The UK SRS is breaking reporting out of its old finance-only mould, by establishing the principle of the connectivity of financial and other information. Now it’s time to break the other mould – the old reporting structures that prevent us from communicating effectively.