The promise is seductive: Artificial intelligence will liberate finance professionals from mundane tasks, creating space for strategic thinking that drives business value.
"Any repeatable process that you have within finance automated should free up time [for FP&A professionals] to focus more on strategic thinking," says Justin Barch, managing director of revenue and growth at the Association of Financial Professionals (AFP). "We don't believe that it's going to take jobs away. It's just going to change jobs."
Yet beneath this optimistic veneer lies a more complex reality. At a recent FutureCFO roundtable of senior FP&A professionals across Asia-Pacific, co-organised with AFP, the conversation revealed a stark disconnect between AI's theoretical potential and its practical implementation challenges. While the technology promises to transform financial planning and analysis, practitioners are grappling with fundamental obstacles that threaten to undermine the strategic value proposition of AI.
The data integrity paradox
The most pervasive challenge isn't technological but foundational. One delegate acknowledged significant challenges around data systems: "There is a view that with more data, AI will actually be able to help. So we will need to go through the difficulty of resolving all our data challenges first."
This data integrity paradox creates a circular problem: AI needs clean, reliable data to generate valuable insights, but organisations struggle with fragmented systems and inconsistent data quality—the very issues AI is supposed to address.
A delegate from the healthcare sector commented: "From a finance perspective, we need to talk about data reliability. How do we merge data from across platforms? There will be some challenges."
The financial services sector faces additional complexity. A delegate at the roundtable notes, "The challenge is the reputation of systems — also putting in data that is subject to regulatory action." The regulatory dimension adds layers of compliance requirements, making organisations more cautious and deliberate about AI implementation.
The logistics industry presents similar fragmentation issues. One delegate describes a fundamental shift in analytical approach: unlike before, when everything had to be checked manually to build a solution, AI can now provide comprehensive information. However, the team still faces the challenge of AI occasionally delivering inaccurate information or misinterpreting requests.
The quality-confidence gap
Perhaps more troubling than data challenges is the emerging quality-confidence gap. Organisations are discovering that current AI capabilities don't meet the analytical rigour required for high-stakes financial decisions.
Murtaza Rangwala, president & CFO at Olam Global Agri in the agri-commodity sector, articulates this concern: "The challenge with AI lies in the limited availability of domain-specialised tools and the lack of proven concepts. In my view, the quality of output currently delivered by these sector-specific AI solutions does not yet match or surpass human capability.
"Given the high-value nature of our decisions, particularly around risk management and capital allocation, the precision and reliability of outputs from FP&A systems are absolutely critical."
This quality gap extends beyond simple accuracy to encompass analytical sophistication. Another delegate observed that current models "are still at early stages in terms of numerical ability", able to handle basic tasks such as prediction but little beyond that.
For FP&A professionals accustomed to complex scenario modelling, variance analysis, and multi-dimensional forecasting, these limitations represent a significant barrier to adoption.
The challenge becomes more pronounced when considering the iterative nature of AI interactions.
Based on personal experience, one delegate notes that AI can make mistakes: if parameters are not specified clearly, AI may provide incorrect answers, corroborating an earlier comment from another delegate.
The expectation-reality mismatch
The roundtable revealed a troubling expectation-reality mismatch that could undermine AI's long-term adoption in FP&A. Whilst senior executives often view AI as a panacea for efficiency challenges, practitioners are discovering that implementation creates additional workload rather than reducing it.
Karl Davies, managing director of H&BA Solutions, identifies a particularly problematic dynamic: "An issue arises when employees outside the region who use AI become immediately experienced and knowledgeable around alleged facts in Asia, for example, assumptions around new tax credits being in place, and then use that false information to drive their behaviour.
"The second challenge comes from non-accountants trying to use the information to become accountants — again, to create more work for the finance community."
This phenomenon highlights a critical risk: the democratisation of AI without domain expertise can create dangerous knowledge illusions. When non-finance professionals gain access to AI-powered financial analysis tools, they may make decisions based on flawed assumptions or an incomplete understanding of financial principles.
Another delegate echoes this concern and questions the reliability of the reports generated and whether they would need to recheck them entirely. They emphasise the necessity for some level of standardisation across platforms to ensure that the data produced is authentic.
Strategic positioning challenges
The conversation revealed that many organisations are struggling to position AI strategically within their financial planning and analysis (FP&A) functions. Rather than transforming analytical capabilities, AI is often being deployed tactically for specific use cases without consideration of broader strategic implications.
One delegate expressed a desire to adopt AI as they implement a new ERP solution. However, they noted that their consultant described AI as a black box, raising concerns about whether there is sufficient data to ingest and support a decision-making process.
This "black box" problem is particularly acute for FP&A professionals who need to understand the logic behind financial recommendations. Unlike other business functions where directional accuracy suffices, financial planning requires transparency in analytical reasoning to support decision-making and regulatory compliance.
One delegate described their organisation's approach: "We've created a centralised data warehouse, so that makes it a single source of truth, so at least we don't have different teams reporting different data for the same issues." However, even with centralised data, the challenge of AI transparency remains.
The human-AI collaboration framework
Despite these challenges, leading practitioners are developing frameworks for human-AI collaboration that preserve analytical rigour whilst leveraging AI capabilities. Grace Zhao, head of FP&A at a global procurement organisation, outlines a structured, phased approach to procurement analytics:
"We are trying to integrate the spend data from all regions and embark on the journey utilising AI after we sort out the main issues, to put parameters to ask the system to help us identify or put the spend into different buckets."
This approach — using AI for categorisation and pattern recognition whilst maintaining human oversight for strategic interpretation — represents a promising middle ground. It acknowledges AI's current limitations whilst capitalising on its strength in processing large datasets.
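To make this division of labour concrete, here is a minimal, hypothetical sketch in Python: the machine does a first pass of bucketing spend lines using simple keyword rules, and anything it cannot classify is routed to a human analyst. The bucket names, keywords, and data fields are illustrative assumptions, not details from the roundtable or from Zhao's actual system.

```python
# Hypothetical sketch: machine-assisted spend categorisation with human oversight.
# Bucket names, keywords, and fields are illustrative only.

SPEND_BUCKETS = {
    "IT & Software": ["licence", "saas", "cloud", "software"],
    "Logistics": ["freight", "shipping", "warehouse"],
    "Professional Services": ["consulting", "audit", "legal"],
}

def categorise(description: str) -> str | None:
    """Return a bucket if a keyword matches, otherwise None (needs human review)."""
    text = description.lower()
    for bucket, keywords in SPEND_BUCKETS.items():
        if any(keyword in text for keyword in keywords):
            return bucket
    return None

def triage(spend_lines: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split spend lines into auto-categorised items and items flagged for review."""
    auto, review = [], []
    for line in spend_lines:
        bucket = categorise(line["description"])
        if bucket:
            auto.append({**line, "bucket": bucket})
        else:
            review.append(line)  # a human analyst decides these
    return auto, review

if __name__ == "__main__":
    sample = [
        {"id": 1, "description": "Annual SaaS licence renewal", "amount": 12_000},
        {"id": 2, "description": "Ocean freight, Q3", "amount": 48_500},
        {"id": 3, "description": "Misc. regional expense", "amount": 3_200},
    ]
    auto, review = triage(sample)
    print(f"Auto-categorised: {len(auto)}, flagged for human review: {len(review)}")
```

The point of the sketch is the split itself: the system commits only to categorisations it can justify by rule, and everything ambiguous stays with the analyst, mirroring the "AI for pattern recognition, humans for interpretation" middle ground described above.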
On the topic of change management, one delegate shared that their approach is to employ various validation techniques, collaborate with different departments, and seek validations from multiple areas. They even go so far as to obtain information from third parties, such as vendors, to ensure that the data they rely on is accurate and reliable.
The skill evolution imperative
The roundtable highlighted an urgent need for skill evolution within FP&A teams. Traditional financial analysis skills must be augmented with prompt engineering, data literacy, and AI interaction capabilities.
Mieo Teng Tan, FP&A manager, describes their current experience:
"All finance members in our company have access to a premium version of Copilot. It helps in automating processes and gives very structured reports, but sometimes when I generate tables using Copilot, there are still errors that require high-level checks."
This observation underscores a critical point: AI doesn't eliminate the need for financial expertise. Instead, it changes how that expertise is applied. FP&A professionals must develop expertise in validating AI outputs, understanding model limitations, and translating AI insights into a business context.
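As an illustration of how such "high-level checks" on AI-generated tables can be partially automated, the sketch below (hypothetical, in Python) reconciles line items against a reported total and flags outsized variances for human follow-up. The column names, tolerance, and variance threshold are assumptions made for the example, not the delegate's actual workflow.

```python
# Hypothetical sketch: automated sanity checks on an AI-generated summary table
# before it reaches reviewers. Column names and thresholds are illustrative only.

def validate_summary(rows: list[dict], reported_total: float,
                     tolerance: float = 0.01) -> list[str]:
    """Return a list of issues found in an AI-generated table; empty means it passed."""
    issues = []

    # 1. Required fields present on every row.
    for i, row in enumerate(rows, start=1):
        for field in ("cost_centre", "actual", "budget"):
            if field not in row:
                issues.append(f"Row {i}: missing field '{field}'")

    # 2. Do the line items reconcile to the reported total?
    computed_total = sum(row.get("actual", 0.0) for row in rows)
    if abs(computed_total - reported_total) > tolerance:
        issues.append(
            f"Actuals sum to {computed_total:,.2f} but table reports {reported_total:,.2f}"
        )

    # 3. Flag implausible variances for human follow-up rather than auto-correcting.
    for i, row in enumerate(rows, start=1):
        budget = row.get("budget", 0.0)
        if budget and abs(row.get("actual", 0.0) - budget) / abs(budget) > 0.5:
            issues.append(f"Row {i}: variance above 50% of budget, needs review")

    return issues

if __name__ == "__main__":
    table = [
        {"cost_centre": "Sales", "actual": 120_000.0, "budget": 100_000.0},
        {"cost_centre": "Ops", "actual": 95_000.0, "budget": 90_000.0},
    ]
    for issue in validate_summary(table, reported_total=210_000.0):
        print("CHECK:", issue)
```

Checks like these do not replace the analyst's judgement; they simply ensure that obvious reconciliation errors are caught mechanically so human attention goes to interpreting the numbers rather than re-adding them.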
Freddie Koh, head of finance at Viatris, articulates a holistic change management approach: "Our finance team thinks of AI in three interconnected parts. First is the storage of data for AI use, second is evaluating the AI capabilities and maturity suitable for an organisation's strategic deployment, and third is the readiness of our teams, which is the most critical enabler for AI adoption.
"Community of practice can be formed within the organisation to nurture the early AI adopters starting from the smaller individual productivity projects and improve their AI proficiencies in preparation for the enterprise level projects. All three parts need to be working in unison to fully realise the benefits of any AI investment."
The investment dilemma
The discussion revealed significant confusion regarding the justification for AI investment. Unlike traditional technology investments with clear ROI metrics, the value proposition of AI for FP&A remains nebulous.
One delegate expressed that they would need to be convinced before assembling a business case for proposing AI solutions. They explained that the models they have refined over time undergo continuous improvement whenever challenges or data inaccuracies surface, and that this dynamic model already provides them and the executive leadership team with the insights needed for decision-making.
This pragmatic perspective highlights a crucial consideration: many organisations have already developed sophisticated analytical frameworks that deliver business value. The burden of proof for AI lies not just in demonstrating capability but in proving superior value compared to existing approaches.
Meeting the new strategic imperative
Despite current limitations, the consensus among practitioners is clear: AI adoption is inevitable, and organisations that fail to develop AI capabilities risk a competitive disadvantage. Barch notes that "just on automation, you can get 30% of your workday back through automating tasks" while acknowledging that "we are at the beginning of the learning."
The key insight from this practitioner discussion is that AI's strategic value in FP&A won't come from replacing human judgment but from augmenting analytical capabilities and enabling finance professionals to focus on higher-value strategic activities. However, realising this potential requires addressing fundamental challenges around data quality, analytical transparency, and organisational change management.
"Think of companies like Google and Microsoft, who have huge consumer businesses," chimed in a delegate. "The speed at which change is taking place shows many reasoning models launched just weeks ago that are more advanced with numbers, showing logic and creating basic predictive models while explaining how it's done."
Decoding the FP&A-AI future
The path to deepening FP&A's strategic value with AI is neither straightforward nor guaranteed. Current implementations reveal significant gaps between promise and practice, ranging from data integrity challenges to quality and confidence issues. Yet the potential remains compelling: AI that truly augments human analytical capabilities could transform FP&A from a supporting function to a strategic driver of business value.
Success will require organisations to move beyond tactical AI deployments toward comprehensive transformation of their analytical capabilities, skill sets, and organisational structures. As Barch concludes:
"[AI] is not going away. It's only going to accelerate. And if it's really done well, it should give you more time to focus on the business and get past the transactional stuff."
The question isn't whether AI will transform FP&A. It's whether FP&A professionals will successfully navigate the transformation to emerge as more strategic business partners. Those who don't risk watching from the sidelines as AI transforms the FP&A landscape they once commanded.
Delegates to the FutureCFO-Association of Financial Professionals roundtable discussion on the topic of "Deepening FP&A's Strategic Value with AI"