As AI adoption accelerates in Asia/Pacific, organisations need to focus on skill development as they progress on their AI adoption journeys.
According to Deepika Giri, head of research, Big Data & AI at IDC Asia/Pacific, 2025 will see the rise of smaller, more efficient models, driving cost optimisation and improved performance.
She says the commoditisation of AI infrastructure will further lower development costs, while the integration of traditional AI/ML models with large language models (LLMs) will pave the way for intelligent agents to dominate, unlocking even greater capabilities.
According to a recent report by the market intelligence and advisory services provider, 80% of foundation models used for production-grade use cases will include multimodal AI capabilities by 2028, delivering improved use case support, accuracy, depth of insights, and inter-mode context.
This shift, according to IDC, underscores a broader trend toward more sophisticated AI systems that can process and integrate multiple data types—text, images, audio, and video—enabling enterprises to unlock new levels of intelligence and automation.
Other key findings of the IDC FutureScape: Worldwide AI and Automation 2025 Predictions — Asia/Pacific (Excluding Japan) Implications report include the following:
Small Models: By 2026, 90% of Asia-based top 1000 (A1000) enterprises' LLM use cases will be dedicated to training small language models (SLMs) because of cost, performance and expanded deployment options.
AI Adoption Inflection Point: By 2027, AI adoption barriers will fade as the commoditisation of AI infrastructure, advanced low-code/no-code (LC/NC) tools and security frameworks reduce AI build costs by nearly 70%.
Dawn of Enterprise AI Agents: By 2025, 60% of A1000 organisations will use enterprise agents configured for specific business functions, instead of focusing on individual co-pilot technologies, to achieve faster business value from AI.
Hybrid Cloud Inferencing: By 2026, 70% of A1000 enterprises will adopt hybrid edge-cloud inferencing as organisations fully integrate edge into their cloud infrastructure and management strategies.