As we reach the midpoint of 2025, one thing remains clear: disruption hasn’t slowed; it’s accelerating. Organizations that anticipated the pace of change earlier this year are already reaping the rewards. But for those still catching up, the time to accelerate your efforts is now.
At Futurum Research, we aim to help leaders decode complexity, navigate volatility, and turn emerging trends into tangible advantages. In the first half of this year, we’ve seen our predictions begin to play out in real time – from the rise of agentic AI transforming digital labor to cloud marketplaces redefining go-to-market strategies, and AI-powered PCs reshaping workplace productivity.
The next six months will demand more agility, boldness, and strategic clarity. Business models across many industries are being tested, customer expectations continue to climb, and the pressure to align technology, talent, and transformation has never been greater.
This isn’t just about seeing the future; it’s about looking for the Signals to help you build a sustainable competitive advantage.
I encourage you to revisit the updated predictions outlined in this report and ask:
What’s shifted, and how will that impact my current growth strategy? What’s emerging faster than expected, and what impact is that having? And what investments are showing tangible results?
As we move into the second half of the year, now is the time to recommit to growth, innovation, and building experiences that elevate both customer success and employee engagement.
Here’s to shaping what’s next, together.

Chief Strategy and Research Officer
The Futurum Group
By the end of 2025, the impact of agentic AI on enterprise software will be marked by strategic adoption rather than indiscriminate, widespread deployment. At least 15-20% of routine tasks within specific corporate functions – notably HR and customer service – will be initiated and managed by AI agents, with software firms leading the way in adoption. This will catalyze a significant evolution in software licensing, with a clear trend toward outcome-based and consumption-driven pricing models over traditional per-seat licenses.
“The conversation for CIOs has shifted from ‘if’ to ‘how.’ In 2025, we see a strategic imperative to build an agent-based AI strategy. The focus is less on the novelty of a single agent performing a task and more on creating a cohesive fabric of specialized agents that can automate entire business functions. This is the year enterprise agentic AI moves from the lab to the core of strategic IT planning.”

Vice President & Practice Lead
AI Platforms
The acceleration of this trend has become more nuanced and is now driven by three evolved factors:
As we reach the halfway mark for 2025, our initial prediction about increased efforts around agentic AI in cybersecurity was accurate: we saw and continue to see significant activity around agentic AI in security, both in terms of “securing agentic” and “using agentic for security.” We expect increased adoption of agentic workflows in security operations teams, particularly as large vendors fine-tune their agentic offerings.
“The broad evolution of and interest in agentic AI is extremely important to cybersecurity teams. As a new technology, the adoption of agentic AI across the organization means that security teams must quickly understand the technology, analyze the impact it may have on security posture, determine how to secure it, and implement these changes while supporting innovation and experimentation. That same technology, however, can potentially be a boon to security teams themselves, as they use it selectively to assist with well-defined security tasks.”

Vice President & Practice Lead
Cybersecurity & Resilience
By the end of 2025, the primary bottleneck for scaling enterprise AI will shift from model development to data readiness. In response, most enterprises will focus on deploying an “AI-Powered Data Control Plane” to automate data discovery, preparation, and governance, making it the most critical investment for unlocking value from AI workloads.
“We’re seeing a critical pivot from experimenting with AI to industrializing it, and this has exposed a massive gap in data readiness. The winning strategy in 2025 won’t be about building or integrating the best AI models but creating the best data to feed those models at scale and without risk.”

Vice President & Practice Lead, Data Intelligence, Analytics, & Infrastructure
Emerging enterprise concerns over data quality and the maturation of data engineering platform capabilities drive this trend.
Enterprise IT in 2025 is entering a period of profound reinvention as CIOs push AI beyond isolated functions, wrestle with the immature foundations of agentic AI, and escalate efforts to counter fast-approaching quantum-era security threats. Futurum’s Q2 2025 CIO Insights survey reveals 89% of IT leaders driving AI for strategic transformation, 80% elevating quantum-resilient security to board-level priority, and 71% rethinking optimal cloud environments under intensifying AI workload pressures.
“As AI becomes integral to business strategy, CIOs are being forced to reconsider how and where it’s optimal to deploy compute resources. The need for low latency, cost efficiency, and compliance in AI applications is driving a rapid shift toward hybrid and multi-cloud strategies. For IT leaders, this means 2025 will be a pivotal year for a comprehensive realignment of their infrastructure with the realities of the AI era.”

Vice President & Practice Lead
Digital Leadership & CIO
Three primary reasons are driving this change.
Cloud marketplaces will become as significant a go-to-market (GTM) channel for Independent Software Vendors (ISVs) as traditional distribution is for commercial hardware. Over $300 billion in committed cloud spending will continue to fuel this engine.
“Every vendor is trying to figure out their marketplace strategy, which ones to prioritize, how to operationalize it, and how to bring traditional partners on that journey. The most successful vendors in the marketplace will be the ones that understand how to include service delivery partners as part of their marketplace strategy.”

Vice President & Practice Lead
Ecosystems, Channels, & Marketplaces
Generative AI-powered features will enter widespread use in 2025, thereby requiring significant shifts in pricing models, with seat-license models being supplanted by consumption-based and outcome-based approaches. A 2024 Futurum Intelligence survey of 895 decision-makers and influencers found that 40% of respondents were paying for software on a consumption-based pricing model, and 15% were using an outcome-based model.
“As vendors continue to roll out new and enhanced versions of AI agents, consumption-, interaction-, and outcome-based pricing models are quickly becoming the most common approaches for linking the benefits of AI with the cost of the resource. This will be increasingly important to CEOs who need to justify their investment in AI, particularly agentic AI systems. However, vendors need to ensure that any pricing model they deploy – and any ROI promises they make – clearly spell out all ancillary costs and restrictions, so customers can accurately assess whether the pricing model works for their business and use cases.”

Research Director & Practice Lead
Enterprise Software & Digital Workflows
Three primary reasons are driving this change.
As AI agents proliferate, we expect a strong shift to outcome-based pricing models in 2025, as CEOs prioritize tangible, visible ROI from their AI investments. Outcome-based pricing ensures that customers do not pay for software that fails to deliver promised results, in contrast to a consumption-based model, which carries no ROI guarantee.
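The difference between these models can be sketched with a simple cost comparison. All rates and volumes below are hypothetical, chosen only to illustrate how each model ties price to a different driver; they are not figures from any vendor or from this report's survey data.

```python
# Hypothetical comparison of per-seat, consumption-based, and outcome-based
# software pricing. Every rate and volume here is an illustrative assumption.

def per_seat_cost(seats: int, price_per_seat: float) -> float:
    """Traditional licensing: cost scales with headcount, not with usage."""
    return seats * price_per_seat

def consumption_cost(units_consumed: int, price_per_unit: float) -> float:
    """Consumption-based: cost scales with usage (e.g., agent interactions),
    regardless of whether those interactions produced value."""
    return units_consumed * price_per_unit

def outcome_cost(outcomes_delivered: int, price_per_outcome: float) -> float:
    """Outcome-based: the customer pays only for verified results
    (e.g., tickets actually resolved), a de facto ROI guarantee."""
    return outcomes_delivered * price_per_outcome

# Illustrative month: 100 seats, 50,000 agent interactions,
# 8,000 tickets actually resolved by the AI agent.
print(per_seat_cost(100, 50.0))        # 5000.0
print(consumption_cost(50_000, 0.02))  # 1000.0
print(outcome_cost(8_000, 0.40))       # 3200.0
```

Under these made-up numbers the models rank differently depending on how much of the consumption converts into outcomes, which is exactly the assessment the quote above asks customers to make.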
AI-capable PCs (PCs equipped with an NPU and capable of running some AI training and inference workloads locally) will come to represent at least 40% of new PC shipments by the end of 2025.
“The AI PC is, first and foremost, a radically better PC than its pre-AI predecessors. It is tangibly faster, more powerful, more capable, and more useful. The all-day battery life alone is such a radical system improvement that even without its AI capabilities, it would be worth the upgrade. Perhaps more importantly for the long term, the AI PC also lays the necessary foundation for the next generation of software experiences, which will be dominated by agentic AI. As agentic AI begins to insert itself into every application – from search, system management, and security to productivity and creativity software – users in both the consumer and commercial segments will need PCs designed to securely handle agentic AI workloads, both in the cloud and locally, to take full advantage of the coming disruption and opportunity.”

Research Director & Practice Lead
Intelligent Devices
Three primary reasons are driving this change.
As AI-capable PCs are an evolution of pre-AI PCs, all previous use cases for PCs still apply. However, new use cases have already begun and will continue to emerge.
Moving Some AI Processing from the Cloud to Devices. Expanding the reach of AI beyond the data center, AI PCs will accelerate the migration of AI workloads from the cloud to AI-enabled devices as large language models and large multimodal models become more efficient and AI PC systems become more capable. Many of the large language models trained in the cloud a year ago can already be trained directly on-device today. As that trend continues, organizations will increasingly be able to train, test, and fine-tune many of these models securely, onsite, and at a fraction of the cost they would otherwise incur. Additionally, AI PCs allow pre-trained models to be quickly and securely customized locally rather than in the cloud.
Agentic AI in the PC. As agentic AI begins to transform the way users interact with apps and software, AI-capable PCs will be uniquely positioned to deliver secure, local, highly individualized on-device agentic AI experiences alongside more general-purpose cloud-based agentic AI experiences. Use case examples range from AI agents drafting email responses, managing calendars, and performing complex searches in seconds to reducing the time it takes to design a presentation, report, or proposal from hours to minutes.
All-Day & Multi-Day Battery Life. PCs capable of delivering all-day and multi-day battery life even in thin-and-light form factors will also transform the way users work and play with their PCs, not only in hybrid and remote work scenarios but at the office as well, with notebook PCs becoming far easier to carry between meetings.
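A rough back-of-envelope calculation helps explain why local inference and customization are becoming feasible on AI PCs. The parameter count, byte widths, and memory figures below are illustrative assumptions, not measurements of any specific model or machine.

```python
# Back-of-envelope estimate of the memory footprint of hosting an LLM
# locally on an AI PC. All figures are illustrative assumptions.

def model_memory_gb(n_params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight-storage requirement in gigabytes (GB = 1e9 bytes)."""
    return n_params_billion * 1e9 * bytes_per_param / 1e9

# A hypothetical 7B-parameter model at different precisions:
print(model_memory_gb(7, 2.0))  # fp16: 14.0 GB -> workstation-GPU territory
print(model_memory_gb(7, 0.5))  # 4-bit quantized: 3.5 GB -> plausibly fits
                                # within a typical AI PC's system memory
```

The point of the sketch is the 4x gap: quantization, together with more capable NPUs, is what moves mid-size models from cloud-only to on-device.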
In 2H 2025, advanced packaging (AP) and HBM capacity are expected to remain critical bottlenecks for global compute deployment, driven by accelerating AI inference workloads and the growing adoption of evolving LLMs with multimodal capabilities. The technology advancement of AP and HBM and their adoption should be closely watched, as the two will underpin the future development of semiconductors used in AI servers and smartphones.
As Moore’s Law has slowed, leading chipmakers are leveraging various advanced packaging technologies to further improve the overall performance of semiconductors, making advanced packaging an increasingly vital technology frontier.
Memory Is the Bedrock for AI Accelerators: Memory bandwidth is vital because model training is often bandwidth-constrained rather than purely compute-constrained. The attention mechanism in the transformer model has to store and compute the relationships between all tokens, so memory requirements grow quadratically with sequence length. Memory is an even bigger constraint during inference, due to the need to handle longer context windows and an enlarged key-value cache (KV cache) in the transformer model; KV cache memory consumption grows linearly with the number of tokens. To that end, HBM has become the essential component for AI, offering higher data-transfer speeds and lower power consumption than traditional DRAM products.
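The two scaling behaviors described above can be sketched directly. The model dimensions below (layer count, hidden size, head count, 2-byte fp16 elements) are illustrative assumptions, not the configuration of any particular model.

```python
# Sketch of the two memory-scaling regimes in a transformer.
# Attention scores form an L x L matrix per head (quadratic in sequence
# length L); the KV cache stores keys and values per token per layer
# (linear in L). All model dimensions here are illustrative.

def attention_score_bytes(seq_len: int, n_heads: int,
                          bytes_per_el: int = 2) -> int:
    """Memory for the full attention-score matrices: quadratic in seq_len."""
    return n_heads * seq_len * seq_len * bytes_per_el

def kv_cache_bytes(seq_len: int, n_layers: int, hidden_dim: int,
                   bytes_per_el: int = 2) -> int:
    """Memory for cached keys and values at inference: linear in seq_len."""
    return 2 * n_layers * seq_len * hidden_dim * bytes_per_el  # 2 = K and V

# Doubling the context doubles the KV cache but quadruples attention scores:
assert kv_cache_bytes(8192, 32, 4096) == 2 * kv_cache_bytes(4096, 32, 4096)
assert attention_score_bytes(8192, 32) == 4 * attention_score_bytes(4096, 32)
```

Either way, longer context windows translate directly into more bytes that must move at high speed, which is why HBM bandwidth, rather than raw FLOPS, is so often the binding constraint.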
Advanced Packaging Pushes Beyond Moore’s Law: Advanced packaging has become an essential technology in the AI hardware supply chain, especially as Moore’s Law has slowed down in recent years. Chipmakers are turning to advanced packaging as a new solution to sustain performance improvements. By integrating the compute die, memory, and packaging substrate more closely, advanced packaging enables better power efficiency, higher performance, and faster data transfer between components. This is an ongoing industry shift and a key trend to follow.
Supply Constraint Remains in AI Compute: Compute demand has surged since the introduction of ChatGPT in late 2022, and the acceleration of AI inference workloads has driven a second wave of growth. Today, we assess that the global compute landscape remains supply-constrained, driven not only by extraordinary global demand for AI chips but also by structural bottlenecks across the supply chain. These include the complexity and capacity constraints of advanced packaging and HBM, technical difficulties, and slower production ramp-ups in downstream assembly and integration. While compute supply-demand dynamics may ease over time, we believe supply constraints on compute deployment are likely to persist through 2H 2025.
Advanced Packaging (AP): Like HBM, advanced packaging technologies are indispensable for AI chips. Advanced packaging for AI accelerators, such as TSMC’s CoWoS-S, CoWoS-R, and CoWoS-L, plays a critical role not only in integrating the compute die, HBM, I/O die, and other necessary components into a single AI chip but also in further improving the performance of an AI accelerator. Demand for CoWoS-L has increased with NVIDIA’s transition from Hopper GPUs to Blackwell, which is supported by CoWoS-L technology instead of CoWoS-S. Another packaging technology to watch is WMCM (Wafer-Level Multi-Chip Module), expected to power Apple’s next-generation A20 SoC for the iPhone 18, which will come into play in 2026.
Co-Packaged Optics (CPO): CPO is an advanced form of heterogeneous integration that combines optical components and semiconductor devices within a single package, aiming to overcome the performance and power limitations of high-bandwidth data center applications. While the industry continues to debate the commercial viability and timing of CPO adoption, the growing need for high-speed, low-power interconnect solutions makes its eventual deployment increasingly likely. The earliest realistic timeline for CPO to enter the market is around late 2026 to 2027. One potential real-world application of CPO could be NVIDIA’s Rubin-series GPUs, expected in 2026, and the subsequent Feynman product line.
By the end of 2025, over 70% of software developers will use generative AI to augment development work, including code generation, completion, review, analysis, bug fixing, or unit tests. Of developers and DevOps engineers, 20% will use AI and open standards to automate tasks and basic workflows as the use of agentic AI expands to platform engineering.
“Generative AI is becoming as essential to the software development process as IDEs, testing, and DevOps automation tools. Engineers are learning the best uses of reasoning LLMs across a growing number of agentic, multi-step tasks. MCP servers extend the capabilities of development and automation tools vendors so they are now accessible by LLMs used in development.”

Vice President & Practice Lead
Software Lifecycle Engineering