New Delhi [India], April 22 (ANI): The next phase of artificial intelligence is no longer just about faster GPUs. As AI moves from single-task generation to autonomous, multi-step “agentic” systems, the economic value is migrating to the broader infrastructure stack, with CPUs and memory emerging as the new bottlenecks, Morgan Stanley said in its latest research report.
The shift marks a structural pivot from raw compute to orchestration. While GPU demand remains robust, each model call now requires more coordination, persistent memory and system-level processing, widening the AI spend pool beyond accelerators. “Agentic AI widens the trade beyond GPUs, with CPUs becoming the control plane for multi-step workflows and system orchestration,” the report noted.
Morgan Stanley quantifies the opportunity through a new framework that sizes the incremental compute and memory demand. CPU-side orchestration can account for 50-90% of total workload latency in agentic systems, materially raising general-purpose compute intensity. The brokerage estimates $32.5-60 billion of incremental CPU total addressable market (TAM) by 2030, within a total server CPU TAM of more than $100 billion. On the memory side, agentic workloads could drive 15-45 exabytes of additional DRAM demand by 2030, equivalent to 26-77% of 2027 annual DRAM supply.
The architecture of AI clusters is also changing. Agentic workloads rely on CPU-centric or hybrid designs to manage reasoning, tool execution and memory orchestration, pushing the CPU-to-GPU ratio higher at the cluster level. Memory is no longer passive storage but an active system component supporting persistent context and continuous learning. This drives content growth across CPUs, DRAM and the broader infrastructure stack, including foundry capacity, ABF substrates and interconnect layers.
The investment implications are broad and global. “Supply-constrained enablers (foundry, ABF, BMC, interconnect) should capture outsized economics as system complexity rises,” Morgan Stanley said. Beneficiaries span the full stack — CPU vendors, memory suppliers, storage companies, advanced packaging and substrate providers, foundries, equipment makers and server manufacturers. In short, the AI trade is expanding beyond the accelerator to anyone enabling the system that makes intelligent agents work.
The brokerage expects the agentic AI cycle to redefine infrastructure priorities over the next five years. As enterprises deploy agents that can plan, reason and interact with external tools, data center architectures will need to be re-optimized for coordination rather than just peak compute. That puts greater emphasis on low-latency interconnects, high-bandwidth memory and resilient foundry supply chains, areas that are already supply-constrained and likely to command premium pricing.
The outlook also suggests a rebalancing within the semiconductor industry. While Nvidia and other GPU leaders remain central, the incremental revenue pool is shifting toward CPU incumbents and memory suppliers. Morgan Stanley sees orchestration CPUs carving out an $82.5-110 billion data center market by 2030, with agentic workloads contributing the bulk of incremental growth.
For investors, the report underscores that owning the best accelerator is no longer sufficient. “The beneficiaries of this shift are global and full-stack,” it said, pointing to a wider and more diversified AI investment landscape. As agentic AI scales, the winners will be those that can deliver end-to-end system efficiency rather than point solutions.
The report noted that the transition from generative to agentic AI is still in its early stages, but the infrastructure buildout required will be more complex and capital-intensive than the first wave. If execution keeps pace, the next five years could see CPUs and memory rival GPUs as the primary drivers of AI-related semiconductor revenue growth. (ANI)