Arm has introduced the AGI CPU, its first production silicon and a major step beyond its traditional business of licensing chip designs: the company now plans to sell complete processors built for modern AI data centers. The new chip focuses on agentic AI workloads, where systems run continuously and require CPUs to manage scheduling, memory, storage, and data movement across large accelerator clusters.
Arm says the AGI CPU packs up to 136 Neoverse V3 cores and delivers 6 GB/s of memory bandwidth per core with sub-100 ns latency, all within a 300 W TDP, and dedicates one core to each program thread. That one-to-one mapping helps maintain consistent performance under heavy workloads without throttling, which matters when AI systems run non-stop at scale.
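Taking Arm's quoted per-core figures at face value, a quick back-of-the-envelope sketch shows what they imply at the package level. The aggregate bandwidth and per-core power budget below are simple derivations for illustration, not numbers Arm has published:

```python
# Back-of-the-envelope totals from Arm's quoted per-core AGI CPU figures.
# The aggregate numbers are derived here for illustration; Arm itself
# quotes only the per-core bandwidth and the package TDP.

CORES = 136              # Neoverse V3 cores per AGI CPU (Arm's figure)
BW_PER_CORE_GBS = 6      # memory bandwidth per core, GB/s (Arm's figure)
TDP_W = 300              # package TDP in watts (Arm's figure)

aggregate_bw_gbs = CORES * BW_PER_CORE_GBS   # total memory bandwidth
watts_per_core = TDP_W / CORES               # power budget per core

print(f"Aggregate memory bandwidth: {aggregate_bw_gbs} GB/s")  # 816 GB/s
print(f"Power budget per core:      {watts_per_core:.2f} W")   # 2.21 W
```

The roughly 2 W available per core is consistent with the article's point about sustained, throttle-free operation: each core runs well below the power levels that force clock-speed reductions.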
The AGI CPU targets what Arm calls agentic AI infrastructure, where software agents handle tasks in real time without waiting on human input, making the CPU central to keeping thousands of processes coordinated. Arm designed the chip to deliver sustained performance in dense server environments, where efficiency and cooling limits define how much compute you can pack into a rack.
The company’s reference design includes a dual-node server with 272 cores per blade, scaling up to 8,160 cores in an air-cooled rack, while liquid-cooled setups can go beyond 45,000 cores per rack. Arm claims this setup delivers more than double the performance per rack compared to current x86 systems, based on its own internal benchmarks.
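The rack figures above follow directly from the per-blade core count. The blade counts in this sketch are derived for illustration; Arm quotes only the core totals:

```python
# Rack-density arithmetic from Arm's reference-design figures.
# Blade counts per rack are derived here; Arm publishes only the
# per-blade and per-rack core totals.

CORES_PER_BLADE = 272              # dual-node blade, 136 cores per node
AIR_COOLED_RACK_CORES = 8_160      # Arm's air-cooled rack figure
LIQUID_COOLED_RACK_CORES = 45_000  # "beyond 45,000" with liquid cooling

air_blades = AIR_COOLED_RACK_CORES // CORES_PER_BLADE
liquid_blades = -(-LIQUID_COOLED_RACK_CORES // CORES_PER_BLADE)  # ceiling div

print(f"Air-cooled rack:    {air_blades} blades")      # 30 blades
print(f"Liquid-cooled rack: {liquid_blades}+ blades")  # 166+ blades
```

In other words, the air-cooled configuration works out to 30 blades per rack, while the liquid-cooled figure implies more than five times that blade density in the same footprint.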
Meta is the lead partner and co-developer, and the AGI CPU will work alongside Meta’s MTIA accelerators. Arm also confirmed partnerships with companies including Cerebras, Cloudflare, F5, OpenAI, SAP, and SK Telecom, covering AI orchestration, APIs, and enterprise workloads.
Commercial systems are already available from ASRock Rack, Lenovo, and Supermicro, while broader rollout is expected in the second half of 2026. The chip will use a 3nm process with support from major ecosystem players such as AWS, Google, Microsoft, NVIDIA, Samsung, SK hynix, and TSMC.