Google Cloud and Intel have announced a significant expansion of their multi-year partnership, focusing on artificial intelligence infrastructure. Under the agreement, revealed on Thursday, Google Cloud will continue to utilise Intel's Xeon processors, and the two companies will co-develop new custom infrastructure processing units (IPUs).
The strategic move comes as the global technology industry faces surging demand for central processing units (CPUs). While graphics processing units (GPUs) dominate model training, CPUs remain essential for running AI models and the general infrastructure that supports them.
Expanding a Decades-Long Partnership
Under the new terms, Google Cloud will deploy Intel's latest Xeon 6 processors for AI, cloud, and inference tasks, continuing a relationship that has seen Google use various Intel Xeon chips for decades. Intel chief executive Lip-Bu Tan stated, "AI is reshaping how infrastructure is built and scaled. Scaling AI requires more than accelerators — it requires balanced systems. CPUs and IPUs are central to delivering the performance, efficiency and flexibility modern AI workloads demand."
The partnership will also advance the co-development of custom ASIC-based IPUs, a collaboration that began in 2021. These specialised chips accelerate and manage data centre workloads by offloading infrastructure tasks from the main CPUs, improving overall system efficiency.
Industry-Wide CPU Demand
The announcement reflects a broader industry refocus on CPU technology. SoftBank-owned Arm Holdings recently unveiled its first in-house chip, the Arm AGI CPU, amid a reported worldwide shortage of these critical components. Intel declined to disclose the financial terms of the expanded deal with Google, but the collaboration underscores the competitive and strategic importance of advanced processor development in the AI era.
Box: What is an IPU? An infrastructure processing unit (IPU) is a programmable networking device that handles infrastructure tasks such as storage virtualisation, network security, and traffic routing. By taking on these functions, it frees the server's central CPU to focus on primary application workloads, boosting data centre performance.
Strategic Implications and Future Focus
The deepened alliance positions both companies to better compete in the rapidly evolving AI hardware market. The focus on custom IPU development suggests a long-term strategy to create differentiated, optimised solutions for Google's vast cloud infrastructure. As the AI boom continues to strain global semiconductor supply chains, such partnerships between cloud providers and chip manufacturers are becoming increasingly vital to ensure the necessary infrastructure can scale effectively.