Microsoft has begun deploying its home‑grown Maia 200 AI inference chip in its data centers while confirming it will continue purchasing GPUs from Nvidia and AMD. CEO Satya Nadella emphasized the company's ongoing partnerships and the need to stay ahead in AI hardware. The Maia 200, designed for high‑performance AI inference, will first be used by Microsoft's Superintelligence team as it develops frontier models, and will also support OpenAI models on Azure. The move reflects a hybrid strategy of building chips in‑house while continuing to rely on external suppliers.