Microsoft has unveiled its latest AI chip, Maia 200, which the company says delivers up to three times the performance of ...
Maia 200 packs 140+ billion transistors, 216 GB of HBM3E, and 272 MB of on-chip SRAM to tackle the efficiency crisis in real-time inference. Hyperscalers prioritize ...
Microsoft is not only the world’s biggest consumer of OpenAI models but also remains the largest partner providing compute, networking, and storage to ...
Microsoft’s Maia 200 AI chip highlights a growing shift towards a model of vertical integration where one company designs and ...
Microsoft's platforms may avoid DRAM supply issues, if this report is to be believed.
The Official Microsoft Blog on MSN
How Microsoft is empowering Frontier Transformation with Intelligence + Trust
At Microsoft Ignite in November, we introduced Frontier Transformation, a holistic reimagining of business that aligns AI with human ambition to help organizations achieve their highest aspirations and ...
In a blog post released on Monday, Scott Guthrie, Microsoft's executive vice president of Cloud + AI, introduced Maia 200, ...
Microsoft has unveiled its Maia 200 AI accelerator, claiming triple the inference performance of Amazon's Trainium 3 and superiority over Google's TPU v7.
Eight LinkedIn Learning courses to build AI skills in 2026, from generative AI and ethics to agents, productivity, ...
Microsoft released patches for CVE-2026-21509, a new Office zero-day vulnerability that can be exploited to bypass security features.
The company said Maia 200 offers three times the compute performance of Amazon Web Services Inc.’s most advanced Trainium processor on certain popular AI benchmarks, while exceeding Google LLC’s ...