Microsoft’s new Maia 200 inference accelerator enters this overheated market with a chip that aims to cut the price ...
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the economics of AI token generation. Maia 200 is an AI inference powerhouse: an ...
Microsoft (MSFT) has released its latest artificial intelligence accelerator chip, the Maia 200, which is built for inference ...
With over 100 billion transistors, Maia 200 offers "powerhouse" AI inferencing possibilities, Microsoft says.
Microsoft Azure's AI inference accelerator Maia 200 aims to outperform Google TPU v7 and AWS Inferentia with 10 petaflops of FP4 compute power.
BERLIN, Jan 27 (Bernama-dpa) -- Microsoft has presented its new specialised chip, which is intended to make large AI applications faster and cheaper in the future, reported German Press Agency (dpa).
Microsoft today announced its second-generation Maia 200, an AI accelerator processor for datacenters that's optimized for ...
Microsoft has announced that Azure’s US central datacentre region is the first to receive a new artificial intelligence (AI) inference accelerator, Maia 200.
Microsoft recently announced Maia 200, a new AI accelerator specifically designed for inference workloads. According to ...