Microsoft is betting heavily on OpenAI's artificial intelligence, leveraging its Azure cloud: the company has connected tens of thousands of Nvidia chips to get the most out of AI. It is an investment that continues, as Microsoft puts artificial intelligence at the center of its offering.
Microsoft: Tens of thousands of Nvidia chips for OpenAI
In 2019, when Microsoft invested one billion dollars in OpenAI, it also agreed to build a massive, cutting-edge supercomputer for the startup's artificial intelligence research. At the time, however, Microsoft lacked the capability to deliver what OpenAI required, and it was not sure it could build a solution of that scale in its Azure cloud service.
OpenAI trains a growing number of AI models, and doing so requires vast amounts of data and computing power. To meet this need, Microsoft had to link together tens of thousands of Nvidia A100 graphics chips, which are critical for training AI models.
According to the company itself, Microsoft had to change how the servers were positioned on the racks to prevent power outages. The Redmond company did not give a specific cost for the project, but said it exceeds several hundred million dollars.
The supercomputer built by Microsoft is what allowed OpenAI to release ChatGPT, a chatbot that attracted over a million users in the days following its launch in November 2022. The latest version, released since then, has been an instant success as well.
As interest in AI tools like ChatGPT grows among businesses and consumers, cloud service providers such as Microsoft, Amazon and Google will come under increasing pressure to ensure their data centers can deliver the enormous computing power required.