ALBAWABA - After an endeavor that lasted 122 days, Elon Musk's xAI announced that its AI training system, dubbed "Colossus," has been brought online with a cluster of 100,000 Nvidia H100 GPUs, putting it ahead of Mark Zuckerberg's Meta in the AI race.
"Over the next couple of months, it will double in size, bringing it to 200k (50k H200s)," Musk promised, emphasizing that he was still not satisfied with Colossus's current capabilities.
Established in July of last year, Musk's newest AI venture, xAI, launched the supercomputer over the Labor Day weekend; it is set to train the company's large language model (LLM), Grok, which aims to compete with OpenAI's renowned GPT-4.
Although Grok is only accessible to paying customers of Musk's X social media platform, many Tesla experts believe it will ultimately become the artificial intelligence that drives the EV maker's humanoid robot, Optimus, according to Fortune.
Musk predicts that the project may ultimately yield Tesla $1 trillion in revenue yearly. He also plans to double Colossus's computing power within a matter of months if he is able to acquire 50,000 chips from Nvidia's newer, more sophisticated H200 series, which are around twice as powerful.
In response to the unveiling of Colossus, Nvidia posted a congratulatory message to Elon Musk and the xAI team, noting that the system would be the most powerful of its kind and deliver "exceptional gains" in energy efficiency.