Elon Musk, the CEO of Tesla Inc. TSLA, has expressed his dissatisfaction with the term "GPU" while saying that the company's core AI infrastructure is no longer training-constrained.
What Happened: During the first-quarter earnings call on Tuesday, Musk revealed that Tesla has been actively expanding its core AI infrastructure. He stated, "We're, at this point, not training-constrained, and so we're making rapid progress."
The tech billionaire also disclosed that Tesla has installed and commissioned 35,000 H100 computers, or GPUs, and anticipates that this number could reach around 85,000 by the end of the year, primarily for training purposes.
See Also: Elon Musk Says Tesla Optimus Humanoid Robot Could Be Available Externally By The End Of Next Year: Here's The Progress Report
"We're making sure that we're being as efficient as possible in our training," Musk said, adding that it is not just about the number of H100s but "how efficiently they're used."
During the conversation, Musk also expressed his discomfort with the term GPU. "I always feel like a wince when I say GPU because it's not. GPU stand — G stands for graphics, and it doesn't do graphics," the tech mogul stated.
"GPU is [the] wrong word," he said, adding, "They need a new word."
Why It Matters: Musk's remarks came after Tesla reported first-quarter revenue of $21.0 billion, a 9% year-over-year decline that missed the Street consensus estimate of $22.15 billion. The company said its revenue was affected by reduced average selling prices and lower vehicle deliveries during the quarter.
Meanwhile, Nvidia Corporation NVDA made a significant impact on the AI and computing sectors last year with its H100 data center chip, which added more than $1 trillion to the company's overall value.
Earlier this year, in February, it was reported that demand for the H100 chip, which is four times faster than its predecessor, the A100, at training large language models (LLMs) and 30 times faster at responding to user prompts, has been so substantial that customers are encountering wait times of up to six months.
Earlier this month, Piper Sandler analyst Harsh V. Kumar engaged directly with Nvidia's management team and reported that despite Nvidia's Hopper GPU being on the market for almost two years, demand remains strong and continues to outstrip supply. Customers are hesitant to shift their orders from Hopper to Blackwell, fearing extended wait times due to anticipated supply limitations.
Read Next: Elon Musk Likes Games Originating From This Country Because They Haven't Been Corrupted By 'Woke DEI Lies'
Disclaimer: This content was partially produced with the help of Benzinga Neuro and was reviewed and published by Benzinga editors.
Photo via Shutterstock