Samsung archrival sells out of prized HBM stock but is mum on who the biggest customer was — Nvidia and AMD can’t get enough high bandwidth memory chips, but is there someone else? – TechToday


SK Hynix, a key Samsung competitor, says it has sold out its entire 2024 production of stacked high-bandwidth memory (HBM) DRAMs. These chips are essential for AI processors in data centers, but the company remains tight-lipped about its largest customers.

SK Hynix’s recently appointed vice president, Kitae Kim, who is also the head of HBM sales and marketing, confirmed the news in an interview posted on SK Hynix’s website.

“Proactively securing customer purchase volumes and negotiating more favorable conditions for our high-quality products are the fundamentals of semiconductor sales operations,” said Kim. “With excellent products in hand, it’s a matter of speed. Our planned production volume of HBM this year has already sold out. Although 2024 has just begun, we’ve already started preparing for 2025 to stay ahead of the market.”

‘Highly sought after’

As EE News Europe points out, the scarcity of HBM3 and HBM3E chips could potentially hinder the growth of both the memory and logic sectors of the semiconductor industry this year.

“HBM is a revolutionary product which has challenged the notion that semiconductor memory is only one part of an overall system. In particular, SK Hynix’s HBM has outstanding competitiveness,” Kim added.

“Our advanced technology is highly sought after by global tech companies,” he added, leaving us wondering who his company’s biggest customers might be. Nvidia and AMD are known to be voracious consumers of high bandwidth memory chips, but there are other players in the highly competitive AI market who might be keen to snap up HBM stock to avoid being left high and dry.

Interestingly, while SK Hynix can’t manufacture enough of its current HBM products to keep up with the high demand, its main rivals in this space, Samsung and Micron, are now focused on HBM3E. Micron has begun producing “in volume” its 24GB 8H HBM3E, which will be used in Nvidia’s latest H200 Tensor Core GPUs. At the same time, Samsung has begun sampling its 36GB HBM3E 12H with customers, and this may well be the memory used in Nvidia’s B100 Blackwell AI powerhouse, which is expected to arrive by the end of this year.

SK Hynix isn’t going to be left behind for long, however. It is expected to begin manufacturing its own 36GB HBM3E in the first half of this year, following an upgrade of its Wuxi plant in China.
