SK Hynix, a key Samsung competitor, says it has sold out its entire 2024 production of stacked high-bandwidth memory (HBM) DRAMs. These chips are essential for AI processors in data centers, but the company remains tight-lipped about its largest clients.
SK Hynix’s recently appointed vice president, Kitae Kim, who is also the head of HBM sales and marketing, confirmed the news in an interview posted on SK Hynix’s website.
“Proactively securing customer purchase volumes and negotiating more favorable terms for our high-quality products are the fundamentals of semiconductor sales operations,” said Kim. “With excellent products in hand, it’s a matter of speed. Our planned production volume of HBM this year has already sold out. Although 2024 has just begun, we’ve already started preparing for 2025 to stay ahead of the market.”
‘Highly sought after’
As EE News Europe points out, the scarcity of HBM3 and HBM3E format chips could potentially hinder the growth of both the memory and logic sectors of the semiconductor industry this year.
“HBM is a revolutionary product which has challenged the notion that semiconductor memory is just one part of an overall system. In particular, SK Hynix’s HBM has outstanding competitiveness,” Kim added.
“Our advanced technology is highly sought after by global tech companies,” he added, leaving us wondering who his company’s biggest clients might be. Nvidia and AMD are known to be voracious consumers of high-bandwidth memory chips, but there are other players in the highly competitive AI market who might be keen to snap up HBM stock to avoid being left high and dry.
Interestingly, while SK Hynix can’t manufacture enough of its current HBM products to keep up with the high demand, its main rivals in this space, Samsung and Micron, are now focused on HBM3E. Micron has begun producing its 24GB 8H HBM3E “in volume,” which will be used in Nvidia’s latest H200 Tensor Core GPUs. At the same time, Samsung has begun sampling its 36GB HBM3E 12H to customers, and this may well be the memory used in Nvidia’s B100 Blackwell AI powerhouse, which is expected to arrive by the end of this year.
SK Hynix isn’t going to be left behind for long, however. It is expected to begin manufacturing its own 36GB HBM3E in the first half of this year, following an upgrade of its Wuxi plant in China.