Nvidia CEO Jensen Huang said during a briefing at Computex 2024 that Samsung’s HBM3 and HBM3e chips are not yet certified for Nvidia’s AI accelerators. While SK hynix remains the primary supplier, Samsung reportedly faces heat and power consumption issues in its latest HBM modules.

Nvidia CEO Jensen Huang said that Samsung’s High Bandwidth Memory (HBM) chips, specifically HBM3 and HBM3e, have not yet passed the certification required for use in Nvidia’s artificial intelligence (AI) accelerators. He made the remarks during a briefing at Computex 2024.

SK hynix is currently Nvidia’s primary supplier of HBM3 and HBM3e memory, which is essential for training AI models such as those behind ChatGPT. Nvidia is also evaluating HBM chips from Samsung and Micron but has not yet approved them, saying additional engineering work is needed.

Reports have suggested that Samsung’s latest HBM modules suffer from heat and power consumption problems, though both Nvidia and Samsung have downplayed these concerns. Samsung maintains that its testing is progressing well and plans to significantly expand HBM production in 2024, including mass production of 12-layer modules.

SK hynix leads the market: its HBM3 and HBM3e production capacity is fully booked through next year, and it plans a new $14.6 billion production complex. Although Samsung is the world’s largest memory chipmaker, its slower progress in HBM has worried investors and prompted a leadership change in its semiconductor division.
