Nvidia CEO Jensen Huang revealed at Computex 2024 that Samsung’s HBM3 and HBM3e chips are not ready for certification for Nvidia’s AI accelerators, raising questions about engineering requirements and competition with SK hynix.

Nvidia CEO Jensen Huang announced at Computex 2024 that Samsung’s advanced High Bandwidth Memory (HBM) chips are not yet certified for use in Nvidia’s AI accelerators. Huang said more engineering work is needed before Samsung’s HBM3 and HBM3e components can be certified, though he did not specify whether that work falls to Samsung, Nvidia, or both. SK hynix is currently the primary supplier of these memory chips to Nvidia.

Recent reports indicated that Samsung’s latest HBM modules suffer from excessive heat and power consumption. Samsung has denied these claims, asserting that its testing is progressing smoothly and that its products work well with a variety of processors. It has not, however, specifically addressed performance with Nvidia processors.

Samsung remains the world’s largest producer of memory chips, but it trails SK hynix in HBM production, where SK hynix leads in supplying HBM3 and HBM3e chips. SK hynix’s production capacity is fully booked through next year, and the company plans to invest $14.6 billion in a new production complex to meet growing demand. Investor concerns about Samsung’s ability to compete with SK hynix recently prompted a leadership change in Samsung’s semiconductor division.


Ivan Massow, Senior Editor at AI WEEK. Ivan, a lifelong entrepreneur, has worked at Cambridge University's Judge Business School and the Whittle Lab, nurturing talent and transforming innovative technologies into successful ventures.

