AI Week
Education and Training

Nvidia CEO raises concerns over Samsung’s HBM3 chips for AI accelerators

By Kai Lainey | June 9, 2024 | 2 Mins Read

Nvidia CEO Jensen Huang said during a briefing at Computex 2024 that Samsung’s HBM3 and HBM3e chips are not yet certified for use in Nvidia’s AI accelerators. While SK hynix remains the primary supplier, Samsung is reportedly contending with heat and power-consumption issues in its latest HBM modules.

Nvidia CEO Jensen Huang announced that Samsung’s High Bandwidth Memory (HBM) chips, specifically HBM3 and HBM3e, have not yet passed the certification required for Nvidia’s artificial intelligence (AI) accelerators. He disclosed this during a briefing at Computex 2024.

Currently, SK hynix is the primary supplier of HBM3 and HBM3e memory to Nvidia, which is crucial for training AI models such as those behind ChatGPT. Nvidia is also evaluating HBM chips from Samsung and Micron but has not yet approved them, citing the need for additional engineering work.

Reports have suggested that Samsung’s latest HBM modules suffer from heat and power-consumption issues, though both Nvidia and Samsung have downplayed these concerns. Samsung maintains that its testing is progressing well and plans to significantly increase HBM production in 2024, including mass production of 12-layer modules.

SK hynix leads the market, with production capacity for HBM3 and HBM3e fully booked through next year and plans for a new production complex costing $14.6 billion. Although Samsung is the world’s largest memory chip producer, its slower progress in HBM has worried investors and prompted a leadership change in its semiconductor division.


© 2025 AI Week. All Rights Reserved.