
The week of 20 May 2024 has been quite eventful for Samsung’s semiconductor unit, the world’s largest producer of memory chips like DRAMs, SRAMs, NAND flash, and NOR flash. Early this week, an unexpected change of guard at Samsung’s semiconductor business rocked the industry.
When Samsung abruptly replaced its semiconductor business chief Kyung Kye-hyun with DRAM and flash memory veteran Jun Young-hyun, the transition was widely attributed to the "chip crisis" stemming from Samsung's laggard position in the high bandwidth memory (HBM) business, where SK hynix has become the market leader.
Figure 1 Jun Young-hyun led Samsung’s memory chip business from 2014 to 2017 after working on the development of DRAM and flash memory chips. Source: The Chosun Daily
It’s worth noting that management reshuffles at Samsung are usually announced at the start of the year. However, being seen as a laggard in HBM technology has pushed the memory kingpin into a desperate position, and the appointment of a new chip unit head mostly reflects that sense of crisis at the world’s largest memory chip supplier.
HBM, a customized memory product in which DRAM chips are vertically stacked to save space and reduce power consumption, has enjoyed explosive growth in artificial intelligence (AI) applications due to its suitability for training AI models like ChatGPT. Its stacked architecture helps process the massive amounts of data produced by complex AI applications.
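As a rough illustration of why stacking matters, an HBM stack's headline figures follow from simple arithmetic on the interface width, per-pin data rate, and die count. The sketch below uses generic, publicly cited HBM3e-class numbers as assumptions; they are not vendor-specific figures from this story.

```python
# Back-of-envelope HBM stack figures. Parameter values below are
# illustrative assumptions based on generic HBM3e-class specs.

def hbm_stack_bandwidth_gbps(pins: int, pin_rate_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s: pins * per-pin rate (Gb/s) / 8."""
    return pins * pin_rate_gbps / 8

def hbm_stack_capacity_gb(dies: int, die_density_gb: int) -> int:
    """Stack capacity in GB: stacked DRAM dies * capacity per die (GB)."""
    return dies * die_density_gb

# Assumed: a 1024-bit interface at 9.6 Gb/s per pin -> ~1,229 GB/s per stack.
bandwidth = hbm_stack_bandwidth_gbps(pins=1024, pin_rate_gbps=9.6)

# Assumed: an 8-high stack of 3-GB (24-Gb) dies -> 24 GB per stack.
capacity = hbm_stack_capacity_gb(dies=8, die_density_gb=3)
```

The wide 1024-bit interface is what sets HBM apart from conventional DRAM modules: bandwidth comes from many relatively slow pins in parallel rather than a few very fast ones.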
SK hynix, Samsung’s Korean memory chip rival, produced its first HBM chip in 2013. Since then, it has continuously invested in developing this memory technology while bolstering manufacturing yield. According to media reports, SK hynix’s HBM production capacity is fully booked through 2025.
SK hynix is also the main supplier of HBM chips to Nvidia, which commands nearly 80% of the GPU market for AI applications, a market in which HBM chips are strategically paired with AI processors like GPUs to overcome data bottlenecks. Samsung, on the other hand, is still catching up on HBM technology and is known to be in the process of qualifying its HBM memory chips for Nvidia's AI processors.
During Nvidia’s annual GPU Technology Conference (GTC), held in March 2024 in San Jose, California, the company’s co-founder and CEO Jensen Huang endorsed Samsung’s HBM3e chips, which were then going through a verification process at Nvidia. A note reading “Jensen Approved” appeared next to Samsung’s 12-layer HBM3e device on display on the GTC 2024 show floor.
HBM test at Nvidia
While the start of the week stunned the industry with an unusual reshuffle at the top, the end of the week brought a bigger surprise. According to a report published by Reuters on Friday, 24 May, Samsung’s HBM chips failed to pass Nvidia’s test for pairing with its GPUs due to heat and power consumption issues.
In another report published in The Chosun Daily that day, Professor Kwon Seok-joon of the Department of Chemical Engineering at Sungkyunkwan University said that Samsung has not been able to fully manage quality control of through-silicon vias (TSVs) for packaging HBM memory chips. In other words, high yield in packaging multiple DRAM layers has been challenging. Another insider pointed to reports that the power consumption of Samsung’s HBM3E samples is more than double that of SK hynix.
Figure 2 According to the Reuters article, Samsung’s 8-layer and 12-layer HBM3e memory chips failed a test in April 2024. Source: Samsung Electronics
While Nvidia declined to comment on the story, Samsung was quick to state that the situation has not been concluded and that testing is still ongoing. The South Korean memory chipmaker added that HBM, a specialized memory product, requires optimization through close collaboration with customers. Jeff Kim, head of research at KB Securities, quoted in the Reuters story, acknowledged that while Samsung had expected to pass Nvidia’s tests quickly, a specialized product like HBM can take time to clear customers’ performance evaluations.
Still, it’s a setback for Samsung that could work to the advantage of SK hynix and Micron, the remaining players in the high-stakes HBM game. Micron, which claims that its HBM3e consumes 30% less power than its competitors’ parts, has announced that its 24-GB, 8-layer HBM3e memory chips will be part of Nvidia’s H200 Tensor Core GPUs, ending SK hynix’s previous run as the sole HBM supplier for Nvidia’s AI processors.
A rude awakening?
Samsung’s laggard position in HBM won’t be the only worry for incoming chief Jun. Despite the recovery in memory prices, Samsung’s semiconductor business is lagging in competitiveness on several fronts. According to another Reuters report, Samsung’s high-density DRAM and NAND flash products are no longer ahead of the competition.
Next, the Korean tech heavyweight’s foundry operation is struggling to catch up with market leader TSMC. Samsung’s chip contract-manufacturing business has failed to win big customers, while TSMC remains far ahead in overall market share. Then there is the global AI wave, in which Samsung, beyond its HBM woes, is still trying to find its place.
Samsung is known for its fierce competitiveness, and the appointment of a new chief for its semiconductor unit signals that it means business. The Korean tech giant faces an uphill battle to catch up in HBM memory technology, but one thing is for sure: Samsung is no stranger to navigating rough waters.
Related Content
- AI boom and the politics of HBM memory chips
- Memory Bottlenecks: Overcoming a Common AI Problem
- HBM memory chips: The unsung hero of the AI revolution
- Generative AI and memory wall: A wakeup call for IC industry
- A sneak peek at HBM cold war between Samsung and SK hynix
The post Samsung’s memory chip business: Trouble in paradise? appeared first on EDN.