Samsung in Talks with Nvidia to Supply Next-Gen HBM4 AI Memory Chips

Samsung is negotiating with Nvidia to supply next-generation HBM4 AI memory chips, aiming to reclaim its lead in the high-bandwidth memory market and strengthen its position in the global semiconductor industry.

Raja Awais Ali

10/31/2025 · 2 min read


Samsung Electronics is reportedly in advanced talks with Nvidia to supply its next-generation High Bandwidth Memory (HBM4) chips, signaling a major strategic move to regain ground in the global semiconductor market. The discussions mark a new phase in Samsung’s ambition to challenge SK Hynix’s dominance in the high-end AI memory sector.

According to industry insiders, Samsung plans to launch the new HBM4 chips in 2026, though the exact delivery timeline has not yet been confirmed. Currently, Samsung is shipping HBM3E chips to global clients and is working to enhance performance and energy efficiency for its next generation of memory.

Nvidia, the world’s leading AI chipmaker, confirmed that it is collaborating closely with Samsung on both HBM3E and HBM4 technologies. However, details regarding the scale or volume of the potential supply deal have not been disclosed. At present, Nvidia’s primary HBM supplier is SK Hynix, which is also preparing to roll out its HBM4 chips soon — intensifying competition in this rapidly evolving market.

For Samsung Electronics, this negotiation is a critical step toward regaining leadership in the HBM (High Bandwidth Memory) market. Over the past few years, Samsung has lagged behind SK Hynix in both performance and market share, but the company now aims to close that gap with its HBM4 technology. Analysts believe that if Samsung secures Nvidia's HBM4 supply deal, it could significantly expand its market share and regain dominance in the AI memory sector.

The HBM4 chips are designed specifically for artificial intelligence (AI) servers, data centers, and high-performance computing systems, offering massive data bandwidth and lower power consumption. Samsung stated that its HBM4 chips are built using 4-nanometer logic base dies and sixth-generation 10nm-class DRAM, capable of achieving speeds of up to 11 gigabits per second (Gbps) per pin, exceeding the current JEDEC industry standard.
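To put those per-pin figures in perspective, the sketch below works out the theoretical peak bandwidth of a single HBM4 stack. It assumes a 2048-bit interface per stack and an 8 Gbps JEDEC baseline pin speed, which are not stated in the article but reflect the published HBM4 specification; only the 11 Gbps figure comes from Samsung's claim.

```python
# Back-of-the-envelope estimate of per-stack HBM4 bandwidth.
# Assumptions (not from the article): a 2048-bit interface per stack and an
# 8 Gbps per-pin JEDEC baseline; the 11 Gbps per-pin figure is Samsung's claim.

def hbm_bandwidth_tb_s(pin_speed_gbps: float, interface_width_bits: int) -> float:
    """Theoretical peak bandwidth of one stack, in terabytes per second."""
    total_gbits_per_s = pin_speed_gbps * interface_width_bits
    return total_gbits_per_s / 8 / 1000  # bits -> bytes, then GB -> TB

for label, pin_speed in [("JEDEC HBM4 baseline", 8.0), ("Samsung HBM4 (reported)", 11.0)]:
    print(f"{label}: ~{hbm_bandwidth_tb_s(pin_speed, 2048):.1f} TB/s per stack")
```

Under those assumptions, 11 Gbps per pin works out to roughly 2.8 TB/s per stack versus about 2.0 TB/s at the baseline rate, which is the kind of headroom that matters for AI servers and data-center accelerators.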

The partnership between Samsung and Nvidia spans more than 25 years, and this new collaboration could deepen their technological integration across HBM4, AI megafactories, and advanced semiconductor manufacturing.

Following the announcement, Samsung’s stock rose over 4%, reflecting renewed investor confidence in the company’s long-term AI and chip growth potential. Market experts suggest that a finalized deal could strengthen Samsung’s position as a global semiconductor leader, while Nvidia would benefit from a more diversified supply chain amid soaring demand for AI infrastructure.

However, challenges remain. The global HBM market is extremely competitive, with SK Hynix and Micron also racing to develop similar next-gen memory technologies. Production scalability, cost efficiency, and delivery timelines will determine how well Samsung can execute its HBM4 strategy in the coming year.

In conclusion, Samsung’s ongoing negotiations with Nvidia over the supply of next-generation HBM4 chips mark a pivotal moment in the company’s semiconductor roadmap. If the partnership materializes, it could reshape the global memory landscape, strengthen AI hardware ecosystems, and reinforce Samsung’s role at the forefront of next-generation chip innovation.