Samsung says it's first to ship HBM4, a day after Micron revealed its own sales
(2026/02/13)
- Reference: 1770954337
- News link: https://www.theregister.co.uk/2026/02/13/samsung_and_micron_start_shipping/
Samsung and Micron say they’ve started shipping HBM4 memory, the faster and denser RAM needed to power the next generation of AI acceleration hardware.
Samsung yesterday [1]announced it has begun mass production of HBM4 and even shipped some to an unnamed customer – probably Nvidia, which has confirmed its forthcoming Vera Rubin kit will use the memory.
The Korean giant says its memory delivers a sustained per-pin data rate of 11.7 gigabits per second, which users can juice to 13Gbps under some circumstances. Total memory bandwidth can reach 3.3 terabytes per second in a single stack.
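Those two figures hang together if each stack talks over a 2048-bit interface, the width defined for HBM4 (an assumption on our part; Samsung's numbers don't state the bus width). A quick back-of-the-envelope check:

```python
# Sketch: deriving per-stack bandwidth from pin speed and bus width.
# The 2048-bit bus width is assumed from the HBM4 spec, not stated in the article.

def stack_bandwidth_tbps(pin_speed_gbps: float, bus_width_bits: int = 2048) -> float:
    """Peak bandwidth of one HBM stack, in terabytes per second."""
    bits_per_second = pin_speed_gbps * 1e9 * bus_width_bits
    return bits_per_second / 8 / 1e12  # bits -> bytes -> TB/s

# At the boosted 13 Gbps pin speed:
print(round(stack_bandwidth_tbps(13), 2))  # ~3.33, in line with the quoted 3.3 TB/s
```

At the baseline 11.7 Gbps rate the same sum gives roughly 3.0 TB/s, so the 3.3 TB/s headline figure appears to correspond to the overclocked mode.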
For now, Samsung can sell this stuff in capacities between 24 and 36 gigabytes, but already plans to reach 48GB.
Memory is so hot right now, but Samsung claims it has enhanced thermal resistance by 10 percent and heat dissipation by 30 percent, compared to HBM3E. The company also claims this new memory is 40 percent more energy efficient, meaning this kit uses less electricity and runs cooler, suggesting users can look forward to slower growth in their energy bills.
The Korean giant didn’t say what it will charge for this memory, an item of interest given soaring memory prices. But it did forecast its HBM sales will more than triple in 2026 compared to 2025, and that it expects to ship samples of HBM4E in the second half of 2026.
[5]OpenAI dishes out its first model on a plate of Cerebras silicon
[6]Positron: we don’t need no fancy HBM to compete with Nvidia’s Rubin
[7]While you pay through the nose for memory, Samsung expects to triple its profits in Q4
[8]Micron says memory shortages are here for the foreseeable future
Samsung claims it is the first to crank up HBM4 production and ship the chips, but a day earlier rival memory-maker Micron said it was also cranking them out.
Speaking at an event hosted by Wolfe Research, Micron CFO Mark Murphy decided to “address some recent inaccurate reporting by some on our HBM4 position” by revealing the company has also started high-volume HBM4 production and shipped some to customers.
“Our HBM yield is on track. Our HBM4 yield is on track. Our HBM4 product delivers over 11 gigabits per second speeds, and we're highly confident in our HBM4 product performance and quality and reliability,” he said, adding that Micron delivered product a quarter earlier than previously forecast. He also noted that Micron has pre-sold every single HBM4 chip it can make this year.
The news from Samsung and Micron means SK Hynix is the only major memory-maker yet to announce it has started production of HBM4.
Nvidia plans to release its Vera Rubin accelerators in the second quarter of 2026, and to use memory from Samsung and SK Hynix. Samsung’s news therefore matters to those who want Nvidia’s latest and greatest, and investors who hope the GPU giant can continue its exceptional growth.
Investors seem to have enjoyed Micron’s news, as its share price spiked almost ten percent after the early-HBM4-production reveal.
For the rest of us, HBM4 production may bring the misery of price rises for lesser memory, because Samsung and others have shifted production capacity to high-margin products for AI applications, causing prices for other products to soar. ®
[1] https://news.samsung.com/global/samsung-ships-industry-first-commercial-hbm4-with-ultimate-performance-for-ai-computing
[5] https://www.theregister.com/2026/02/12/openai_model_cerebras/
[6] https://www.theregister.com/2026/02/04/positron_hbm_no_need/
[7] https://www.theregister.com/2026/01/08/samsung_memory_profits/
[8] https://www.theregister.com/2025/12/18/micron_q1_2026/