Samsung has announced a major breakthrough in high bandwidth memory technology: the industry’s first 12-stack HBM3E DRAM. The company says the new memory offers the highest capacity and bandwidth of any HBM product to date, setting a new bar for memory performance.

The South Korean tech giant revealed that the HBM3E 12H DRAM provides a maximum bandwidth of 1,280 GB/s and a capacity of 36 GB. These figures represent roughly a 50% increase in both bandwidth and capacity compared with the previous-generation 8-stack HBM3.
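
HBM stacks use a 1,024-bit data interface, so the quoted 1,280 GB/s works out to a per-pin data rate of about 10 Gb/s. The sketch below is just that back-of-the-envelope check; the 10 Gb/s figure is inferred from the quoted bandwidth rather than stated in the announcement.

```python
# Back-of-the-envelope check of the quoted HBM3E 12H bandwidth.
# Assumption: the standard 1,024-bit HBM interface per stack; the
# 10 Gb/s per-pin rate is inferred from the 1,280 GB/s figure.
INTERFACE_WIDTH_BITS = 1024   # data pins per HBM stack
PER_PIN_RATE_GBPS = 10.0      # inferred per-pin data rate, Gb/s

bandwidth_gbits_per_s = INTERFACE_WIDTH_BITS * PER_PIN_RATE_GBPS
bandwidth_gbytes_per_s = bandwidth_gbits_per_s / 8  # bits -> bytes

print(f"{bandwidth_gbytes_per_s:.0f} GB/s")  # -> 1280 GB/s
```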

HBM, or High Bandwidth Memory, is a type of memory technology in which multiple DRAM dies are stacked vertically, with each additional layer adding capacity. In Samsung’s latest HBM3E, each DRAM die has a capacity of 24 gigabits, equivalent to 3 gigabytes, and twelve of these dies are stacked in a single package, giving 36 GB in total.
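
The capacity arithmetic is straightforward; the short sketch below simply restates the figures from the announcement.

```python
# Capacity arithmetic for the HBM3E 12H, using the figures above.
DIE_CAPACITY_GBITS = 24                        # per-die capacity in gigabits
DIE_CAPACITY_GBYTES = DIE_CAPACITY_GBITS / 8   # = 3 GB per die
NUM_DIES = 12                                  # 12-high stack

total_gb = NUM_DIES * DIE_CAPACITY_GBYTES
print(f"{total_gb:.0f} GB")                    # -> 36 GB
```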

Memory manufacturers such as Samsung, SK Hynix, and Micron are all racing to increase the number of dies in their HBM stacks while keeping the overall package as thin as possible. This push to maximize capacity while minimizing height is driven by growing demand for high-performance memory in applications like artificial intelligence, where GPUs paired with HBM modules are in short supply.

Samsung’s achievement with the 12-stack HBM3E was made possible by its advanced thermal compression non-conductive film (TC NCF) technology. The film allowed Samsung to keep the package at the same height as its 8-stack HBM modules while increasing the number of stacked dies by 50%. By shrinking the gaps between the dies and eliminating voids between layers, Samsung achieved a more than 20% increase in vertical density compared to the previous generation.
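
One simple implication of "same height, 50% more layers": if the stack height is fixed, the average per-layer pitch (die plus bond line) shrinks from one eighth to one twelfth of that height, about a one-third reduction. The sketch below is illustrative arithmetic only; the 20% vertical-density figure is Samsung’s own comparison and is not derived here.

```python
# Same package height, 8 layers before vs. 12 layers now:
# average per-layer pitch goes from H/8 to H/12.
layers_old, layers_new = 8, 12

pitch_ratio = layers_old / layers_new        # new pitch as a fraction of old
reduction_pct = (1 - pitch_ratio) * 100

print(f"per-layer pitch shrinks by ~{reduction_pct:.0f}%")  # -> ~33%
```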

The use of TC NCF also let Samsung tune the chip design for performance and heat dissipation. By using bumps of different sizes during chip bonding, smaller ones in signaling areas and larger ones where heat needs to be dissipated, Samsung was able to improve both signaling efficiency and thermal management, resulting in higher overall performance.

Samsung claims that the higher performance and capacity of the HBM3E 12H will provide significant benefits for customers, particularly in data center applications. For AI workloads, Samsung projects a 34% increase in training speed and an 11.5x boost in the number of simultaneous users for inference services compared to the previous generation HBM3 8H.

Samples of the HBM3E 12H have already been provided to customers, with mass production expected to begin in the first half of 2024. With this breakthrough in high bandwidth memory, Samsung is positioned to deliver solutions that drive innovation in data centers, artificial intelligence, and other high-performance computing applications.
