
Micron Begins Mass Production of Top HBM3E Solution to Boost AI Growth

Micron Technology, Inc. Unveils HBM3E Solution for AI Accelerators


Micron HBM3E helps reduce data center operating costs by consuming about 30% less power than competing HBM3E offerings

BOISE, Idaho, Feb. 26, 2024 (GLOBE NEWSWIRE) — Micron Technology, Inc. (Nasdaq: MU), a global leader in memory and storage solutions, today announced it has begun volume production of its HBM3E (High Bandwidth Memory 3E) solution. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs, which will begin shipping in the second calendar quarter of 2024. This milestone positions Micron at the forefront of the industry, empowering artificial intelligence (AI) solutions with HBM3E’s industry-leading performance and energy efficiency.

HBM3E: Fueling the AI Revolution
As the demand for AI continues to surge, the need for memory solutions to keep pace with expanded workloads is critical. Micron’s HBM3E solution addresses this challenge head-on with:

  • Superior Performance: With a pin speed greater than 9.2 gigabits per second (Gb/s), Micron’s HBM3E delivers more than 1.2 terabytes per second (TB/s) of memory bandwidth, enabling lightning-fast data access for AI accelerators, supercomputers, and data centers.

  • Exceptional Efficiency: Micron’s HBM3E leads the industry with ~30% lower power consumption compared to competitive offerings. To support increasing demand and usage of AI, HBM3E offers maximum throughput with the lowest levels of power consumption to improve important data center operational expense metrics.

  • Seamless Scalability: With 24 GB of capacity today, Micron’s HBM3E allows data centers to seamlessly scale their AI applications. Whether for training massive neural networks or accelerating inferencing tasks, Micron’s solution provides the necessary memory bandwidth.
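The bandwidth figure in the performance bullet follows directly from the per-pin data rate: a quick back-of-the-envelope check, assuming the standard 1,024-bit HBM stack interface (the bus width is not stated in the release):

```python
# Sanity check of the >1.2 TB/s claim from the >9.2 Gb/s pin speed.
# BUS_WIDTH_BITS is an assumption: HBM stacks conventionally expose
# a 1,024-bit data interface.
PIN_SPEED_GBPS = 9.2      # per-pin data rate, gigabits per second
BUS_WIDTH_BITS = 1024     # HBM stack interface width (assumption)

bandwidth_gbytes = PIN_SPEED_GBPS * BUS_WIDTH_BITS / 8  # gigabytes per second
print(f"{bandwidth_gbytes / 1000:.2f} TB/s")
```

At exactly 9.2 Gb/s this works out to about 1.18 TB/s per stack, so a pin speed slightly above 9.2 Gb/s clears the 1.2 TB/s mark quoted in the release.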

“Micron is delivering a trifecta with this HBM3E milestone: time-to-market leadership, best-in-class industry performance, and a differentiated power efficiency profile,” said Sumit Sadana, executive vice president and chief business officer at Micron Technology. “AI workloads are heavily reliant on memory bandwidth and capacity, and Micron is very well-positioned to support the significant AI growth ahead through our industry-leading HBM3E and HBM4 roadmap, as well as our full portfolio of DRAM and NAND solutions for AI applications.”

Micron developed this industry-leading HBM3E design using its 1-beta technology, advanced through-silicon via (TSV), and other innovations that enable a differentiated packaging solution. Micron, a proven leader in memory for 2.5D/3D-stacking and advanced packaging technologies, is proud to be a partner in TSMC’s 3DFabric Alliance and to help shape the future of semiconductor and system innovations.

Micron is also extending its leadership with the sampling of its 36GB 12-High HBM3E in March 2024, which is set to deliver greater than 1.2 TB/s performance and superior energy efficiency compared to competitive solutions. Micron is a sponsor at NVIDIA GTC, a global AI conference starting March 18, where the company will share more about its industry-leading AI memory portfolio and roadmaps.

About Micron Technology, Inc.
We are an industry leader in innovative memory and storage solutions transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence and 5G applications that unleash opportunities — from the data center to the intelligent edge and across the client and mobile user experience. To learn more about Micron Technology, Inc. (Nasdaq: MU), visit micron.com.

© 2024 Micron Technology, Inc. All rights reserved. Information, products, and/or specifications are subject to change without notice. Micron, the Micron logo, and all other Micron trademarks are the property of Micron Technology, Inc. All other trademarks are the property of their respective owners.

Micron Media Relations Contact
Kelly Sasso 
Micron Technology, Inc. 
+1 (208) 340-2410 
ksasso@micron.com

Micron Investor Relations Contact
Satya Kumar
Micron Technology, Inc.
+1 (408) 450-6199
satyakumar@micron.com





Leah Sirama
Leah Sirama, a lifelong enthusiast of Artificial Intelligence, has been exploring technology and the digital realm since childhood. Known for his creative thinking, he's dedicated to improving AI experiences for all, making him a respected figure in the field. His passion, curiosity, and creativity drive advancements in the AI world.