Micron Delivers Industry’s Highest Capacity SOCAMM2 for Low-Power DRAM in the AI Data Center

BOISE, Idaho, Oct. 22, 2025 (GLOBE NEWSWIRE) -- In an era of unprecedented AI innovation and growth, the entire data center ecosystem is transforming toward more energy-efficient infrastructure to support sustainable growth. With memory playing an increasingly critical role in AI systems, low-power memory solutions have become central to this transformation. Micron Technology, Inc. (Nasdaq: MU) today announced customer sampling of 192GB SOCAMM2 (small outline compression attached memory modules) to enable broader adoption of low-power memory within AI data centers.

SOCAMM2 extends the capabilities of Micron’s first-to-market LPDRAM SOCAMM, delivering 50% more capacity in the same compact footprint. The added capacity can reduce time to first token (TTFT) by more than 80% in real-time inference workloads.1 The 192GB SOCAMM2 uses Micron’s most advanced 1-gamma DRAM process technology to deliver a greater than 20% improvement in power efficiency,2 further enabling power design optimization of large data center clusters. These savings become substantial in full-rack AI installations, which can include more than 40 terabytes of CPU-attached low-power DRAM main memory.3 The modular design of SOCAMM2 improves serviceability and lays the groundwork for future capacity expansion.
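The capacity and rack-scale figures above hang together with some simple arithmetic. The sketch below is a hypothetical back-of-the-envelope check, not part of the announcement; it assumes the prior-generation SOCAMM module was 128GB (consistent with the stated 50% increase to 192GB) and estimates how many SOCAMM2 modules a rack with more than 40 terabytes of low-power DRAM would imply.

```python
# Back-of-the-envelope check of the figures cited in the announcement.
# Assumption (not stated in the release): the first-generation SOCAMM
# module capacity was 128 GB.

SOCAMM2_GB = 192          # per-module capacity stated in the release
PRIOR_SOCAMM_GB = 128     # assumed first-generation module capacity

capacity_gain = SOCAMM2_GB / PRIOR_SOCAMM_GB - 1
print(f"Per-module capacity gain: {capacity_gain:.0%}")  # -> 50%

# The release cites full-rack installations with more than 40 TB of
# CPU-attached low-power DRAM main memory.
RACK_LPDRAM_TB = 40
modules_per_rack = RACK_LPDRAM_TB * 1000 / SOCAMM2_GB
print(f"SOCAMM2 modules needed for {RACK_LPDRAM_TB} TB: ~{modules_per_rack:.0f}")  # -> ~208
```

Under these assumptions, a 40-terabyte rack corresponds to roughly 200 or more 192GB modules, which is why the per-module power-efficiency gain scales to a meaningful rack-level saving.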