Published in Network

SK hynix stuffs 256GB into DDR5

18 December 2025


Intel approves it for Xeon 6

SK hynix has muscled its way to the front of the server memory pack by becoming the first to certify 256GB DDR5 RDIMMs on Troubled Chipzilla’s Xeon 6 platform.

While consumers are still dealing with shortages and daft pricing, SK hynix is chasing the AI money with high-density memory aimed squarely at data centres. The company says its new 256GB DDR5 RDIMM, built using 32Gb fifth-generation 10nm-class DRAM, has now passed Chipzilla’s full data centre certification process.

That makes it the first server memory module of its kind officially cleared to run on Xeon 6 CPUs, a badge that actually matters in enterprise circles.

SK hynix had already validated a 128GB solution based on the same 32Gb process in January 2025, so this is very much the next logical step. According to the firm, the new module delivers 16 per cent higher AI inference performance than 128GB modules, while cutting power use by 18 per cent compared with older 256GB parts built on 16Gb 1a DRAM.

SK hynix said the module completed extensive testing at Intel’s Advanced Data Centre Development Laboratory, covering performance, compatibility and reliability.

The company reminded everyone that it validated a separate 256GB product earlier this year using fourth-generation 10nm-class memory, but this one goes further.

By being first out of the gate on Xeon 6, SK hynix is clearly keen to underline its technical leadership in high-capacity DDR5. The firm says it plans to deepen ties with global data centre operators as server demand continues to balloon due to AI workloads.

Sangkwon Lee, SK hynix’s head of DRAM product planning and enablement, said: “We are now able to respond more swiftly to customer needs, solidifying our leadership in the server DDR5 DRAM market.”

“As a full-stack AI memory creator, we will actively address the growing demand for high-performance, low-power, and high-capacity memory solutions to enhance customer satisfaction further,” Lee said.

As AI inference shifts from basic answers to more complex reasoning, memory capacity and bandwidth have become hard limits. Processing vast datasets in real time means servers need more memory per socket, not just faster CPUs or accelerators.

SK hynix reckons servers fitted with the new 256GB module can push inference performance 16 per cent higher than systems using 128GB parts based on 32Gb dies.

Using larger DRAM chips also trims power consumption by roughly 18 per cent compared with earlier 256GB designs, improving performance per watt where it counts. Data centre operators chasing AI efficiency will notice, even if the rest of the market is still stuck waiting for affordable upgrades.
