SK hynix showcased its HBM4 implementation to the public at TSMC's North America Technology Symposium, alongside several other memory products.
SK hynix's HBM4 Technology Can Now Stack Up To 16 Layers; Mass Production Slated For H2 2025
When it comes to HBM manufacturers, SK hynix appears to be well ahead of the pack, especially with its HBM4 technology. The firm is claimed to have already prepared a commercial version of the process, while competitors like Micron and Samsung are still at the sampling stage, which shows that, at least for now, SK hynix is winning the race. At TSMC's North America Technology Symposium, the firm showcased what it calls "AI memory" leadership by unveiling several new products, discussed below.
First and foremost, SK hynix gave the public a preview of its HBM4 process, along with a brief rundown of its specifications. We are looking at HBM4 with a capacity of up to 48 GB, 2.0 TB/s of bandwidth per stack, and an I/O speed rated at 8.0 Gbps. SK hynix has announced that it is targeting mass production in H2 2025, which means the process could see integration into products as early as the end of this year. It is important to note that the Korean giant is the only firm to have showcased HBM4 to the public so far.
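As a rough sanity check, the quoted 2.0 TB/s follows from the stated 8.0 Gbps I/O speed once you factor in HBM4's 2,048-bit-per-stack interface (double HBM3E's 1,024 bits). The interface width is not mentioned in SK hynix's material, so treat the sketch below as an illustration rather than official math.

```python
# Back-of-the-envelope check: per-stack HBM bandwidth = pin speed x interface width.
# The 2,048-bit HBM4 interface width is an assumption based on the JEDEC direction,
# not a figure from SK hynix's symposium material.

def stack_bandwidth_tbps(pin_speed_gbps: float, interface_bits: int) -> float:
    """Peak per-stack bandwidth in TB/s from per-pin speed (Gbps) and bus width (bits)."""
    return pin_speed_gbps * interface_bits / 8 / 1000  # Gbps * bits -> GB/s -> TB/s

hbm4 = stack_bandwidth_tbps(pin_speed_gbps=8.0, interface_bits=2048)
print(f"HBM4: {hbm4:.2f} TB/s per stack")  # ~2.05 TB/s, in line with the quoted 2.0 TB/s
```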

Alongside HBM4, we saw SK hynix's 16-layer HBM3E, the first of its kind, featuring 1.2 TB/s of bandwidth. This memory is said to be integrated into NVIDIA's GB300 "Blackwell Ultra" AI clusters, as NVIDIA plans to transition to HBM4 with Vera Rubin. Interestingly, SK hynix says it managed to connect that many layers through its Advanced MR-MUF packaging and TSV (through-silicon via) technology, and the company is arguably the pioneer of these stacking techniques.
Apart from HBM, SK hynix also showcased its lineup of server memory modules, notably RDIMM and MRDIMM products. High-performance server modules are now being built on the newer 1c DRAM node, allowing the modules to reach speeds of up to 12,800 MT/s, which is simply astonishing.
Notably, SK hynix exhibited a range of modules designed to enhance AI and data center performance while reducing power consumption. These included the MRDIMM lineup with a speed of 12.8 gigabits per second (Gbps) and capacities of 64 GB, 96 GB, and 256 GB; RDIMM modules with a speed of 8 Gbps in 64 GB and 96 GB capacities; and a 256 GB 3DS RDIMM.
– SK hynix
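For context on what those per-pin figures mean at the module level, the quick sketch below converts the quoted 12.8 Gbps and 8 Gbps speeds into peak per-module bandwidth, assuming the standard 64-bit DDR5 data width (ECC bits excluded); that width is an assumption on our part, not something SK hynix spells out in the announcement.

```python
# Rough peak bandwidth per module = per-pin data rate x data-bus width.
# The 64-bit data width (ignoring ECC) is the standard DDR5 DIMM assumption,
# not a figure taken from SK hynix's announcement.

def module_bandwidth_gbs(pin_rate_gbps: float, data_bits: int = 64) -> float:
    """Peak module bandwidth in GB/s from per-pin rate (Gbps) and data-bus width (bits)."""
    return pin_rate_gbps * data_bits / 8  # Gbps * bits -> GB/s

for name, rate in [("MRDIMM @ 12.8 Gbps", 12.8), ("RDIMM @ 8 Gbps", 8.0)]:
    print(f"{name}: ~{module_bandwidth_gbs(rate):.1f} GB/s per module")
# MRDIMM @ 12.8 Gbps: ~102.4 GB/s per module
# RDIMM @ 8 Gbps: ~64.0 GB/s per module
```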
There's no doubt that SK hynix currently has an edge in the HBM and DRAM markets, beating long-standing players like Samsung mainly through its pace of innovation and its partnerships with the likes of NVIDIA.