• HBM series
◦ HBM3E: Built with 16-layer stacking technology, it delivers 1.23 TB/s of bandwidth and 36 GB of capacity, with an interface speed of 9.6 Gbps. It is used in NVIDIA GB300 AI servers and improves heat-dissipation efficiency by 10%.
◦ HBM4: The world's first mass-produced HBM4 offers 2 TB/s of bandwidth and 48 GB of capacity per stack. Commercial deployment is planned for the second half of 2025, targeting supercomputers and high-end AI clusters.
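The bandwidth figures above follow directly from pin speed times interface width. A minimal sketch of that arithmetic, assuming the standard JEDEC interface widths (1024 bits for HBM3E, 2048 bits for HBM4) and an assumed 8 Gb/s HBM4 pin speed chosen to match the 2 TB/s figure:

```python
# Per-stack HBM bandwidth = pin speed (Gb/s) * bus width (bits) / 8 bits per byte.
# The 1024-bit (HBM3E) and 2048-bit (HBM4) widths come from the JEDEC HBM specs;
# the 8 Gb/s HBM4 pin speed is an assumption, not a figure from this document.
def stack_bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Return peak per-stack bandwidth in GB/s."""
    return pin_speed_gbps * bus_width_bits / 8

hbm3e = stack_bandwidth_gbs(9.6, 1024)   # 1228.8 GB/s, i.e. ~1.23 TB/s
hbm4 = stack_bandwidth_gbs(8.0, 2048)    # 2048 GB/s, i.e. 2 TB/s
print(f"HBM3E: {hbm3e / 1000:.2f} TB/s, HBM4: {hbm4 / 1000:.2f} TB/s")
```

The per-pin speed matters less than the very wide interface: doubling the bus width from HBM3E to HBM4 roughly doubles bandwidth even at a lower pin speed.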
• DRAM series
◦ DDR5: Speeds up to 6400 MT/s, with support for 12800 MT/s MRDIMM modules; a typical part number is H5CG4H24MFR-XBC, used in enterprise-level servers.
◦ LPDDR5: 30% lower power consumption at a speed of 6400 Mbps, integrated into smartphones and in-vehicle infotainment systems.
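The DDR5 transfer rates above translate into per-channel bandwidth in the same way. A brief sketch, assuming the standard JEDEC DDR5 data width of 64 bits (8 bytes) per DIMM channel:

```python
# Peak DDR5 per-channel bandwidth = transfer rate (MT/s) * channel width in bytes.
# The 64-bit (8-byte) data channel is the standard JEDEC DDR5 width; actual
# sustained throughput will be lower than this theoretical peak.
def channel_bandwidth_gbs(transfer_rate_mts: int, width_bytes: int = 8) -> float:
    """Return peak per-channel bandwidth in GB/s."""
    return transfer_rate_mts * width_bytes / 1000

ddr5 = channel_bandwidth_gbs(6400)     # 51.2 GB/s per channel
mrdimm = channel_bandwidth_gbs(12800)  # 102.4 GB/s per channel
print(f"DDR5-6400: {ddr5} GB/s, MRDIMM-12800: {mrdimm} GB/s")
```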