SK hynix Unveils Next-Gen AI-Optimized NAND Storage Vision to Boost Data Center and Edge AI Performance

SK hynix has revealed its next-generation AI-NAND (AIN) family, a lineup designed to dramatically improve AI processing speed, bandwidth, density, and energy efficiency in data centers and edge devices.

Oct 27, 2025

SK hynix has unveiled its vision for next-generation NAND storage solutions tailored to artificial intelligence applications, aiming to meet the accelerating demand for high-speed, efficient data processing in AI data centers and edge devices. The announcement came at the 2025 Open Compute Project (OCP) Global Summit in San Jose, California, where the company detailed its AI-NAND (AIN) family, a lineup developed to optimize performance, bandwidth, and density.

With AI inference workloads generating massive volumes of data that require rapid access and processing, SK hynix’s AIN family targets the critical bottlenecks between storage and AI compute. AIN P (Performance) prioritizes data throughput and energy efficiency, minimizing delays between NAND storage and AI operations to better support large-scale inference workloads. AIN B (Bandwidth) uses High Bandwidth Flash (HBF) technology, which stacks multiple NAND dies vertically, to substantially increase the data transfer rates critical for real-time AI applications. AIN D (Density) focuses on scaling storage capacity to petabyte levels while keeping power consumption and cost under control, a key factor for ever-growing AI datasets.
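
As a rough illustration of why die stacking matters for bandwidth, the back-of-envelope sketch below models aggregate transfer rate as the product of stacked dies, channels per die, and per-channel speed. All figures and names in the sketch are hypothetical placeholders chosen for illustration, not SK hynix specifications for AIN B or HBF.

# Back-of-envelope sketch (illustrative only): how stacking NAND dies
# behind a wide, parallel interface raises aggregate bandwidth.
# All numbers are hypothetical placeholders, NOT SK hynix specs.

def aggregate_bandwidth_gb_s(stacked_dies: int,
                             channels_per_die: int,
                             gb_s_per_channel: float) -> float:
    """Total bandwidth grows with the number of dies and channels
    that can transfer data in parallel."""
    return stacked_dies * channels_per_die * gb_s_per_channel

# Conventional packaging (assumed): a few dies share a narrow interface.
conventional = aggregate_bandwidth_gb_s(4, 1, 1.2)   # ~4.8 GB/s

# Stacked, HBF-style configuration (assumed): more dies, wider interfaces,
# even if each channel runs somewhat slower.
stacked = aggregate_bandwidth_gb_s(16, 8, 1.0)       # ~128 GB/s

print(f"conventional: {conventional:.1f} GB/s, stacked: {stacked:.1f} GB/s")

The point of the sketch is simply that bandwidth scales with parallelism across dies and channels, which is the trade-off the AIN B concept leans on.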

Chun Sung Kim, Head of eSSD Product Development at SK hynix, emphasized that the company is innovating both NAND flash memory and controllers with new architectures tailored to AI needs. SK hynix plans to begin sampling these AI-optimized NAND solutions by the end of 2026, positioning itself as a key player in the coming era of AI-driven storage infrastructure.

The announcement highlights SK hynix’s commitment to supporting the AI ecosystem’s rapid growth by tackling the challenging demands of speed, scale, and energy efficiency inherent in AI data processing. These advances promise to accelerate machine learning, deep learning, and inference applications across cloud data centers and edge environments such as autonomous vehicles, robotics, and personalized AI assistants.

By delivering NAND storage solutions customized for the AI era, SK hynix aims to reduce latency, increase throughput, and improve power efficiency, directly impacting the performance and scalability of future AI systems worldwide.