Samsung CXL memory appliance with orchestration console redefines possibilities in memory technology
• CMM-B serves as a memory pool that supports the CXL 1.1 and 2.0 protocols and can host up to 24x E3.S CMM-D devices, with total capacity ranging from 3 to 24TB. These software-defined memory solutions, powered by Compute Express Link (CXL), are designed to provide the scalability, manageability, configurability, and flexibility required as data centers transition to next-gen disaggregated system architectures.
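The stated 3–24TB range follows directly from the slot count and the per-device capacity. A minimal sketch of that arithmetic, assuming 128GB and 1TB CMM-D device sizes inferred from the stated totals (illustrative figures, not quoted from the announcement):

```python
# Sketch: how the CMM-B pool's 3-24TB range follows from its 24 E3.S
# CMM-D slots. Device capacities below are inferred from the stated
# totals (24 x 128GB = 3TB, 24 x 1TB = 24TB), not from the announcement.
MAX_SLOTS = 24

def pool_capacity_tb(device_capacity_gb: int, populated_slots: int = MAX_SLOTS) -> float:
    """Total pool capacity in TB for a given CMM-D device size."""
    assert 0 < populated_slots <= MAX_SLOTS, "appliance hosts at most 24 devices"
    return populated_slots * device_capacity_gb / 1024

print(pool_capacity_tb(128))   # fully populated with 128GB devices -> 3.0 TB
print(pool_capacity_tb(1024))  # fully populated with 1TB devices   -> 24.0 TB
```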
Kioxia Unveils Upcoming High-Capacity LC9 Series 122.88TB NVMe 2.5-Inch SSD for AI Applications
• The LC9 is the firm’s first product based on a 2Tb QLC die built on BiCS8 3D flash, providing a PCIe 5.0 interface and dual-port capability for fault tolerance or connectivity to multiple compute systems. These high-capacity QLC-based SSDs are suitable for deployment in hybrid-cloud and multi-cloud systems, where they feed training and inference data to AI server systems.
Sandisk introduces iNAND AT EU752 UFS4.1 EFD
• Building on its portfolio of advanced automotive-grade storage technology, the iNAND AT EU752 is the first automotive-grade UFS4.1 interface device.
• Automotive AI systems such as Advanced Driver Assist Systems (ADAS), Autonomous Driving (AD), and eCockpit need to pull information from sensors, maps, and AI databases to function safely. iNAND embedded flash drives are crucial to ensuring data is available and reliable when needed, providing real-time in-vehicle storage for the AD computer to overcome the latency and connectivity issues that arise when accessing the cloud.
SK hynix showcased memory technology for AI data centers at NVIDIA GTC 2025
• Among the industry-leading AI memory technologies on display were 12-high HBM3E and SOCAMM*, a new memory standard for AI servers. The company plans to complete preparatory work for large-scale production of 12-high HBM4 within 2H 2025, ready to begin supply immediately upon order.
* SOCAMM (Small Outline Compression Attached Memory Module): a low-power DRAM-based memory module for AI servers
Micron innovates from data center to edge with NVIDIA
• The firm maintains its leadership in designing and delivering LPDDR for data center applications as the first and only memory company shipping both HBM3E and SOCAMM products for AI servers in the data center.
• The company’s SOCAMM, a modular LPDDR5X memory solution, was developed in collaboration with NVIDIA to support the NVIDIA GB300 Grace Blackwell Ultra Superchip. The deployment of Micron HBM3E products in NVIDIA Hopper and NVIDIA Blackwell systems underscores Micron’s critical role in accelerating AI workloads.
WW SSD demand peaked in Q4’24, the highest quarterly shipment total of 2024, and 2025 shipments are expected to stay above the previous level.
Flash Sufficiency: has remained at a low level since 2024.
Price Trend: NAND flash prices dropped to a low level in Q4’24, but Q1’25 is seeing a slight rebound as chip makers plan production cuts, given that overall demand is not significantly strong.
DRAM Sufficiency: the 8Gb contract price dropped slightly in 2025/B but will stay relatively high above the lowest spot-price level, given consistent production cuts on DDR4.
Price Trend: remained at an extremely low level throughout 2024, but production is likely to increase in 1H 2025 to fulfill HBM demand.
Ⓒ 2025 ADATA Technology Co., Ltd. All Rights Reserved.