Samsung shipped LPDDR6X memory samples to Qualcomm this week, according to reports from The Bell cited by SamMobile and Windows Report. The samples target Qualcomm's upcoming AI250 accelerator, which could feature over 1TB of the next-generation memory.
The move comes as Samsung announced mass production of sixth-generation high-bandwidth memory (HBM4) chips. Samsung appears to be regaining competitiveness in memory after trailing rival SK hynix in the AI boom.
In 2025, SK hynix surpassed Samsung in operating profit for the first time, according to CNBC. In the third quarter of last year, SK hynix held a 57% revenue share of the HBM market versus Samsung's 22%, and it reportedly secured over two-thirds of HBM supply orders for Nvidia's next-generation Vera Rubin products.
LPDDR6 offers maximum per-pin data transfer speeds of 14.4Gbps, a 44% improvement over LPDDR5X chips, and peak bandwidth of 38.4GB/s, a 20% increase. Initial LPDDR6 parts run at 10.7Gbps and offer 21% higher efficiency than LPDDR5, according to Windows Report.
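For readers who want to see how per-pin figures relate to bandwidth, here is a minimal Python sketch of the standard conversion: peak bandwidth is the per-pin data rate multiplied by the interface width in bits, divided by 8. The interface widths below are illustrative assumptions, not confirmed LPDDR6X channel configurations, so the outputs demonstrate the arithmetic rather than the exact per-package figures quoted above.

```python
def peak_bandwidth_gb_per_s(data_rate_gbps: float, interface_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin data rate (Gbps) times interface width (bits), divided by 8."""
    return data_rate_gbps * interface_width_bits / 8

# Per-pin rates cited in the article; interface widths are assumptions for illustration only.
for rate_gbps in (10.7, 14.4):
    for width_bits in (16, 32):
        bw = peak_bandwidth_gb_per_s(rate_gbps, width_bits)
        print(f"{rate_gbps} Gbps x {width_bits}-bit interface -> {bw:.1f} GB/s")
```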
Qualcomm's AI250 accelerator will succeed the current AI200, which already supports up to 768GB of LPDDR memory, and could push capacity beyond 1TB using LPDDR6X. This approach differs from that of Nvidia, AMD, and Huawei, which typically rely on HBM for their top-tier AI accelerators.
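As a rough sketch of the capacity arithmetic behind those figures, the snippet below counts how many LPDDR packages of a given density it would take to reach 768GB and 1TB. The 64GB-per-package density is an assumption chosen for illustration; the AI250's actual memory configuration is not specified in the report.

```python
import math

def packages_needed(target_gb: int, package_gb: int) -> int:
    """Number of memory packages required to reach a target capacity, rounded up."""
    return math.ceil(target_gb / package_gb)

PACKAGE_GB = 64  # assumed per-package density, for illustration only
for target_gb in (768, 1024):  # the AI200's 768GB and the >1TB figure floated for the AI250
    count = packages_needed(target_gb, PACKAGE_GB)
    print(f"{target_gb} GB -> {count} packages at {PACKAGE_GB} GB each")
```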
The U.S. government approved annual export licenses allowing Samsung and SK hynix to ship chipmaking equipment to their manufacturing facilities in China throughout 2026, according to Tom's Hardware. The licenses replace a waiver system that lapsed on December 31.
Samsung plans to expand its production capacity by around 50% in 2026, while SK hynix will increase infrastructure investment more than fourfold over previous levels, according to Data Center Dynamics. Both companies are constructing new fabs in South Korea to meet demand from AI customers.
SK hynix led the HBM market with 62% share in the second quarter of 2025, followed by Micron at 21% and Samsung at 17%, according to Astute Group. Analysts at Counterpoint Research forecast Samsung's position will strengthen as its HBM3E parts qualify with major customers and HBM4 enters full-scale supply in 2026.
"SK Hynix is clearly an outstanding 'AI Winner' in Asia," said MS Hwang, research director at Counterpoint Research. The company's lead in quality and supply of HBMs and other chips used in AI servers has been crucial in the current phase of the AI infrastructure boom.
LPDDR6X remains under development, and JEDEC has not yet finalized its official specification; more technical details are expected later this year. The memory is unlikely to reach mainstream availability until late 2027 or early 2028.
Samsung's sample shipments to Qualcomm signal early ecosystem alignment with key partners, and they arrive alongside the company's HBM4 mass-production ramp. Micron, for its part, told investors it expects to sell out its HBM production capacity for 2026 and forecast an annualized HBM revenue run-rate of around $8 billion, according to Reuters.