OpenAI’s ambitious Stargate data center initiative is poised to become one of the largest single consumers of memory in the world, potentially absorbing as much as 40% of global DRAM output. According to reports citing industry sources, Samsung Electronics and SK hynix have reached preliminary agreements to supply OpenAI with vast quantities of DRAM wafers to support the buildout of Stargate’s AI infrastructure.
Rather than delivering finished DRAM chips or high-bandwidth memory (HBM) stacks, the two South Korean memory giants are expected to provide undiced 300mm DRAM wafers. Estimated volumes could reach up to 900,000 wafers per month, an extraordinary figure in the context of the global semiconductor industry. Analysts note that total worldwide DRAM capacity is projected to reach roughly 2.25 million wafer starts per month in 2025, meaning Stargate alone could account for roughly 40% of that output.
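The reported share can be sanity-checked with simple arithmetic, using only the two figures cited in the article (900,000 wafers per month for Stargate, roughly 2.25 million global DRAM wafer starts per month projected for 2025):

```python
# Back-of-the-envelope check of the reported figures.
# Both inputs are the numbers reported in the article, not independent data.
stargate_wafers_per_month = 900_000
global_dram_wafer_starts_per_month = 2_250_000

share = stargate_wafers_per_month / global_dram_wafer_starts_per_month
print(f"Stargate share of global DRAM output: {share:.0%}")  # prints "Stargate share of global DRAM output: 40%"
```

This confirms that the 900,000-wafer figure corresponds to the 40% share cited for Stargate's potential memory consumption.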
The memory supplied is expected to span both commodity DRAM, such as DDR5, and advanced HBM used in AI accelerators. What remains unclear is where the wafers will ultimately be diced, packaged, and assembled into finished memory products and modules, a process that could involve additional partners.
Stargate, a project backed by OpenAI alongside Oracle and SoftBank, aims to build multiple hyperscale AI data centers around the world. These facilities are expected to house massive numbers of servers equipped with next-generation GPUs, advanced cooling systems, and high-capacity power delivery. The scale of the project is so large that dedicated power plants and unconventional infrastructure, including floating data centers, are reportedly under consideration.
Beyond memory supply, Samsung is exploring broader partnerships with OpenAI. Samsung SDS is expected to collaborate on data center architecture and operations in South Korea, while other Samsung units are evaluating roles in construction, maritime engineering, and enterprise AI deployment. Together, these efforts highlight how Stargate is reshaping global supply chains as AI infrastructure expands at unprecedented speed.