NVIDIA & Meta are reportedly gearing up to expand data center facilities, as the companies demand additional supply of HBM and DDR5 memory from SK Hynix.
NVIDIA & Meta Plan to Accelerate GenAI Development Through Rapid Expansion of Data Centers, Demand More DRAM Supply From SK hynix
With the rising influence of AI, companies like Meta have shifted their focus toward AI servers to capitalize on the momentum. A prime example is the AI Research SuperCluster, which the company unveiled a year ago, built on NVIDIA's A100 GPUs. Meta aims to integrate GenAI into its mainstream social media platforms, and it is now working to acquire the advanced hardware needed to reach that goal.
Industry sources report that Meta officials visited SK Hynix's Icheon facility, where they toured the HBM and DDR5 memory divisions. Meta also conducted a quality test of DDR5 memory modules during the visit, signaling its interest in the technology. Meta is already an established partner of SK Hynix, primarily sourcing storage solutions for its servers, but amid the current AI boom the company is now pursuing DDR5 memory supply as well. Unfortunately, it faces a major challenge in the form of an "unbalanced" supply chain.
DDR5 and HBM are in short supply compared to demand. Meta and Nvidia are constantly requesting more products from SK Hynix, so quality inspections as well as discussions for additional product supply have been carried out at the Icheon factory.
Meta's request for DDR5 memory has reportedly been met with reservations from SK Hynix, which says it is facing order backlogs due to the enormous demand. That is hardly surprising, since every company in the GenAI race is scrambling to acquire the hardware needed to run current-gen data centers, with DDR5 memory being a crucial component.
Apart from Meta, NVIDIA is also reportedly planning to visit SK Hynix's facilities soon to discuss next-gen HBM3e memory for its upcoming AI GPUs. The HBM market is currently seeing a rebound, mainly owing to its significance in AI GPUs, especially NVIDIA's H100. NVIDIA has already sampled HBM3e memory, and by the looks of it, the standard is ready to be integrated into next-gen AI products.
Since Team Green is planning a massive production ramp, suppliers will likely have to push their output to the limit to meet the huge demand. With the aim of shipping 1.5 to 2 million H100s in 2024, NVIDIA is looking for a steady supply chain and, ideally, a way out of the current order backlog.
For suppliers like SK Hynix, the AI boom is a lifeline out of a string of weak financial quarters. The Korean giant has posted losses since the beginning of the year and has effectively "bet" its future on AI. Since it currently holds an 80% global share of the HBM market, we could see substantial revenue figures in the upcoming quarters.
Reference: https://wccftech.com