Samsung & SK Hynix GPU DRAM Prices Shoot Up: HBM3 5x More Expensive As Demand Grows For NVIDIA GPUs In ChatGPT

NVIDIA's flagship Datacenter GPU, the Hopper H100, has been pictured in all its glory. (Image Credits: CNET)

The increasing demand for NVIDIA GPUs to power ChatGPT has driven up the prices of DRAM, including HBM from Samsung & SK Hynix.

NVIDIA GPU Demand For ChatGPT Also Raises Samsung's & SK Hynix's HBM DRAM Prices

We have reported over the last few weeks that NVIDIA GPUs are the most popular choice for AI tools such as ChatGPT, and CEO Jensen Huang has called AI the biggest innovation within the computing landscape. Now, DRAM makers have started to raise the prices of high-bandwidth memory (HBM) solutions, which are used to power NVIDIA's AI GPUs.

A report from South Korean outlet BusinessKorea reveals that DRAM manufacturers such as SK Hynix and Samsung have raised the prices of their memory solutions, including HBM. NVIDIA is said to have asked SK Hynix to increase its HBM3 production capacity, but other vendors such as Intel are also looking to integrate HBM3 within their own next-gen products, which means SK Hynix may not be able to keep up with demand. As such, the prices of HBM memory, especially the latest HBM3 solution, have shot up by as much as 5x.

The advent of ChatGPT, an artificial intelligence (AI) chatbot, is providing opportunities for Korean memory semiconductor makers to create a new business. ChatGPT learns from vast amounts of data through super-large AI models and answers questions naturally. DRAM data processing speed has become important for better and faster ChatGPT services, and Korean companies produce all of the high-performance DRAMs essential for this.

Nvidia, the world's largest GPU company, has been asking SK Hynix to supply its latest product, HBM3 chips. Intel, the world's No. 1 server CPU company, is also working hard to sell products equipped with SK Hynix's HBM3. An industry insider said, “The price of HBM3 increased up to five times compared to the highest-performance DRAM.”

via BusinessKorea

But it's not just HBM3 that has gone up in price: older HBM standards such as HBM2 and HBM2e, featured on NVIDIA's last-gen Ampere and Volta GPUs, which also offer leading AI capabilities, are affected as well. These chips are in high demand within the AI industry and useful for tools such as ChatGPT. SK Hynix alone controls the majority of the HBM market, with a share of 60-70%.

Recently, analysts and industry insiders have stated that the success of ChatGPT could benefit NVIDIA greatly. OpenAI, the creator of ChatGPT, is already using around 25,000 NVIDIA GPUs to power its current server needs. With demand increasing and more competing solutions on the way, NVIDIA's GPUs, known to offer the best AI capabilities, look set to remain a popular choice, which is also why the industry predicts that demand for NVIDIA's GPUs will outstrip supply in the coming quarters.

The post Samsung & SK Hynix GPU DRAM Prices Shoot Up: HBM3 5x More Expensive As Demand Grows For NVIDIA GPUs In ChatGPT by Hassan Mujtaba appeared first on Wccftech.



Reference: https://wccftech.com
