Computation and language built into memory… Samsung and SK go all-out on 'super-gap' AI semiconductor convergence technology
[SEBB Report - Tectonic Shift in the Semiconductor Ecosystem] ① K-Semiconductors lead the next generation
Samsung develops world's first HBM-PIM, combining memory with AI processing
'High performance, high capacity, low power' the key words
Memory's functions and roles expand sharply with built-in computation
SK unveils 'AiMX' for language understanding and generation
Following commercialization of 5th-generation HBM3E,
the challenge of design-production convergence with HBM4
|HBM market share forecast. / Data: TrendForce|
[Daehan Economy = Reporter Han Hyeong-yong] Changes in the artificial intelligence (AI) semiconductor ecosystem are expected to center on processing-in-memory (PIM) chips, developed by Samsung Electronics and SK Hynix, which add computational functions to conventional memory. In particular, plans to design and produce HBM4, the 6th-generation high-bandwidth memory (HBM), by fusing logic semiconductors (GPUs) and memory semiconductors (HBM) into one are now under discussion, foreshadowing a shift in the global semiconductor market landscape.
According to industry sources on the 20th, the applications of memory semiconductors are rapidly expanding beyond smartphones and PCs to data centers. Memory semiconductors themselves are also evolving: the goal is to serve as faster, more efficient storage while also taking on some computational functions.
The company that responded most quickly was Samsung Electronics. In February 2021, Samsung Electronics developed the world's first HBM-PIM, which combines a memory semiconductor and an AI processor into one. PIM is a next-generation convergence technology that adds computational functions to memory semiconductors, thereby relieving the data-movement bottleneck in AI and big-data processing.
Samsung Electronics explains that installing HBM-PIM in an AI system increases performance and reduces energy usage compared with systems using conventional HBM. Because HBM-PIM supports the existing HBM interface as is, customers can build a powerful AI accelerator system around it without changing their hardware or software.
Previously, Lee Jeong-bae, president and head of Samsung Electronics' memory business, named 'high performance, high capacity, and low power' as the key words for next-generation memory semiconductors in the Samsung Electronics newsroom on the 17th of last month. He also emphasized PIM technology, saying, "As data grows rapidly, the role of memory is expanding: beyond data storage, memory's original function, additional computational functions are now required."
SK Hynix developed its own PIM in February 2022 and introduced a sample of 'GDDR6-AiM (Accelerator in Memory)' as the first product to apply it. The company then participated in 'Supercomputing 2023', held in Denver, Colorado, USA from the 12th to the 17th, and demonstrated AiMX (AiM-based Accelerator), a generative AI accelerator. AiMX is specialized for running large language models (LLMs), which understand and generate human language. The product contains SK Hynix's 'GDDR6-AiM' chips, built on PIM, a next-generation technology that relieves data-movement bottlenecks by giving memory semiconductors their own computing functions.
In addition, with the commercialization of HBM3E, the 5th-generation HBM driving the AI boom, a change in the development approach for HBM4 has also been announced: fusing memory semiconductors and GPUs (logic semiconductors), which have until now been designed separately, into one. Currently, the GPU's main computational functions sit on a separate chip, apart from the HBM.
According to industry sources, starting with HBM4, SK Hynix is taking on the challenge of implementing memory and logic semiconductors together on the same die (each of the square pieces cut from a round wafer, onto which integrated circuits are built). If this direction holds, the semiconductor design and production landscape is expected to change as the roles of memory, system semiconductors, fabless firms (chip design), and foundries (contract manufacturing) converge.
An industry insider said, "Semiconductors are evolving in step with AI, and Korean companies are responding to the changing market a step ahead," adding, "It is only a matter of time before next-generation memory, such as the PIM our companies have developed, is commercialized."
Reporter Han Hyeong-yong je8day@
〈ⓒ Daehan Economic Daily (www.dnews.co.kr), Reproduction, collection, and redistribution prohibited〉