Samsung Electronics Co., the world’s largest memory chip maker, on Tuesday unveiled new AI-enabled solutions for mainstream DRAM modules and mobile memory, hoping to take the lead in the burgeoning AI chip market.
At Hot Chips 33 – a leading next-generation semiconductor conference – held online on Tuesday, Samsung Electronics introduced multiple solutions and use cases built on processing-in-memory (PIM) technology. PIM is a next-generation intelligent semiconductor technology that converges memory and logic chips, enabling computing tasks inside the memory itself.
PIM is regarded as a core enabler of AI innovation. It dramatically improves memory performance, boosting the energy efficiency and speed of AI systems for tasks such as image classification, voice recognition, and machine translation.
In the traditional computing architecture, which pairs a central processing unit (CPU) with DRAM, the CPU fetches data from DRAM, processes it, and writes the results back to DRAM for storage. A surge in the volume of data to be processed can create a bottleneck on this path, limiting high-performance, high-efficiency AI systems. PIM technology combines the memory chip with an AI processor, allowing computation to take place inside the memory without going through the CPU, thereby relieving the bottleneck.
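As a rough illustration of the bottleneck described above, the toy sketch below counts memory-bus transfers for a simple element-wise job under the two approaches. It is plain Python with invented function names; it models no real PIM interface or Samsung design, only the idea that in-memory computing avoids CPU round trips.

```python
# Toy model (illustrative only, not a real hardware API): compare data
# traffic between a CPU and DRAM for an element-wise workload.

def conventional_compute(dram_data, op):
    """CPU fetches each element from DRAM, computes, writes it back."""
    transfers = 0
    results = []
    for value in dram_data:
        transfers += 1          # DRAM -> CPU fetch
        result = op(value)      # compute on the CPU
        transfers += 1          # CPU -> DRAM write-back
        results.append(result)
    return results, transfers

def pim_compute(dram_data, op):
    """A compute unit inside the memory operates in place: no round trips."""
    transfers = 0               # data never crosses the CPU-memory bus
    results = [op(value) for value in dram_data]
    return results, transfers

data = list(range(1000))
double = lambda x: x * 2

_, bus_traffic = conventional_compute(data, double)
_, pim_traffic = pim_compute(data, double)
print(bus_traffic, pim_traffic)  # 2000 vs 0 bus transfers
```

In this simplified model, the conventional path incurs two bus transfers per element, while the in-memory path incurs none; real systems still move data, but far less of it, which is where the energy and speed gains cited below come from.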
At the conference, Samsung Electronics unveiled real-system application examples of its PIM-enabled High Bandwidth Memory (HBM-PIM), its Acceleration DIMM (AXDIMM), which mounts an AI engine on a DRAM module, and its LPDDR5-PIM technology, which combines mobile DRAM with PIM.
AXDIMM, which applies an AI engine to the DRAM module, expands the unit of PIM technology from the chip to the module. By requiring less data movement between the CPU and the DRAM module, AXDIMM improves energy efficiency and accelerates processing, and it can be deployed in existing systems without alteration.
A test with enterprise software company SAP showed that AXDIMM delivers approximately twice the performance in AI-based recommendation applications along with a 40 percent reduction in system-wide energy use, the company said.
Samsung Electronics also unveiled LPDDR5-PIM, which extends PIM technology to the mobile sector. Based on simulations, LPDDR5-PIM can more than double performance while reducing energy usage by over 60 percent in applications such as voice recognition, translation, and chatbots.
At the conference, the tech giant also presented test results from applying HBM-PIM, first unveiled in February, to an actual system. Samsung Electronics said that integrating HBM-PIM with the Virtex Ultrascale+ AI accelerator from U.S. chipmaker Xilinx delivered 2.5 times the system performance and a more than 60 percent cut in energy consumption.
Samsung Electronics plans to diversify its PIM product portfolio and partner with various global enterprises to solidify its position in the next-generation memory chip ecosystem. It aims to shift the memory market paradigm through the integration of AI and memory.
“HBM-PIM is the industry’s first AI-tailored memory solution being tested in customer AI-accelerator systems, demonstrating tremendous commercial potential,” said Kim Nam-sung, senior vice president of DRAM product and technology at Samsung Electronics. “Through standardization of the technology, applications will become numerous, expanding into HBM3 for next-generation supercomputers and AI applications, and even into mobile memory for on-device AI as well as for memory modules used in data centers.”
The AI application sector, which includes the metaverse, voice recognition, machine translation, and autonomous driving, meanwhile, has been expanding, leaving room for growth in the AI chip sector.
According to the Korea Information Society Development Institute, the global AI chip market is projected to reach $117.9 billion in 2030, up from $18.45 billion last year.
By Noh Hyun and Lee Eun-joo
[ⓒ Pulse by Maeil Business News Korea & mk.co.kr, All rights reserved]