Friday, September 20, 2024

SK Hynix Displays A New AiMX At The AI Hardware Summit 2024


SK Hynix presented an upgraded AiMX solution at the AI Hardware & Edge AI Summit 2024.

AI Hardware Summit 2024

SK Hynix introduced an improved Accelerator-in-Memory based Accelerator (AiMX) card at the AI Hardware & Edge AI Summit 2024, held in San Jose, California, in September. The conference, organized annually by Kisaco Research, brings together professionals from across the machine learning and artificial intelligence ecosystem to exchange innovations and industry advances. This year's event focused on energy and cost efficiency across the whole technology stack.

  • SK Hynix, making its fourth appearance at the summit, demonstrated how its AiM solutions can improve AI performance in data centers and on edge devices.
  • Accelerator in Memory (AiM) is SK Hynix's brand name for its processing-in-memory (PIM) semiconductor products, including GDDR6-AiM.

Edge devices are devices that sit at the boundary between two networks and regulate the flow of data between them. They perform several functions, but their primary role is to serve as a network's entry or exit point.

Introducing the New AiMX

High-performance memory solutions are essential to the seamless operation of LLMs in the AI era. As these models grow and are trained on ever-larger datasets, the need for more efficient solutions keeps rising. SK Hynix addresses this need with its PIM product AiMX, an AI accelerator card that combines several GDDR6-AiM chips to deliver high bandwidth with exceptional energy efficiency.
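LLM token generation is typically limited by how quickly weights and cached data can be streamed out of memory, which is why bandwidth and energy per bit matter so much here. The sketch below is a rough, back-of-envelope model of that limit; the model size, weight precision, and bandwidth tiers are placeholder assumptions for illustration, not SK Hynix or AiMX specifications.

```python
# Rough model of memory-bound LLM decoding: generating one token requires
# streaming the model weights once, so tokens/sec is bounded by bandwidth.
# All numbers are illustrative placeholders, not SK Hynix or AiMX specifications.

def max_tokens_per_sec(params_billions: float, bytes_per_param: float,
                       bandwidth_gb_s: float) -> float:
    """Upper bound on decode throughput when weight streaming dominates."""
    bytes_per_token = params_billions * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / bytes_per_token

if __name__ == "__main__":
    # Hypothetical 70B-parameter model stored as 8-bit weights.
    for bw in (100.0, 400.0, 1600.0):   # GB/s, placeholder bandwidth tiers
        print(f"{bw:6.0f} GB/s -> {max_tokens_per_sec(70, 1, bw):.1f} tokens/s")
```

The point of the toy model is simply that raising usable memory bandwidth raises the ceiling on tokens per second, which is the lever a card built from multiple GDDR6-AiM chips is pulling.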

Large language models (LLMs) are advanced AI systems trained on large datasets in order to produce language that is both understandable and human-like. They enable applications such as translation and other natural language processing tasks.

Processing-In-Memory (PIM) is a cutting-edge technology that reduces data transfer between the processor and memory by embedding processing capabilities into the memory itself. This increases efficiency and speed, particularly for data-intensive workloads such as large language models (LLMs), where prompt data access and processing are critical.
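To see the data-movement argument concretely, the toy sketch below counts the bytes that cross the processor-memory bus for a single dot product when the reduction happens on the host versus near the memory. It is a conceptual illustration only and does not describe how GDDR6-AiM is implemented internally.

```python
# Conceptual illustration of the PIM idea: counting bytes that cross the
# processor<->memory bus for a dot product. A toy model, not a description
# of how GDDR6-AiM actually works inside.
import numpy as np

def host_side_dot(weights: np.ndarray, activations: np.ndarray):
    """Conventional path: all weights travel over the bus to the processor."""
    bytes_moved = weights.nbytes + activations.nbytes
    return float(weights @ activations), bytes_moved

def memory_side_dot(weights: np.ndarray, activations: np.ndarray):
    """PIM-style path: activations go to memory, only the scalar result returns."""
    bytes_moved = activations.nbytes + np.dtype(np.float32).itemsize
    return float(weights @ activations), bytes_moved

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.standard_normal(4096).astype(np.float32)
    x = rng.standard_normal(4096).astype(np.float32)
    for name, fn in (("host-side", host_side_dot), ("memory-side", memory_side_dot)):
        result, moved = fn(w, x)
        print(f"{name:11s}: result={result:+.2f}, bytes over bus={moved}")
```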



SK Hynix unveiled its improved 32 GB AiMX prototype at the AI Hardware & Edge AI Summit 2024, a card with double the capacity of the first card shown at last year's event. SK Hynix used the Llama 3 70B model, an open-source LLM, to demonstrate the prototype card's enhanced processing capabilities in a multi-batch scenario. The presentation specifically highlighted AiMX's potential to serve as a highly efficient attention accelerator in data centers.

Image: SK Hynix demonstration of the prototype card using the open-source Llama 3 70B model. Image credit: SK Hynix
  • Multi-batch processing is a form of computer processing in which many jobs are grouped together and processed by the system at once.
  • Llama 3: an open-source LLM family of pretrained and instruction-tuned language models created by Meta.
  • Attention mechanisms provide LLMs with textual context; they are crucial because they reduce the likelihood of misinterpretation and let the model produce outputs that are more precise and appropriate to the given context (a rough sketch of the computation follows this list).
  • The Llama 3 70B model was used in the demonstration to showcase the enhanced AiMX's processing power.
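To make the attention-accelerator angle concrete, the minimal sketch below computes single-head attention for a batch of decoding requests. Each generated token requires re-reading the entire key/value cache, which is why this step tends to be bound by memory bandwidth rather than compute. The sizes and code are illustrative assumptions, not a description of the AiMX demo itself.

```python
# Minimal single-head attention for batched decoding, to show why the step
# is dominated by reading the per-request key/value cache from memory.
# A toy sketch only; it does not reflect AiMX's internal design.
import numpy as np

def decode_attention(q, k_cache, v_cache):
    """q: (batch, d); k_cache, v_cache: (batch, seq_len, d). Returns (batch, d)."""
    d = q.shape[-1]
    scores = np.einsum("bd,bsd->bs", q, k_cache) / np.sqrt(d)    # read all keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return np.einsum("bs,bsd->bd", weights, v_cache)             # read all values

if __name__ == "__main__":
    batch, seq_len, d = 8, 4096, 128      # illustrative sizes only
    rng = np.random.default_rng(0)
    q = rng.standard_normal((batch, d)).astype(np.float32)
    k = rng.standard_normal((batch, seq_len, d)).astype(np.float32)
    v = rng.standard_normal((batch, seq_len, d)).astype(np.float32)
    out = decode_attention(q, k, v)
    cache_bytes = k.nbytes + v.nbytes
    print(f"output shape {out.shape}, KV cache read per step: {cache_bytes/1e6:.0f} MB")
```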

AiMX tackles the cost, performance, and power-consumption challenges that LLMs pose in data centers, edge devices, and on-device AI applications. For example, when used for mobile on-device AI, AiMX delivers three times the LLM speed of mobile DRAM while keeping power consumption the same.


Featured Presentation: Accelerating LLM Services from Data Centers to Edge Devices

SK Hynix presented on the last day of the conference, explaining why AiMX is a strong option for accelerating LLM services in both data centers and edge devices. Euicheol Lim, research fellow and leader of the Solution Advanced Technology team, discussed the vision for AiM going forward, including the company's plan to build on-device AI solutions based on mobile DRAM. Lim concluded by underlining how important it is to work closely with the companies that build and operate data centers and edge systems in order to further enhance AiMX solutions.

Looking Ahead: SK Hynix’s Prospects for AI-Powered AiMX

The AI Hardware & Edge AI Summit 2024 gave SK Hynix a stage on which to showcase the uses of AiMX for LLM workloads in data centers and on edge devices. With its low power consumption and high speed, AiMX is expected to play a major role in the development of AI and LLM applications.
