Thursday, May 16, 2024

HBM4 Memory Debuts with Better Bandwidth

Latest news on HBM4 memory

According to a report published by DigiTimes, the memory bus width of HBM4 could theoretically reach 2048 bits. If that happens, it would open up a wide range of possibilities for artificial intelligence workloads and graphics processing units (GPUs).

The report also indicates that Samsung and SK Hynix are working together on a project named “2000 I/O Ports” for the HBM4 chip, a collaboration that could translate into substantially higher HPC throughput from future AI GPUs.

Citing Seoul Economy, DigiTimes forecasts that the memory bandwidth of next-generation HBM could roughly double compared with the previous generation.
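
To see why a wider bus matters, here is a back-of-the-envelope sketch. The per-pin data rate used below is purely an assumption for illustration, not an official HBM4 figure; the point is that at a fixed pin speed, doubling the interface width from 1024 to 2048 bits doubles theoretical per-stack bandwidth.

```python
# Illustrative sketch (not official specs): theoretical per-stack HBM bandwidth
# scales linearly with interface width at a fixed per-pin data rate.

def hbm_bandwidth_gbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gb/s."""
    return bus_width_bits / 8 * pin_rate_gbps

# Assumed per-pin data rate of 8 Gb/s, chosen only to make the comparison concrete.
PIN_RATE = 8.0

hbm3_like = hbm_bandwidth_gbps(1024, PIN_RATE)  # 1024-bit interface, as used since HBM's debut
hbm4_like = hbm_bandwidth_gbps(2048, PIN_RATE)  # hypothetical 2048-bit HBM4 interface

print(f"1024-bit stack: {hbm3_like:.0f} GB/s")
print(f"2048-bit stack: {hbm4_like:.0f} GB/s ({hbm4_like / hbm3_like:.1f}x)")
```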

That is a very big number for the industry. To put it in perspective, the HBM memory interface has not seen a width increase since 2015, so a jump of this magnitude would be exceptional.

On paper this development looks extremely promising, but it comes with a large number of “ifs.” Most of them concern how manufacturers will handle the data transfer rate and the changes required in individual memory stacks.

Comparing HBM4 and HBM3 memory

The industry is now integrating HBM3e into the latest generation of AI GPUs, where it can reach bandwidths of up to 5 TB/s per chip. That integration has delivered considerable performance improvements for NVIDIA’s popular H100 AI GPUs.
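
For context, the multi-TB/s figures quoted for AI GPUs come from placing several HBM stacks on one package. The sketch below uses assumed, illustrative per-stack bandwidths and stack counts rather than the configuration of any particular NVIDIA product; it simply shows how per-stack bandwidth multiplies up to a package total, and how a wider HBM4 interface could scale that total.

```python
# Rough sketch: a GPU package's aggregate memory bandwidth is the per-stack
# bandwidth multiplied by the number of HBM stacks on the package.
# Stack counts and per-stack figures are assumptions for illustration only.

def package_bandwidth_tbps(per_stack_gbps: float, num_stacks: int) -> float:
    """Total package bandwidth in TB/s from identical HBM stacks."""
    return per_stack_gbps * num_stacks / 1000  # GB/s -> TB/s

# Six HBM3e-class stacks at an assumed ~800 GB/s each land in the multi-TB/s range.
print(f"{package_bandwidth_tbps(800, 6):.1f} TB/s with six stacks")

# Doubling per-stack bandwidth, as a 2048-bit HBM4 interface could, doubles the total.
print(f"{package_bandwidth_tbps(1600, 6):.1f} TB/s with six wider stacks")
```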

According to the DigiTimes report, Samsung and SK Hynix are apparently working on adding “2000 I/O” ports to their next-generation HBM4 memory standard. This suggests the technology will be able to handle substantially larger computations and support significantly bigger LLMs.

HBM4 Memory (Image Credit: NVIDIA)

In plainer terms, this matters because HBM is a key component of the roadmap for next-generation genAI. The report has not disclosed any official progress yet, and there is still a large gap between where the technology stands today and where it needs to go; the reasons are covered in the next paragraph.

The AI field is going through a paradigm shift in which generative AI capabilities are being integrated into consumer applications, and the largest internet companies appear to be locked in a “race.” The result is substantial demand for AI GPUs, which rely on HBM as a core component, so memory makers are concentrating on keeping supply adequate. Advancements in the HBM industry are certainly on the horizon, but unless something is “cooking” behind the scenes that we are not yet aware of, they may take longer to arrive than expected. That is not to say there will be no new developments in the HBM sector in the near future.

Source
