Sunday, July 7, 2024

Exclusive: China-Only A800 AI GPU Shift Amidst US Rules

NVIDIA A800 AI GPUs

According to reports, NVIDIA has shifted the supply of its “China-only” Ampere A800 AI GPUs in response to the US government’s tightened export restrictions.

Following the latest US export ban on China, NVIDIA plans to make the A800 AI GPUs available globally, giving AI-focused businesses another option.

The news comes as the Biden administration imposes more “vigorous” sanctions on China in an effort to slow the country’s rapidly advancing artificial intelligence industry. The H800 and A800 AI GPUs, which were already cut-down versions created by NVIDIA to comply with US trade rules, will no longer be available in China.

Given that access to high-end H100s was already restricted, the US rule change is likely to impede the growth of China’s AI market. The decision, however, has put GPU makers in “hot water”: they must now turn to alternative, “indirect” retail channels to clear their existing inventory.

Nvidia Offers The “Ultimate” Workstation Platform: An A800 40GB PCIe Card

Nvidia’s A800 40GB Active is a dual-slot PCIe card built around the A800 AI GPU. For servers in China, the A800 was also offered in an SXM form factor in addition to the PCIe card.

While the PCIe and SXM variants of the A800 were primarily intended to power servers in China, Nvidia is now promoting the A800 40GB Active PCIe card for high-end desktop workstations.

Said to “bring the power of a supercomputer to your workstation and accelerate end-to-end data science workflows,” the A800 40GB Active GPU is billed on the Nvidia website as the “ultimate workstation development platform for AI, data science, and high-performance computing.”

Like the A100 40GB, the Nvidia A800 40GB Active PCIe card packs 6,912 CUDA cores, 432 Tensor cores, and 40GB of high-bandwidth HBM2 memory, with a 240-watt power rating.

The A800 matches the A100’s PCIe and SXM variants with 9.7 teraflops of double-precision and 19.5 teraflops of single-precision performance.
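The quoted single-precision figure lines up with back-of-envelope arithmetic from the core count. The sketch below assumes a ~1.41 GHz boost clock (the A100’s published boost; the article does not state a clock) and one fused multiply-add, i.e. two floating-point operations, per CUDA core per cycle:

```python
# Sanity-check the quoted 19.5 TFLOPS FP32 figure.
# Assumption: ~1.41 GHz boost clock (A100's published boost, not stated here).
CUDA_CORES = 6912
FLOPS_PER_CORE_PER_CYCLE = 2  # one fused multiply-add = 2 FLOPs
BOOST_CLOCK_HZ = 1.41e9

tflops_fp32 = CUDA_CORES * FLOPS_PER_CORE_PER_CYCLE * BOOST_CLOCK_HZ / 1e12
print(f"Theoretical FP32: {tflops_fp32:.1f} TFLOPS")  # ~19.5 TFLOPS
```

Double precision on this architecture runs at half rate for Tensor-class parts’ FP64 units, which is why the FP64 figure (9.7 TFLOPS) is roughly half the FP32 number.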

The primary distinction is the speed at which an A800 can communicate with other A800s: its NVLink chip-to-chip bandwidth is 400 GB/s, down from the A100’s 600 GB/s.
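To put that NVLink cut in perspective, a rough sketch of peak GPU-to-GPU transfer time at each bandwidth (the 40 GB payload is hypothetical, chosen only to match the card’s memory capacity; real multi-GPU workloads transfer varying amounts and rarely hit peak bandwidth):

```python
# Illustrative effect of the 600 -> 400 GB/s NVLink reduction on
# peak chip-to-chip transfer time for a hypothetical 40 GB payload.
PAYLOAD_GB = 40  # assumption: one full 40 GB HBM2 buffer

for name, bw_gbps in [("A100 @ 600 GB/s", 600), ("A800 @ 400 GB/s", 400)]:
    seconds = PAYLOAD_GB / bw_gbps
    print(f"{name}: {seconds * 1000:.0f} ms")
```

In other words, interconnect-bound multi-GPU steps take up to 1.5x longer on the A800, while single-GPU workstation workloads are unaffected.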

In marketing materials, Nvidia stated that, compared with its Quadro GV100 PCIe card released in 2018, the A800 40GB Active GPU is 4.2 times faster for AI inference and 90% faster for AI training with the BERT Large model, 90% faster in the GTC benchmark, and 70% faster in the LAMMPS benchmark.

Nvidia also said that the A800 40GB Active GPU, like other AI chips in its lineup, comes with a three-year subscription to Nvidia AI Enterprise, the software suite comprising AI frameworks, libraries, pre-trained models, and tools for building and running AI applications.

According to reports, NVIDIA has joined forces with US-based partners including PNY and system integrator Colfax International to boost sales of the A800 AI GPUs, which were once intended solely for China.

To quickly refresh your memory, the A800 is a trimmed-down version of the original A100 GPU: key characteristics, most notably NVLink interconnect speed, were cut so the chip would comply with US export rules. Ironically, NVIDIA is now marketing this same GPU to the broader market as the “ultimate workstation development platform for AI, data science, and high-performance computing.”

The NVIDIA A800 40GB Active GPU is now available for purchase from PNY in North America, South America, Europe, Africa, and India. It’s worth remembering that the A800 was only ever an “alternative” to NVIDIA’s mainstream AI GPUs; nevertheless, given current demand from the AI sector, expanding the A800’s availability looks like the right call. It could prove a competitive substitute for GPUs such as the H100, which carries a significant backlog of orders due to supply chain issues and high industry demand.

Workarounds of this kind were anticipated, and given the circumstances the A800 is expected to sell like “hotcakes,” since NVIDIA’s order books for its flagship GPUs are reportedly full well into 2024. That also leaves an opening for other AI chipmakers to grab a share of NVIDIA’s AI goldmine.

Agarapu Ramesh is the founder of Govindhtech and a computer hardware enthusiast interested in writing tech-news articles. He has worked as an editor at Govindhtech for one year and previously worked as a computer assembling technician at G Traders in India from 2018. He holds an MSc.