Wednesday, February 21, 2024

Qualcomm Cloud AI 100 Lifts AWS’s Latest EC2!

Qualcomm Cloud AI 100 in AWS EC2

The Qualcomm Cloud AI 100 launch on AWS, which built on the company's technology collaboration with AWS, marked the first major milestone in the joint effort: the general release of the new Amazon Elastic Compute Cloud (Amazon EC2) DL2q instances. The DL2q instances are the first deployment of Qualcomm's artificial intelligence (AI) solution in the cloud.

The Qualcomm Cloud AI 100 accelerator’s multi-core architecture is both scalable and flexible, making it suitable for a broad variety of use-cases, including:

  • Large Language Models (LLMs) and Generative AI: Supporting models with up to 16B parameters on a single card, and 8x that in a single DL2q instance, for creativity and productivity use cases.
  • Classic AI: This includes computer vision and natural language processing.

At this year's AWS re:Invent 2023, Qualcomm showcased a variety of applications running on Amazon EC2 DL2q instances powered by Qualcomm Cloud AI 100:

  1. A conversational AI built on the Llama 2 7B-parameter LLM.
  2. Text-to-image generation with the Stable Diffusion model.
  3. Simultaneous transcription of multiple audio streams with the Whisper Lite model.
  4. Translation between several languages with the transformer-based Opus model.
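The third demo above depends on keeping many inference requests in flight at once. As a rough illustration of that pattern only (this is not the Qualcomm Cloud AI 100 SDK, and `transcribe_chunk` is a hypothetical stand-in for a real speech-to-text call), several audio streams can be fanned out over Python's standard thread pool:

```python
from concurrent.futures import ThreadPoolExecutor

def transcribe_chunk(stream_id: int, audio_chunk: bytes) -> str:
    # Placeholder for a real speech-to-text call (e.g. a Whisper-style
    # model served on an accelerator); here we just tag the chunk.
    return f"stream-{stream_id}: {len(audio_chunk)} bytes transcribed"

def transcribe_streams(streams: dict) -> dict:
    # Fan out one in-flight request per audio stream so the backend
    # can process them concurrently rather than one at a time.
    with ThreadPoolExecutor(max_workers=len(streams)) as pool:
        futures = {sid: pool.submit(transcribe_chunk, sid, chunk)
                   for sid, chunk in streams.items()}
        return {sid: f.result() for sid, f in futures.items()}

streams = {1: b"\x00" * 1600, 2: b"\x00" * 3200, 3: b"\x00" * 800}
results = transcribe_streams(streams)
```

In a real deployment the worker function would issue a network or SDK call to the accelerator; the fan-out structure stays the same.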

Nakul Duggal, SVP & GM, Automotive & Cloud Computing at Qualcomm Technologies, Inc., stated, “Working with AWS is empowering us to build on our established industry leadership in high-performance, low-power deep learning inference acceleration technology.” The work done so far shows how well cloud technologies can be integrated into software development and deployment cycles.

An affordable revolution in AI

Thanks to the Amazon EC2 DL2q instance, EC2 customers can run inference on a variety of models with best-in-class performance per total cost of ownership (TCO). For example:

  • For DL inference models, a price-performance advantage of up to 50% compared with the latest generation of GPU-based Amazon EC2 instances.
  • For CV-based security workloads, a greater than 3x reduction in the number of inference cards, resulting in a significantly more affordable system solution.
  • Support for models optimized on Qualcomm Cloud AI 100 to be up to 2.5x smaller.
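The price-performance claim above compares throughput obtained per dollar of instance time across instance types. With purely hypothetical prices and throughputs (none of these numbers come from AWS or Qualcomm), the arithmetic works like this:

```python
def perf_per_dollar(inferences_per_sec: float, price_per_hour: float) -> float:
    # Inferences obtained per dollar of instance time.
    return inferences_per_sec * 3600 / price_per_hour

# Hypothetical figures for illustration only.
gpu = perf_per_dollar(inferences_per_sec=1000, price_per_hour=20.0)
dl2q = perf_per_dollar(inferences_per_sec=900, price_per_hour=12.0)

# Fractional price-performance advantage of DL2q over the GPU baseline.
advantage = (dl2q - gpu) / gpu
```

With these made-up numbers the DL2q instance delivers slightly less raw throughput but at a much lower hourly price, netting a 50% price-performance advantage, which is the shape of comparison the bullet describes.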

The DL2q instance features the Qualcomm AI Stack, which offers a consistent developer experience across Qualcomm AI in the cloud and other Qualcomm products.

The DL2q instances and Qualcomm edge devices are powered by the same Qualcomm AI Stack and base AI technology, giving users a consistent developer experience with a single application programming interface (API) across their:

  1. Cloud,
  2. Automotive,
  3. PC,
  4. Extended reality (XR), as well as
  5. Smartphone development environments.

Customers can use the AWS Deep Learning AMI (DLAMI), which comes with Qualcomm's SDK prepackaged alongside popular machine learning frameworks such as PyTorch and TensorFlow.
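As a hedged sketch of how such an instance might be brought up with the AWS CLI (the `dl2q.24xlarge` instance type is real, but the AMI ID, key-pair name, and name filter below are placeholders to substitute with your own values):

```shell
# Look up a Deep Learning AMI ID in your region (name filter is illustrative).
aws ec2 describe-images \
    --owners amazon \
    --filters "Name=name,Values=Deep Learning AMI*" \
    --query 'Images[0].ImageId'

# Launch a DL2q instance from the chosen AMI (placeholder values).
aws ec2 run-instances \
    --image-id ami-0123456789abcdef0 \
    --instance-type dl2q.24xlarge \
    --key-name my-key-pair \
    --count 1
```

Once the instance is running, the preinstalled SDK and frameworks from the DLAMI are available without further setup.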

Since June 2023, Drakshi has been writing articles on Artificial Intelligence for Govindhtech. She holds a postgraduate degree in business administration and is an Artificial Intelligence enthusiast.

