Cloud-Based Enterprise AI from Intel and IBM. To help businesses scale AI, Intel and IBM will deploy Gaudi 3 AI accelerators on IBM Cloud.
Gaudi 3 AI Accelerator
IBM and Intel have announced a global collaboration to deploy Intel Gaudi 3 AI accelerators as a service on IBM Cloud. Expected to be available in early 2025, the offering aims to help enterprises scale AI more cost-effectively and to foster innovation backed by security and resilience.
The partnership will also bring Gaudi 3 support to IBM’s Watsonx AI and data platform. IBM Cloud is the first cloud service provider (CSP) to adopt Gaudi 3, and the offering will be available for hybrid and on-premises environments.
Intel and IBM
AI’s true potential requires an open, cooperative environment that gives customers alternatives and accessible solutions. By combining Xeon CPUs and Gaudi 3 AI accelerators with IBM Cloud, Intel and IBM are generating new AI capabilities and satisfying the need for affordable, secure, and cutting-edge AI computing solutions.
Why This Is Important: Although generative AI may speed up transformation, the amount of computational power needed highlights how important it is for businesses to prioritize availability, performance, cost, energy efficiency, and security. By working together, Intel and IBM want to improve performance while reducing the total cost of ownership for using and scaling AI.
Gaudi 3
Gaudi 3’s integration with 5th Gen Intel Xeon processors simplifies workload and application management, supporting enterprise AI workloads in data centers and the cloud while giving clients visibility into and control over their software stack. IBM Cloud and Gaudi 3 help clients scale enterprise AI workloads more affordably, with performance, security, and resilience as first priorities.
IBM’s Watsonx AI and data platform will support Gaudi 3 to improve model inferencing price/performance. This will give Watsonx clients access to extra AI infrastructure resources for scaling their AI workloads across hybrid cloud environments.
“IBM is dedicated to supporting our clients in driving innovation in AI and hybrid cloud by providing solutions that address their business demands,” said Alan Peacock, general manager of IBM Cloud. “Our commitment to security and resilience with IBM Cloud has helped fuel IBM’s hybrid cloud and AI strategy for our enterprise clients.”
Intel Gaudi 3 AI Accelerator
Clients will have access to a flexible enterprise AI solution that aims to optimize cost performance by utilizing IBM Cloud and Intel’s Gaudi 3 accelerators. The companies are opening up new AI business opportunities for their customers to test, develop, and deploy AI inferencing solutions more affordably.
IBM and Intel
How It Works: IBM and Intel are working together to deliver a Gaudi 3 service capability for customers running AI workloads. They intend to use IBM Cloud’s security and compliance features to support customers across a variety of sectors, including highly regulated industries.
Scalability and Flexibility: Scalable, flexible solutions from IBM Cloud and Intel let clients adjust compute resources as needed, which can lead to cost savings and improved operational efficiency.
Improved Security and Performance: Integrating Gaudi 3 with IBM Cloud Virtual Servers for VPC will enable enterprises running x86-based workloads to execute applications more securely and faster than before, improving user experiences.
What’s Next: Intel and IBM have a long history of collaboration, starting with the IBM PC and continuing with Gaudi 3 for enterprise AI solutions. General availability of IBM Cloud with Gaudi 3 products is scheduled for early 2025. Stay tuned for additional developments from IBM and Intel in the coming months.
Intel Gaudi 3: The Distinguishing AI
Introducing your new high-performance choice for every kind of enterprise AI workload.
A Better Way to Use Enterprise AI
Intel Gaudi 3 AI accelerators are built to handle demanding training and inference workloads. They are based on the high-efficiency Intel Gaudi platform, which has proven performance in MLPerf benchmarks.
Support AI workloads from a single node to a mega cluster, in your data center or in the cloud, all running on Ethernet equipment you probably already own. Intel Gaudi 3 can be crucial to the success of any AI project, whether you need one accelerator or hundreds.
Developed to Meet AI’s Real-World Needs
Intel Gaudi 3 AI accelerators let you scale systems more flexibly, using industry-standard Ethernet networking and open, community-based software.
Adopt Easily
Whether you are starting from scratch, fine-tuning pretrained models, or switching from a GPU-based approach, adopting Intel Gaudi 3 AI accelerators is straightforward.
Designed with Developers in Mind: Use the available developer resources and software tools to get up to speed quickly.
Support for New and Existing Models: Use open source tools, such as Hugging Face resources, to fine-tune reference models, build new ones, or migrate existing ones.
PyTorch Included: Keep using the library your team already knows.
Simple Migration of GPU-Based Models: Purpose-built software tools help you port your existing solutions quickly; see the sketch after this list.
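For teams coming from a GPU-based workflow, the migration mentioned above is typically a small, mechanical change to existing PyTorch code. Below is a minimal sketch, assuming the Intel Gaudi PyTorch bridge (the habana_frameworks package from Intel's Gaudi software suite) is installed; the module names and the mark_step pattern follow Intel's public Gaudi documentation and may differ between software releases, and the tiny model and random data are purely illustrative.

```python
import torch
import torch.nn as nn

# The Intel Gaudi PyTorch bridge registers the "hpu" device with PyTorch.
# (Package name as documented in Intel's Gaudi guides; its presence is an
# assumption about the installed environment, not something stated above.)
import habana_frameworks.torch.core as htcore

# A small toy model; real workloads would load an existing model instead.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

device = torch.device("hpu")   # previously: torch.device("cuda")
model = model.to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One toy training step on random data, just to show the flow.
inputs = torch.randn(32, 784).to(device)
labels = torch.randint(0, 10, (32,)).to(device)

optimizer.zero_grad()
loss = loss_fn(model(inputs), labels)
loss.backward()
htcore.mark_step()   # in lazy mode, flushes the accumulated graph to the device
optimizer.step()
htcore.mark_step()

print(f"loss: {loss.item():.4f}")
```

The main differences from a CUDA script are the bridge import, the "hpu" device string, and the mark_step() calls that submit the lazily built graph to the accelerator; the model definition and the rest of the PyTorch training loop stay unchanged.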
Ease Development from Start to Finish
Spend less time getting from proof of concept to production. Intel Gaudi 3 AI accelerators are backed by a robust suite of software tools, resources, and training, from migration through deployment. Find out which resources are available to simplify your AI projects.
Scale Without Effort: Make AI part of everyday operations. Intel Gaudi 3 AI accelerators are designed to deliver straightforward, affordable AI scaling, even for the largest and most complex deployments.
Increased I/O: Benefit from 33 percent more I/O connectivity per accelerator than the H100, enabling massive scale-up and scale-out while maintaining cost effectiveness.
Built for Ethernet: Use the networking infrastructure you already have and meet growing demands with standard Ethernet gear.
Open: Avoid risky investments in proprietary, locked-in technologies such as NVSwitch, NVLink, and InfiniBand.
Boost Your AI Use Case: Achieve the extraordinary at any scale. Intel Gaudi 3 AI accelerators support modern generative AI and LLMs in the data center. They work in tandem with Intel Xeon processors, the host CPU of choice for leading AI systems, to deliver enterprise performance and reliability.