Thursday, December 26, 2024

Unveiling Microsoft Azure Integrated HSM And Azure Boost DPU

The newest Azure infrastructure silicon updates

Microsoft Azure has unveiled its newest in-house security chip, Azure Integrated HSM, which strengthens key management by keeping encryption and signing keys within the HSM without affecting performance or latency.

Azure also introduced Azure Boost DPU, its first in-house DPU, designed for data-centric workloads with high efficiency and low power and capable of absorbing multiple components of a traditional server into a single piece of dedicated silicon.

Azure Integrated HSM

As part of its all-encompassing strategy to optimize Azure infrastructure, Microsoft Azure is working to eliminate infrastructure barriers, whether resource or performance bottlenecks, that stand in the way of delivering customer value. To improve the security, effectiveness, performance, agility, and scale of its infrastructure, it is innovating at every layer of the stack, from silicon to systems to software.

Azure is constructing its infrastructure hardware with several layers of defense and specialized innovations to offer robust protection for Microsoft and its customers. Security is a top concern as part of its systems approach to optimizing every layer in its infrastructure.

Microsoft presented its strategy for Azure Confidential Inferencing and Trustworthy AI in September. This vision emphasizes hardware-based Trusted Execution Environments (TEEs) for workload protection and data privacy. To further this aim, Microsoft added new open-source silicon security advancements last month, including the announcement of Adams Bridge, a quantum-resilient accelerator, and its incorporation into Caliptra 2.0, the next generation of the open-source silicon root of trust (RoT).

Microsoft’s latest in-house security chip, Azure Integrated HSM, is a dedicated Hardware Security Module (HSM) that enhances key protection by allowing encryption and signing keys to be used while they remain contained within the HSM, without incurring the network latency typical of remote HSM access. This is part of Microsoft’s commitment to its Secure Future Initiative (SFI).

Enhancing security capabilities for every new Azure server

Azure Integrated HSM meets the stringent Federal Information Processing Standards (FIPS) 140-3 Level 3 Security Requirements for Cryptographic Modules. It safeguards keys and security assets while they are in use, and it includes specialized hardware cryptographic accelerators that carry out encryption, decryption, signing, and verification entirely within the integrated HSM’s boundary.

Common cloud HSM services are network-based, centralized resources that cloud tenants can provision to provide key services for their workloads. Although these models offer strong key protection, scaling them with the same flexibility as other resources, such as compute, can be challenging. If their key policy allows, workloads can request the release of keys from the HSM and import them into their local environment; otherwise, every call to the network-attached HSM service incurs network round-trip latency. And once keys are exported from the HSM into the workload environment, the protection they receive may fall short of FIPS 140-3 Level 3.

Azure Integrated HSM eliminates the traditional trade-off between releasing keys from a remote HSM and incurring network round-trip latency on every call to it. Acting as a server-local HSM that securely connects to workload environments, it offers locally attached HSM services to both general-purpose and confidential virtual machines and containers. This delivers industry-leading in-use key protection without the latency of round-trip network-attached HSM calls.
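The usage pattern described above, keys that can be exercised but never exported, can be sketched in software. The class below is a purely illustrative stand-in (all names are hypothetical, and a real HSM enforces containment in tamper-resistant hardware rather than in a Python object): callers receive sign and verify operations, but no interface ever returns the key material.

```python
import hashlib
import hmac
import os

class HsmResidentKey:
    """Illustrative stand-in for an HSM-resident key.

    The key bytes live only inside this object; callers can request
    cryptographic operations but never the key itself, mirroring how
    an HSM exposes key *use* without key *release*.
    """

    def __init__(self) -> None:
        self._key = os.urandom(32)  # generated inside the "module", never exported

    def sign(self, message: bytes) -> bytes:
        # HMAC-SHA256 keeps the example in the standard library;
        # a real HSM would also offer RSA/ECDSA signing and more.
        return hmac.new(self._key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, tag: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), tag)

key = HsmResidentKey()
tag = key.sign(b"attestation payload")
print(key.verify(b"attestation payload", tag))  # True
print(key.verify(b"tampered payload", tag))     # False
```

The design point is the narrow interface: because only operations cross the boundary, the key never enters the workload environment, which is exactly the property that key export gives up.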

Security benefits of Azure Integrated HSM 

For Azure clients who need to safeguard their sensitive data and apps in the cloud, Azure Integrated HSM provides a number of advantages.

Enhanced security

Cryptographic keys can be kept separate from software, including host and guest software, with Azure Integrated HSM. Even while the keys are being used, Azure Integrated HSM allows them to stay completely contained within a specialized, hardware-based HSM that complies with FIPS 140-3 Level 3 security standards. This offers robust isolation, tamper detection, and logical and physical tamper prevention.

Locally deployed with minimal latency

Azure Integrated HSM’s node-integrated connectivity and specialized hardware accelerators allow it to process large volumes of cryptographic requests with minimal latency. Azure Integrated HSM also supports dedicated, per-workload HSM partitions. Partitions are isolated by hardware, and keys are accessible only to the workload they belong to.
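The per-workload partition model can be pictured as a table of isolated key stores keyed by workload identity, where a caller can only reach keys in its own partition. The sketch below is a conceptual model with hypothetical names; in Azure Integrated HSM the isolation is enforced by hardware, not by a software lookup.

```python
import hashlib
import hmac
import os

class PartitionedHsm:
    """Conceptual model of per-workload HSM partitions (hypothetical API).

    Each workload gets its own partition; a signing request is honored
    only for keys inside the caller's own partition. Real hardware
    enforces this boundary physically rather than with a dictionary.
    """

    def __init__(self) -> None:
        self._partitions: dict[str, dict[str, bytes]] = {}

    def create_key(self, workload_id: str, key_name: str) -> None:
        # Key material is created inside the partition and never returned.
        self._partitions.setdefault(workload_id, {})[key_name] = os.urandom(32)

    def sign(self, workload_id: str, key_name: str, message: bytes) -> bytes:
        partition = self._partitions.get(workload_id, {})
        if key_name not in partition:
            raise PermissionError("key is not in the caller's partition")
        return hmac.new(partition[key_name], message, hashlib.sha256).digest()

hsm = PartitionedHsm()
hsm.create_key("vm-a", "signing-key")
hsm.sign("vm-a", "signing-key", b"ok")          # owning workload: allowed
try:
    hsm.sign("vm-b", "signing-key", b"steal")   # other workload: denied
except PermissionError as exc:
    print(exc)
```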

“Secure by design”, protecting across the Azure hardware fleet

Beginning next year, all new servers in Microsoft’s datacenters will have Azure Integrated HSM deployed to improve security for both general-purpose and confidential applications across Azure’s hardware fleet.

Azure Integrated HSM is the most recent step in guaranteeing strong and thorough protection of Azure’s infrastructure. By including cutting-edge hardware security features such as secure control modules and the silicon root of trust, it lays the groundwork for the trust and security that Azure offers its clients. Azure is dedicated to continually improving its cloud hardware security capabilities to satisfy its clients’ changing needs.

Boosting infrastructure efficiency with Azure Boost DPU

In the age of cloud computing and artificial intelligence, moving large volumes of data over networks and storing it securely and reliably have become major challenges. Overcoming them required a new data-centric processing design, conceived a few years ago, to complement the CPUs and GPUs found in large datacenters. Traditional CPU architectures excel at general-purpose tasks but are inadequate for managing highly multiplexed data streams corresponding to millions of network connections. GPUs, on the other hand, are specialized for the large-scale vector and matrix calculations essential to AI workloads and are less appropriate for data-centric workloads.

Data Processing Units (DPUs) are a new kind of silicon that emerged to meet this need. DPUs are best suited to tasks that handle large volumes of data at the line rate of the network, and they can function as independent processors for data-centric appliances such as storage systems.

Azure Boost DPU is Microsoft’s first in-house DPU. It is built to execute Azure’s data-centric workloads with low power consumption and high efficiency by combining several traditional server components into a single piece of silicon. Azure Boost DPU is a fully configurable system on a chip that combines network and storage engines, data accelerators, high-speed Ethernet and PCIe interfaces, and security features.

Azure Boost DPU is a hardware-software co-design created specifically for the Azure infrastructure. It runs a custom, lightweight data-flow operating system that enables flexible platforms which are more efficient, perform better, and use less power than previous implementations. For example, Microsoft anticipates that, compared with current CPU-based servers, DPUs will run cloud storage workloads at four times the performance while consuming three times less power. Additionally, DPU-based systems raise the bar for security and dependability by incorporating a custom application layer that makes use of the DPU’s tightly integrated data compression, data protection, and cryptography engines.
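Taken together, the announcement’s two figures imply a sizable efficiency gain. The arithmetic below combines the stated ratios (roughly four times the performance at, on this reading, one third of the power) into a performance-per-watt estimate; the input numbers come from the announcement, and the combination itself is purely illustrative.

```python
# Stated ratios for the DPU vs. existing CPU-based servers on cloud
# storage workloads (per the announcement; interpretation assumed):
perf_ratio = 4.0       # ~4x the performance
power_reduction = 3.0  # ~3x less power, i.e. one third the power

# Performance per watt scales with both factors together.
perf_per_watt = perf_ratio * power_reduction
print(perf_per_watt)  # 12.0 -> roughly an order-of-magnitude efficiency gain
```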

Azure Boost DPU is the newest member of the Azure family of hardware innovations for infrastructure advancement. Microsoft is dedicated to delivering innovations that increase the capability, efficiency, and scalability of its infrastructure to satisfy clients’ changing demands as it keeps pushing the envelope of what is possible.

Thota Nithya
Thota Nithya has been writing Cloud Computing articles for govindhtech since April 2023. She is a science graduate and an enthusiast of cloud computing.