Thursday, December 5, 2024

Introducing Azure AI Agent Service: Your AI-Powered Assistant

Azure has announced a managed service that enables developers to build secure, stateful, autonomous AI agents that automate business processes.

To fully realize the potential of autonomous AI agents, organizations need flexible, secure platforms for building, deploying, and monitoring them.

Use Azure AI Agent Service to enable autonomous agent capabilities

At Ignite 2024, Azure announced the upcoming public preview of Azure AI Agent Service, a suite of feature-rich, managed capabilities that brings together the models, data, tools, and services businesses need to automate business processes of any kind. The announcement is driven by customer needs and the potential of autonomous AI agents.

Azure AI Agent Service is flexible and use-case agnostic. Whether it’s personal productivity agents that send emails and schedule meetings, research agents that continuously monitor market trends and generate reports, sales agents that research and automatically qualify leads, customer service agents that follow up with personalized messages, or developer agents that update your code base or evolve a repository interactively, there are countless opportunities to automate repetitive tasks and open new avenues of knowledge work.

What distinguishes Azure AI Agent Service?

After speaking with hundreds of organizations, Azure has identified four essential components for quickly building secure, reliable agents:

  • Develop and automate processes quickly: Agents must interact smoothly with the right tools, systems, and APIs to carry out deterministic or non-deterministic actions.
  • Integrate with knowledge connectors and extensive memory: Agents must connect to internal and external knowledge sources and track conversation state so they have the right context to finish a task.
  • Flexible model selection: Agents built with the right model for the task at hand can better integrate data from many sources, produce better results for task-specific scenarios, and improve cost efficiency in scaled agent deployments.
  • Built-in enterprise readiness: Agents must be able to scale with an organization’s needs, meet its specific data privacy and compliance requirements, and complete tasks with high quality and reliability.

Azure AI Agent Service delivers these components for end-to-end agent development through a single product surface, using the user-friendly interface and extensive toolkit of the Azure AI Foundry SDK and portal.

Image: Azure AI Agent Service capabilities (credit: Azure)

Let’s now examine the capabilities of Azure AI Agent Service in more detail.

Fast agent development and automation with powerful integrations

Built on the powerful yet flexible Assistants API from OpenAI, Azure AI Agent Service enables rapid agent development with built-in memory management and a sophisticated interface that seamlessly integrates with popular compute platforms, bridging LLM capabilities with general-purpose, programmatic actions.
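
To make this concrete, here is a minimal sketch of creating an agent, assuming the preview azure-ai-projects Python SDK; the client and method names (AIProjectClient, create_agent) and the PROJECT_CONNECTION_STRING environment variable reflect the preview release and may differ in your SDK version.

```python
# Minimal sketch: create a service-managed agent (preview azure-ai-projects SDK;
# names and signatures may differ in newer versions).
import os

from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

# Connect to an Azure AI Foundry project using its connection string (assumed env var).
project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)

# The service persists the agent's definition, memory, and tool wiring for you.
agent = project_client.agents.create_agent(
    model="gpt-4o",  # any model deployment available in your project
    name="report-assistant",
    instructions="You summarize market data and draft short reports.",
)
print(f"Created agent: {agent.id}")
```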

  • Allow your agent to act with 1,400+ Azure Logic Apps connectors: Use Logic Apps’ extensive connector ecosystem to let your agent accomplish tasks and take actions on behalf of users. Logic Apps lets you define workflow business logic in the Azure portal to connect your agent to external systems, tools, and APIs. Available connectors include Azure App Service, Dynamics 365 Customer Voice, Microsoft Teams, M365 Excel, MongoDB, Dropbox, Jira, Gmail, Twilio, SAP, Stripe, ServiceNow, and more.
  • Use Azure Functions for stateless or stateful code-based actions beyond chat: Let your agent call APIs and send and wait for events. Azure Functions and Azure Durable Functions let you execute serverless code for synchronous, asynchronous, long-running, and event-driven actions such as invoice approval with a human in the loop, long-term product supply chain monitoring, and more.
  • Code Interpreter lets your agent write and run Python code in a sandboxed environment, handle diverse data formats, and generate data files and visualizations. Unlike the Assistants API, this tool lets you work with data already in your storage (see the sketch after this list).
  • Standardize your tool library with OpenAPI: Connect your AI agent to an external API using an OpenAPI 3.0 specified tool for scaled, interoperable integration. Custom tools can authenticate access and connections with managed identities (Microsoft Entra ID) for added security, making this ideal for integrating with your existing infrastructure or web services.
  • Add cloud-hosted tools to Llama Stack agents: Azure AI Agent Service supports the agent protocol for Llama Stack SDK developers, providing scalable, cloud-hosted, enterprise-grade tools that are wire-compatible with Llama Stack.
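
As a concrete example, here is a hedged sketch of attaching the Code Interpreter tool and a data file to an agent. The helper names (CodeInterpreterTool, upload_file_and_poll) and the create_agent parameters follow the preview azure-ai-projects SDK and may vary by version; the same pattern applies to the OpenAPI and Azure Functions tools.

```python
# Sketch: give an agent the Code Interpreter tool plus an uploaded data file
# (preview SDK names; may differ in your version). Reuses project_client from above.
from azure.ai.projects.models import CodeInterpreterTool

# Upload a CSV the agent can analyze with generated Python code.
uploaded = project_client.agents.upload_file_and_poll(
    file_path="quarterly_sales.csv", purpose="assistants"
)

code_interpreter = CodeInterpreterTool(file_ids=[uploaded.id])

analyst = project_client.agents.create_agent(
    model="gpt-4o",
    name="data-analyst",
    instructions="Analyze the attached data and produce charts on request.",
    tools=code_interpreter.definitions,        # tool definitions passed to the model
    tool_resources=code_interpreter.resources, # file attachments for the sandbox
)
```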

Ground agent outputs with an extensive knowledge ecosystem

Easily build a comprehensive ecosystem of enterprise knowledge sources so agents can access and interpret data from multiple sources and improve their responses to user queries. These data connectors respect your network configuration and work seamlessly with your data. Built-in data sources include:

  • Ground on real-time web data with Bing: Grounding with Bing lets your agent give users the latest information from the web. This addresses the inability of LLMs to factually answer current-events prompts such as “top news headlines”.
  • Ground on private data in Microsoft SharePoint: Internal documents in SharePoint help your agent provide accurate responses. With on-behalf-of (OBO) authentication, agents can only access the SharePoint data that the end user has permission to view.
  • Talk to structured data in Microsoft Fabric: Power data-driven decision making in your organization without requiring SQL skills or knowledge of the data’s context. The built-in Fabric AI Skills let your agent offer generative-AI-based conversational Q&A over Fabric data, and Fabric provides a secure data connection with OBO authentication.
  • Add private data from Azure AI Search, Azure Blob Storage, and local files to agent outputs: Azure re-imagined the File Search tool from the Assistants API so you can bring an existing Azure AI Search index or build a new one from Blob Storage or local files through a built-in data ingestion pipeline. With files stored in your Azure storage account and search indexes in your Azure AI Search resource, the new File Search gives you full control over your private data (see the sketch after this list).
  • Gain a competitive edge with licensed data: Add licensed data from private data providers such as Tripadvisor to your agent responses so they use the latest, best data for your use case. Azure will add more licensed data sources across industries and professions over time.
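
Here is a hedged sketch of grounding an agent on private files with the new File Search tool, again assuming the preview azure-ai-projects SDK; FileSearchTool and create_vector_store_and_poll follow the preview naming and may differ in your version.

```python
# Sketch: ground an agent on private documents via File Search (preview SDK names).
from azure.ai.projects.models import FileSearchTool

# Ingest a local file into a vector store backed by your own Azure resources.
handbook = project_client.agents.upload_file_and_poll(
    file_path="employee_handbook.pdf", purpose="assistants"
)
vector_store = project_client.agents.create_vector_store_and_poll(
    file_ids=[handbook.id], name="hr-docs"
)

file_search = FileSearchTool(vector_store_ids=[vector_store.id])

hr_agent = project_client.agents.create_agent(
    model="gpt-4o",
    name="hr-helper",
    instructions="Answer HR questions strictly from the provided documents.",
    tools=file_search.definitions,
    tool_resources=file_search.resources,
)
```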

In addition to enterprise knowledge, AI agents need thread (conversation state) management to preserve context, deliver tailored interactions, and improve performance over time. Azure AI Agent Service simplifies thread management by storing and retrieving conversation history for each end user, providing consistent context for better interactions. This also helps you work around the model’s context window limits.
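
The thread is a service-side object you create once and keep reusing, so the service carries the conversation state for you. Here is a hedged sketch of the thread, message, and run flow, continuing the preview-SDK example above; the assistant_id parameter may be named agent_id in newer SDK versions.

```python
# Sketch: service-managed threads keep conversation state between runs (preview SDK).
thread = project_client.agents.create_thread()

project_client.agents.create_message(
    thread_id=thread.id,
    role="user",
    content="Summarize this week's top market trends.",
)

# Execute the agent against the thread; the service handles tool calls and state.
run = project_client.agents.create_and_process_run(
    thread_id=thread.id, assistant_id=agent.id  # may be named agent_id in newer versions
)
print(f"Run status: {run.status}")

# The stored history can be retrieved later for the same end user.
messages = project_client.agents.list_messages(thread_id=thread.id)
```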

Use GPT-4o, Llama 3, or another model that suits the job

Developers already love building AI assistants with the latest OpenAI GPT models through the Azure OpenAI Service Assistants API. Azure now also offers cutting-edge models from leading model providers so you can build task-specific agents, optimize total cost of ownership (TCO), and more.

  • Leverage Models-as-a-Service: Azure AI Agent Service will support models from Azure AI Foundry and use cross-model-compatible, cloud-hosted tools for code execution, retrieval-augmented generation, and more. The Azure Models-as-a-Service API lets developers create agents with Meta Llama 3.1, Mistral Large, and Cohere Command R+ in addition to Azure OpenAI models (see the sketch after this list).
  • Multi-modal support lets AI agents process and respond to data beyond text, broadening use cases. The image and audio modalities of GPT-4o will be supported so you can analyze and combine data in different forms to gain insights, make decisions, and produce user-specific outputs.
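
In practice, model selection is just a different value for the model parameter in the same create_agent call. A hedged sketch; the deployment name "llama-3-1-405b" is a placeholder for whatever Models-as-a-Service deployment exists in your project.

```python
# Sketch: the same agent definition, backed by a Models-as-a-Service deployment.
# "llama-3-1-405b" is a placeholder deployment name, not a fixed identifier.
triage_agent = project_client.agents.create_agent(
    model="llama-3-1-405b",  # e.g. a Meta Llama 3.1 serverless deployment in your project
    name="triage-agent",
    instructions="Classify incoming support tickets by urgency and route them.",
)
```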

Build secure, enterprise-ready agents by design

Azure AI Agent Service provides enterprise tools to protect sensitive data and meet regulatory standards.

  • Bring your own storage: Unlike the Assistants API, you can now connect your own enterprise data stores so your agent accesses enterprise data securely.
  • Bring your own virtual network: Design agent apps with strict no-public-egress network traffic to protect network interactions and data privacy.
  • Keyless setup and OBO authentication: Keyless setup and on-behalf-of authentication simplify agent configuration and authentication, easing resource management and deployment.
  • Unlimited scale: Azure AI Agent Service on provisioned deployments offers predictable performance and scaling, so agent-powered apps stay flexible while maintaining predictable latency and high throughput.
  • Use OpenTelemetry to track agent performance: Understand your AI agent’s reliability and performance. The Azure AI Foundry SDK lets you add OpenTelemetry-compatible metrics to your monitoring dashboard for offline and online review of agent outputs (see the tracing sketch at the end of this section).
  • Content filtering and XPIA mitigation help you build responsibly: Azure AI Agent Service detects harmful content at various severity levels with prebuilt and custom content filters.

Prompt shields protect agents from malicious cross-prompt injection attacks (XPIA). As with Azure OpenAI Service, Azure AI Agent Service prompts and completions are not used to train, retrain, or improve Microsoft or third-party products or services without your permission, and customer data can be deleted at any time.
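
For the OpenTelemetry point above, here is a hedged sketch of exporting agent-run traces to Azure Monitor. It assumes the azure-monitor-opentelemetry distro package and an Application Insights connection string supplied via an environment variable; the span name and wrapping pattern are illustrative, not a prescribed API.

```python
# Sketch: export OpenTelemetry traces for agent runs to Azure Monitor.
# Assumes the azure-monitor-opentelemetry package and an Application Insights
# connection string in an environment variable.
import os

from azure.monitor.opentelemetry import configure_azure_monitor
from opentelemetry import trace

configure_azure_monitor(
    connection_string=os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"]
)

tracer = trace.get_tracer(__name__)

# Wrap agent runs in spans so latency, failures, and tool calls show up
# in your monitoring dashboard for online and offline review.
with tracer.start_as_current_span("agent-run"):
    run = project_client.agents.create_and_process_run(
        thread_id=thread.id, assistant_id=agent.id
    )
```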

Use Azure AI Agent Service to orchestrate effective multi-agent systems

Azure AI Agent Service works out of the box with multi-agent orchestration frameworks that are natively compatible with the Assistants API, including Semantic Kernel, an enterprise AI SDK for Python, .NET, and Java, and AutoGen, a cutting-edge research SDK for Python developed by Microsoft Research.

To get the most reliable, scalable, and secure agents while developing a new multi-agent solution, begin by creating singleton agents with Azure AI Agent Service. These agents can then be orchestrated by AutoGen, which continually evolves to find the most effective patterns of collaboration for agents (and humans) working together. Features that prove their production value in AutoGen can later be moved into Semantic Kernel if you want non-breaking updates and production support (a conceptual sketch of this pattern follows).
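
The orchestration integrations are still evolving, so the following is only a conceptual sketch of the pattern: each singleton agent lives in Azure AI Agent Service, and a thin coordinator routes work between them. The run_agent helper, the agent-id variables, and the message-shape access are illustrative stand-ins, not the AutoGen or Semantic Kernel APIs.

```python
# Conceptual sketch: singleton agents hosted in Azure AI Agent Service, coordinated
# by plain Python. run_agent and the routing logic are illustrative stand-ins,
# not the AutoGen or Semantic Kernel integration APIs.
def run_agent(agent_id: str, prompt: str) -> str:
    """Run one service-hosted agent on a fresh thread and return its latest reply."""
    thread = project_client.agents.create_thread()
    project_client.agents.create_message(thread_id=thread.id, role="user", content=prompt)
    project_client.agents.create_and_process_run(thread_id=thread.id, assistant_id=agent_id)
    messages = project_client.agents.list_messages(thread_id=thread.id)
    # Newest message first in the preview SDK; the object shape may differ by version.
    return messages.data[0].content[0].text.value

# Two singleton agents created earlier with create_agent (ids assumed to exist).
brief = run_agent(researcher_agent_id, "Summarize this week's GPU market news.")
email = run_agent(writer_agent_id, f"Turn this brief into a short client email:\n{brief}")
```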
