With more context from your tools and services, GitHub Copilot is now more agentic, powered by the best models.
In celebration of MSFT’s 50th anniversary, GitHub is rolling out agent mode in Visual Studio Code to all users, now with MCP support, which lets you bring any context or capability you want into your workflow. GitHub is also excited to release a new local, open source GitHub MCP server that lets you integrate GitHub functionality into any LLM product that supports MCP.
In keeping with its pledge to offer model choice, GitHub is extending the availability of Anthropic Claude 3.5 Sonnet, 3.7 Sonnet, 3.7 Sonnet Thinking, Google Gemini 2.0 Flash, and OpenAI o3-mini to all paid Copilot tiers through premium requests. These premium requests are in addition to the unlimited requests for agent mode, context-driven chat, and code completions that all paid plans get when using the base model. With the new Pro+ tier, individual developers can take full advantage of the latest models with Copilot.
The agent awakening doesn’t stop there. GitHub is also announcing the general availability of the Copilot code review agent. In just over a month, more than 1 million developers have used the preview version on GitHub. Next edit suggestions are also generally available, so you can tab tab tab your way to coding glory.
Agent mode in VS Code
Agent mode is rolling out gradually to VS Code users in stable, with full availability for all users expected in the coming weeks. You can also enable it manually now. Unlike chat or multi-file edits, which let you propose code changes across multiple files in your workspace, agent mode is fundamentally capable of taking action to translate your ideas into code. With simple prompts, agent mode takes Copilot beyond answering a question: it completes all required subtasks across automatically identified or generated files to ensure your primary goal is achieved. Agent mode may prompt you to run suggested terminal commands or tool calls. It also analyzes run-time errors and can self-heal.
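To enable agent mode manually, you can flip the corresponding setting in your VS Code user settings. A minimal sketch, assuming the `chat.agent.enabled` setting name used by recent VS Code releases (check your version's documentation if it differs):

```json
{
  // settings.json: opt in to Copilot agent mode before the gradual rollout
  // reaches your install. Setting name may vary by VS Code version.
  "chat.agent.enabled": true
}
```

Once enabled, agent mode appears as an option in the Copilot Chat mode picker.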
Developers have been using agent mode for a range of tasks since it launched in VS Code Insiders in February: from auto-fixing code generation errors, to building web apps, to yeeting commits.
You can use OpenAI GPT-4o, Google Gemini 2.0 Flash, or Claude 3.5 and 3.7 Sonnet to power agent mode. Agent mode currently achieves a 56.0% pass rate on SWE-bench Verified with Claude 3.7 Sonnet. As chain-of-thought reasoning models continue to evolve, GitHub expects agent mode to become even more powerful.
Model Context Protocol (MCP) is now available in public preview
To get their work done, developers juggle many tasks throughout the day: research, navigating telemetry, managing infrastructure, coding, and debugging. And they use a variety of tools to do it, collectively known as the engineering stack. Like a USB port for intelligence, MCP lets you give agent mode the context and capabilities it needs to support you. When you enter a chat prompt in agent mode in Visual Studio Code, the model can use different tools to handle tasks like understanding database schema or running web queries. This setup enables more interactive and context-sensitive coding support.
For instance, agent mode would take the prompt “Update my GitHub profile to include the title of the PR that was assigned to me yesterday,” together with the list of all available MCP tools, and ask an LLM what to do next. The agent would keep making iterative tool calls until the task was complete.
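The iterative loop described above can be sketched in a few lines. This is an illustrative outline only, not Copilot's implementation: `call_llm` and `run_tool` are hypothetical stand-ins for an LLM API and an MCP tool dispatcher, and the control flow is the point.

```python
# Minimal sketch of an agentic tool-calling loop, under the assumptions above.

def agent_loop(prompt, tools, call_llm, run_tool, max_steps=10):
    """Repeatedly ask the model what to do next until no tool call remains."""
    messages = [{"role": "user", "content": prompt}]
    for _ in range(max_steps):
        reply = call_llm(messages, tools)       # model sees prompt + tool list
        if reply.get("tool_call") is None:      # no tool requested: task done
            return reply["content"]
        result = run_tool(reply["tool_call"])   # execute the chosen MCP tool
        messages.append({"role": "tool", "content": result})
    return "stopped: step limit reached"
```

Each iteration feeds tool results back into the conversation, so the model decides the next step with full context of what has already happened.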
You can already explore and use the large and growing MCP ecosystem on GitHub. This repository serves as an excellent community inventory, with some of the top MCP servers available for use. By giving agent mode useful capabilities like searching code and repositories, managing issues, and creating PRs, the local GitHub MCP server turns agent mode into a powerful user of the GitHub platform.
Get started by configuring local and remote MCP servers and using agent mode in Visual Studio Code. To begin using the GitHub local MCP server, which is now natively supported in Visual Studio Code, visit the repository.
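As one possible starting point, MCP servers can be declared in a workspace configuration file that VS Code reads. The sketch below assumes the `.vscode/mcp.json` file format and a Docker-based launch of the GitHub MCP server; the exact file name, fields, and launch command may differ by version, so consult the repository's README for the authoritative setup.

```json
{
  "servers": {
    "github": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
        "ghcr.io/github/github-mcp-server"
      ]
    }
  }
}
```

With a server registered, its tools become available to agent mode alongside VS Code's built-in capabilities.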
Premium model requests
Since GitHub Universe, GitHub has added several new models for chat, multi-file edits, and now agent mode. As these models become generally available, GitHub is introducing a new premium request type. Premium requests are in addition to the unlimited requests for agent mode, context-driven chat, and code completions that all paid plans include for the base model (currently: OpenAI GPT-4o).
Starting on May 5, 2025, Copilot Pro users will receive 300 monthly premium requests. Starting between May 12 and May 19, 2025, Copilot Business and Copilot Enterprise users will receive 300 and 1,000 monthly premium requests, respectively. Until then, use of these premium models is unlimited.
GitHub is also introducing a new Pro+ tier at $39 per month, which includes 1,500 monthly premium requests and access to the best models, such as GPT-4.5.
Copilot paid users will also be able to pay as they go for additional premium request usage. Individuals and organizations can opt in to use more requests beyond their included amount, and can set spending limits on requests to easily control costs. GitHub Copilot Business and Enterprise administrators can manage requests via their Copilot Admin Billing Settings. Additional premium requests cost $0.04 each.
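The pay-as-you-go math above is simple: requests beyond the included monthly allowance are billed at $0.04 each. A hypothetical helper (`overage_cost` is an illustration, not a GitHub API) makes the calculation concrete:

```python
# Illustrative only: computes overage cost for premium requests beyond the
# included monthly allowance, billed at $0.04 per extra request.

def overage_cost(requests_used, included=300, price_per_request=0.04):
    """Dollar cost of premium requests beyond the included monthly amount."""
    extra = max(0, requests_used - included)
    return round(extra * price_per_request, 2)
```

For example, a Copilot Pro user (300 included requests) who makes 350 premium requests in a month would pay for 50 extra requests: `overage_cost(350)` returns `2.0`, i.e. $2.00.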
You’ll retain unlimited access to Copilot’s base model, while tapping into a more powerful or efficient model when you need it. Each premium model will consume a certain number of premium requests.
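Since each model consumes premium requests at its own rate, usage can be thought of as a per-model multiplier applied to your interactions. The multiplier values below are hypothetical placeholders, not GitHub's published rates, and are only meant to illustrate the accounting:

```python
# Illustrative sketch: per-model premium-request multipliers. The values are
# hypothetical placeholders, NOT GitHub's actual rates.

HYPOTHETICAL_MULTIPLIERS = {
    "base-model": 0,       # base model doesn't consume premium requests
    "fast-model": 0.5,     # placeholder value
    "reasoning-model": 2,  # placeholder value
}

def requests_consumed(model, interactions, multipliers=HYPOTHETICAL_MULTIPLIERS):
    """Premium requests consumed by a number of interactions with a model."""
    return multipliers[model] * interactions
```

Under this scheme, a cheaper model stretches the monthly allowance further, while a heavier reasoning model draws it down faster.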