Vertex AI Prompt Optimizer
One of the most approachable ways to get a large language model (LLM) to produce meaningful output is prompt design and engineering. Prompting an LLM, however, can feel like navigating a maze: to get the desired result, you have to try different combinations of instructions and examples. And there is no guarantee that the best prompt template you find for one LLM will still produce the best results on another.
Because language models differ in behavior, migrating or translating prompts from one LLM to another is difficult. Simply reusing old prompts rarely works, so users need an intelligent prompt optimizer to keep producing meaningful outputs.
Google Cloud is introducing Vertex AI Prompt Optimizer in Public Preview to help alleviate “prompt fatigue” that customers encounter when developing LLM-based applications.
What is Vertex AI Prompt Optimizer?
Vertex AI Prompt Optimizer helps you find the best prompt (instructions and demonstrations) for any target model on Vertex AI. It uses an iterative LLM-based optimization algorithm based on Google Research's work on automatic prompt optimization (APO) methods, which was accepted at NeurIPS 2024. An optimizer model, which generates paraphrased instructions, and an evaluator model, which scores the candidate instructions and demonstrations, work together to generate and evaluate candidate prompts.
Prompt Optimizer then selects the best instructions and demonstrations according to the evaluation metrics you choose. Instructions include the system instruction, task, and context of your prompt template. Demonstrations are the few-shot examples you include in your prompt to elicit a specific style or tone from the response.
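To make the two parts concrete, here is a minimal sketch of a prompt assembled from an instruction and demonstrations. Everything in it (the placeholder name, the example questions and answers) is illustrative, not taken from the product's documentation:

```python
# Illustrative prompt structure: the instruction (system/task/context text)
# and the demonstrations (few-shot examples) are the two parts the
# optimizer tunes. All content below is hypothetical example data.

INSTRUCTION = (
    "You are a support assistant for a cloud product. "
    "Answer the user's question in one concise sentence."
)

DEMONSTRATIONS = [
    {"question": "How do I reset my API key?",
     "answer": "Open the console's Credentials page and regenerate the key."},
    {"question": "Where can I view billing?",
     "answer": "Billing reports are under the Billing section of the console."},
]

def build_prompt(question: str) -> str:
    """Assemble instruction + demonstrations + the new question."""
    shots = "\n".join(
        f"Q: {d['question']}\nA: {d['answer']}" for d in DEMONSTRATIONS
    )
    return f"{INSTRUCTION}\n\n{shots}\n\nQ: {question}\nA:"

print(build_prompt("How do I delete a project?"))
```

The optimizer rewrites the instruction text and selects which demonstrations to include; the overall template shape stays under your control.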
Vertex AI Prompt Optimizer finds the best prompt (instructions and demonstrations) for the target model using only a few labeled examples and your chosen optimization settings, eliminating the need to manually re-optimize existing prompts every time you adopt a new LLM. With Vertex AI, crafting a new prompt for a specific task, or translating an existing prompt from one model to another, is now simple. Key features include:
- Simple optimization: Quickly and easily translate and migrate prompts from any source model to any target Google model.
- Versatile task handling: Supports all text-based tasks, including classification, summarization, question answering, and entity extraction. Support for multimodal tasks is coming soon.
- Comprehensive evaluation: Supports a broad range of evaluation metrics, including model-based, computation-based, and custom metrics, to ensure optimal prompt performance against the measures you care about.
- Flexible and adaptable: Choose among different notebook versions based on your skill level and requirements, and tune the optimization process and latency with advanced options.
Why use Vertex AI Prompt Optimizer?
Data-driven optimization: Many prompt optimization tools on the market today focus on tailoring your prompts to a desired tone and style, and they often still require human verification. Vertex AI Prompt Optimizer goes further: it optimizes your prompts against specific evaluation metrics, ensuring the best performance for your target model.
Designed for Gemini: Vertex AI Prompt Optimizer is built with the fundamental characteristics of Gemini in mind, and is specifically designed to adapt to the particular qualities of Gemini and other Google models. This tailored approach lets you take full advantage of Gemini's capabilities and achieve exceptional results.
How do I get started with Vertex AI Prompt Optimizer?
To start using Vertex AI Prompt Optimizer, use the Colab notebook in the Google Cloud Generative AI repository on GitHub, which contains sample code and notebooks for Generative AI on Google Cloud. See the UI version for basic settings and the SDK version for more advanced settings. Additional notebook versions supporting multimodal input and custom metrics will be added in the coming weeks. You can also access it from the Vertex AI Studio console; look for entry points labeled "Prompt optimizer" or "Optimize your prompt further."
To optimize or translate prompts with Vertex AI Prompt Optimizer, follow these steps:
- Set up the prompt template.
- Enter your data (examples with labels).
- Set up the parameters for your optimization (target model, evaluation metrics, etc.).
- Run the optimization job.
- Review the results.
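The data-preparation step above (entering labeled examples) can be sketched as follows. The JSONL field names `input` and `target` are illustrative assumptions, not the optimizer's documented schema:

```python
import json

# A handful of labeled examples, each pairing a model input with the
# ground-truth output the optimizer should steer toward. Field names
# here are hypothetical placeholders.
examples = [
    {"input": "Summarize: The meeting moved to 3pm on Friday.",
     "target": "Meeting rescheduled to Friday 3pm."},
    {"input": "Summarize: Invoice 42 was paid in full on May 2.",
     "target": "Invoice 42 paid May 2."},
]

# Serialize to JSONL, one example per line, to supply alongside the
# prompt template and optimization settings.
with open("labeled_examples.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")

# Sanity check: every line round-trips and carries both fields.
with open("labeled_examples.jsonl") as f:
    rows = [json.loads(line) for line in f]
print(len(rows))  # 2
```

JSONL (one JSON object per line) is a common format for this kind of example data because it streams and appends cleanly; check the notebook you are using for the exact fields it expects.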
Vertex AI Prompt Optimizer supports any Google model and any evaluation metric provided by the Generative AI Evaluation Service.
Access points to the Vertex AI Prompt Optimizer Colab Enterprise Notebook from Vertex AI Studio
A. A new Prompt optimizer button will appear on the Saved prompts page.
B. There will be a new Optimize your prompt further button in the Prompt assist dialog pop-up.