Monday, May 27, 2024

RAG with Cohere Command R and R+ in Amazon Bedrock

Cohere AI

Use Cohere to create sophisticated multilingual applications and enterprise generative AI.

Presenting the Enterprise Foundation Models from Cohere

Cohere RAG

Command R+

Command R+ is Cohere’s most sophisticated large language model, built specifically for real-world business applications. By balancing accuracy with efficiency, Command R+ lets companies move beyond proof of concept and begin integrating AI into everyday operations. It excels at retrieval-augmented generation (RAG) use cases and supports ten major business languages.

Command R

Command R is a strong, adaptable language model for corporate use. It supports ten languages and performs well on long-context tasks, making it well suited to global organisations. Command R is optimised for RAG use cases, with an emphasis on precision and efficiency, and handles text-generation workloads with ease, making it a good fit for organisations looking to adopt AI at scale.


Cohere Embed

Cohere Embed is a powerful embedding model with state-of-the-art performance in more than 100 languages. It maps text to a semantic vector space, placing texts with similar meanings close together. With this capability, developers can improve retrieval accuracy in RAG systems and build search applications that handle noisy data. AWS launched two Cohere models, Cohere Command Light and Cohere Embed English, on Amazon Bedrock in November 2023, and is now pleased to announce that two additional Cohere models, Command R and Command R+, are available in Amazon Bedrock.
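The mapping from text to vectors can be sketched in a few lines. This is a minimal illustration, not an official API: the request and response field names below (`texts`, `input_type`, `embeddings`) follow the Bedrock request schema for Cohere Embed as we understand it, and the helper functions and commented-out call are our own.

```python
import json
import math

# Bedrock model ID for Cohere Embed (English); the multilingual variant
# is "cohere.embed-multilingual-v3".
EMBED_MODEL_ID = "cohere.embed-english-v3"

def build_embed_request(texts, input_type="search_document"):
    """Request body for Cohere Embed on Bedrock. input_type distinguishes
    documents being indexed ("search_document") from user queries
    ("search_query")."""
    return json.dumps({"texts": texts, "input_type": input_type})

def cosine_similarity(a, b):
    """Texts with similar meanings map to nearby vectors, so a higher
    cosine similarity between embeddings means closer meaning."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Example call (requires AWS credentials and Bedrock model access):
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# response = client.invoke_model(modelId=EMBED_MODEL_ID,
#                                body=build_embed_request(["hello world"]))
# vectors = json.loads(response["body"].read())["embeddings"]
```

Embedding a document corpus once, then comparing query embeddings against it with cosine similarity, is the basic retrieval step behind the RAG and semantic-search use cases described below.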


128K-token context window

With a context window of up to 128K tokens, the Command R models can understand and respond within a broad context, making them well suited to complex workflows that involve large document inputs, accurate citations with enhanced retrieval, and tool use.
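As a rough sketch of how a document-grounded request might be assembled: the Cohere Command R chat schema on Bedrock accepts a `documents` list alongside the message, so the model can base its answer on the supplied passages and cite them. The helper below is hypothetical and only builds the request body; the field names follow that schema as we understand it.

```python
import json

# Hypothetical helper: packages a question plus retrieved passages into
# a grounded-generation request. The "documents" field lets the model
# base its answer on the supplied snippets and cite them.
def build_rag_request(question, passages):
    return json.dumps({
        "message": question,
        "documents": [{"title": title, "snippet": text}
                      for title, text in passages],
    })

# Example call (requires AWS credentials and Bedrock model access):
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-west-2")
# response = client.invoke_model(
#     modelId="cohere.command-r-plus-v1:0",
#     body=build_rag_request("What was Q3 revenue?",
#                            [("Q3 report", "Revenue was $12M.")]))
# result = json.loads(response["body"].read())
# # result["text"] holds the answer; result.get("citations") maps answer
# # spans back to the supplied documents.
```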

Multilingual generation

The Command R models can generate text in ten major business languages: English, French, Spanish, Italian, German, Portuguese, Japanese, Korean, Arabic, and Chinese.

Multi-step tool use

Thanks to Command R+’s support for multi-step tool use, the model can call multiple tools across several steps to complete challenging tasks. If a tool call fails, the model can even correct itself and retry, which lets it attempt the task more than once and boosts the overall success rate.
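The pattern reads roughly like this in code. This is an illustrative sketch of the multi-step loop, not Cohere’s API: `model_step` stands in for a call to Command R+ (for example via the Bedrock Converse API), and all names here are our own.

```python
# Illustrative sketch of a multi-step tool-use loop. `model_step` stands
# in for a call to the model; `tools` maps tool names to callables.
def run_tool_loop(model_step, tools, max_steps=5):
    """Repeatedly ask the model what to do next, execute any requested
    tool, and feed the result (or the error) back so the model can
    self-correct and retry until it produces a final answer."""
    history = []
    for _ in range(max_steps):
        action = model_step(history)
        if action["type"] == "answer":
            return action["text"]
        tool = tools[action["tool"]]
        try:
            result = tool(**action["args"])
            history.append({"tool": action["tool"], "result": result})
        except Exception as exc:
            # Surface the failure so the model can repair its next call.
            history.append({"tool": action["tool"], "error": str(exc)})
    return None
```

A real implementation would parse the model’s tool-call output into `action`; the point of the sketch is the feedback loop that lets a failed call be observed and retried.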

Productivity gains

The Command R models are designed to increase productivity by bringing generative AI capabilities into routine applications and workflows. Companies can streamline their operations and improve overall efficiency, leading to better business results. With Command R+, businesses can uncover new opportunities and improve employee and customer experiences.

Data security and privacy

Cohere implements strong data privacy safeguards, giving users full control over their data. Businesses can be confident that their sensitive data, including customisation data and model inputs and outputs, remains secure and under their control.

Use cases

Investment research assistant

An investment platform developed an AI assistant that allows its clients to ask complex questions and receive synthesised answers from financial reports, analyst research, investor call transcripts, and other data by utilising RAG with Cohere models.

Tech support assistant

A CRM SaaS company used RAG with Cohere models to build a tech support AI assistant that delivers conversational answers to frequently asked questions, drawing on corporate knowledge bases and product documentation.

Executive AI assistant

Cohere collaborated with a financial services firm to build a seamless solution using Command and Embed with RAG. It lets managers and leaders ask sophisticated questions and retrieve information from previously inaccessible data sources. To deliver more accurate results, the models work together, splitting tasks and data extraction into several stages.

Procurement summaries

A customer used Command to summarise multi-modal materials such as PowerPoint decks, notes, and PDFs. They trained the model using prior contract negotiation materials and summaries produced by experienced global procurement experts.

Expertise and project staffing AI assistant

Cohere partnered with a multinational consultancy to develop a customised RAG solution using Cohere’s Command and Embed models. Consultants can now query an intelligent assistant and receive prompt, precise responses complete with citations.

Cohere Command Model

Command R+

Command R+, Cohere’s most powerful generative language model, is optimised for long-context tasks such as retrieval-augmented generation (RAG) and multi-step tool use.

Maximum tokens: 128K

Languages: English, French, Spanish, Italian, German, Portuguese, Japanese, Korean, Arabic, and Chinese

Supported use cases: chat, knowledge assistants, Q&A, RAG, text generation, and text summarization

Command R

Command R, Cohere’s generative language model, is optimised for long-context tasks such as retrieval-augmented generation (RAG) and tool use, as well as high-volume production workloads.

Maximum tokens: 128K

Languages: English, French, Spanish, Italian, German, Portuguese, Japanese, Korean, Arabic, and Chinese

Supported use cases: chat, knowledge assistants, Q&A, RAG, text generation, and text summarization


Cohere Command

Command is Cohere’s generative large language model (LLM).

Maximum tokens: 4,000

Languages: English

Supported use cases: text generation, text summarization, and chat

Cohere Command Light

Command Light is a smaller version of Command, Cohere’s generative LLM.

Maximum tokens: 4,000

Languages: English

Supported use cases: text generation, text summarization, and chat

Embed – English

Embed is Cohere’s text representation (embedding) model. This version supports English only.

Dimensions: 1,024

Languages: English

Supported use cases: classification, clustering, retrieval-augmented generation (RAG), and semantic search

Embed – Multilingual

Embed is Cohere’s text representation (embedding) model. This version supports multiple languages.

Dimensions: 1,024

Languages: multilingual (more than 100 languages supported)

Supported use cases: classification, clustering, retrieval-augmented generation (RAG), and semantic search

Cohere Command LLM

Organisations need generative artificial intelligence (generative AI) models that can safely work with data stored in their business data sources. Command R and Command R+ are robust, scalable large language models (LLMs) designed for real-world, enterprise-grade workloads. These multilingual models balance high efficiency with strong accuracy, excelling at capabilities such as retrieval-augmented generation (RAG) and tool use, and helping businesses move beyond proof of concept (POC) and into production with AI.

Command R is a scalable, multilingual generative model aimed at RAG, built to help companies apply AI in production. Command R+ is a state-of-the-art RAG-optimised model designed for enterprise-grade workloads and advanced corporate AI applications. Because in-line citations come as a standard feature, Command R+ is optimised for advanced RAG and delivers responses that are highly dependable, verifiable, and enterprise-ready.

With these new Cohere models in Bedrock, you can quickly surface the most relevant information to support work across business functions such as finance, human resources (HR), sales, marketing, and customer service, in a wide range of industries, and scale with AI as you do so. Command R+ also supports tool use. Like Command R, Command R+ is a robust multilingual model with a tokenizer that compresses non-English text far more efficiently than tokenizers found in other models on the market.

Introduction to Command R and Command R+

To begin using either model in Amazon Bedrock, you must first request access. In the Amazon Bedrock console, select Model access, then Manage model access. Choose the model or models you want, and select Save changes. With six Cohere models now available in Amazon Bedrock, including Command R and Command R+, you have more options and flexibility to pick the models that best fit your company’s needs.

Once you have access, you can use your chosen model in Amazon Bedrock. Refresh the base models table to see the latest status.

The models have been trained to respond in the user’s preferred language, including Arabic, English, French, Spanish, Italian, German, Brazilian Portuguese, Japanese, and Korean. For example, a prompt written in French will receive a French-language response.

Programming with Command R and Command R+

You can also call the Amazon Bedrock APIs using the AWS Software Development Kit (SDK) or the AWS Command Line Interface (CLI). Here is an example of Python code that uses the AWS SDK to call the Amazon Bedrock Runtime APIs, programmatically sending the same kind of text generation prompt shown earlier.
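A sketch of such a call, assuming the Bedrock request schema for Cohere’s chat models (a `message` field in, a `text` field out); the model IDs are those Bedrock lists for Cohere Command R and R+, and the helper names are our own:

```python
import json

# Model IDs as listed by Amazon Bedrock for Cohere's chat models.
COMMAND_R = "cohere.command-r-v1:0"
COMMAND_R_PLUS = "cohere.command-r-plus-v1:0"

def build_body(prompt, temperature=0.3):
    """Request body for Cohere Command R / R+ on Bedrock: a single user
    message plus a basic sampling setting."""
    return json.dumps({"message": prompt, "temperature": temperature})

def generate_text(prompt, model_id=COMMAND_R_PLUS, region="us-east-1"):
    """Invoke the model through the Bedrock Runtime API and return the
    generated text. Requires AWS credentials and granted model access."""
    import boto3  # AWS SDK for Python; imported lazily so build_body stays usable offline
    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.invoke_model(modelId=model_id, body=build_body(prompt))
    return json.loads(response["body"].read())["text"]
```

Because the models respond in the user’s preferred language, calling `generate_text` with a French prompt, for instance, would be expected to return a French answer.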

Availability

The Command R and Command R+ models, along with other Cohere models, are currently available in Amazon Bedrock in the US East (N. Virginia) and US West (Oregon) Regions. Check the full Region list for future updates.

Drakshi has been writing articles on Artificial Intelligence for govindhtech since June 2023. She holds a postgraduate degree in business administration and is an Artificial Intelligence enthusiast.

