Use Amazon Bedrock with AWS Step Functions to build generative AI applications
AWS is pleased to announce two new optimized integrations between Amazon Bedrock and AWS Step Functions. Step Functions is a visual workflow service that helps you build data and machine learning (ML) pipelines, develop distributed applications, automate processes, and orchestrate microservices.
In September, AWS released Amazon Bedrock, the easiest way to build and scale generative AI applications with foundation models (FMs).
Bedrock offers a choice of foundation models from leading providers such as Anthropic, Cohere, Stability AI, and Amazon, along with the broad set of capabilities that customers need to build generative AI applications while maintaining privacy and security.
You can use Amazon Bedrock through the AWS Management Console, the AWS Command Line Interface (AWS CLI), or the AWS SDKs.
With the newly launched optimized integrations for Step Functions, you can build generative AI applications with Amazon Bedrock while orchestrating tasks and integrating with more than 220 AWS services. Step Functions lets you build, inspect, and audit your workflows visually. Previously, using Amazon Bedrock from your workflows required calling an AWS Lambda function, which added code to maintain and increased the application's cost.
Step Functions offers two new optimized API actions for Amazon Bedrock:
InvokeModel: This integration lets you invoke a model and run inferences with the input provided in the parameters. Use this API action to run inferences for text, image, and embedding models.
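As a sketch, an InvokeModel task in a state machine definition (Amazon States Language) might look like the following. The model ID and the body fields are illustrative; the exact body format depends on the model you choose:

```json
{
  "Comment": "Illustrative state machine with one Amazon Bedrock InvokeModel task",
  "StartAt": "Invoke model",
  "States": {
    "Invoke model": {
      "Type": "Task",
      "Resource": "arn:aws:states:::bedrock:invokeModel",
      "Parameters": {
        "ModelId": "anthropic.claude-v2",
        "Body": {
          "prompt": "\n\nHuman: Summarize the input text.\n\nAssistant:",
          "max_tokens_to_sample": 300
        }
      },
      "End": true
    }
  }
}
```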
CreateModelCustomizationJob: This integration creates a fine-tuning job to customize a base model. In the parameters, you specify the foundation model and the location of the training data. When the job completes, your custom model is ready to be used.
This integration lets Step Functions make an asynchronous API call and wait for it to complete before proceeding to the next state. This means the state machine execution pauses while the model customization job runs and resumes automatically when the job finishes.
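A sketch of such a task state follows, using the Run a Job (.sync) integration pattern so the execution waits for the customization job to finish. The model, job, role, and bucket names are placeholders:

```json
{
  "Type": "Task",
  "Resource": "arn:aws:states:::bedrock:createModelCustomizationJob.sync",
  "Parameters": {
    "BaseModelIdentifier": "amazon.titan-text-express-v1",
    "CustomModelName": "my-custom-model",
    "JobName": "my-customization-job",
    "RoleArn": "arn:aws:iam::111122223333:role/MyBedrockCustomizationRole",
    "TrainingDataConfig": {
      "S3Uri": "s3://my-training-bucket/data.jsonl"
    },
    "OutputDataConfig": {
      "S3Uri": "s3://my-training-bucket/output/"
    }
  },
  "End": true
}
```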
The InvokeModel API action supports request and response payloads of up to 25 MB. However, the maximum state payload input and output in Step Functions is 256 KB. To handle larger payloads, this integration lets you specify an Amazon Simple Storage Service (Amazon S3) bucket from which the InvokeModel API reads its input and to which it stores its result. You can provide these settings in the API action's configuration parameters section.
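For example, a task that reads its input from S3 and writes the model response back to S3 might be configured like this (the bucket names and object keys are illustrative):

```json
{
  "Type": "Task",
  "Resource": "arn:aws:states:::bedrock:invokeModel",
  "Parameters": {
    "ModelId": "anthropic.claude-v2",
    "Input": {
      "S3Uri": "s3://my-prompt-bucket/request.json"
    },
    "Output": {
      "S3Uri": "s3://my-response-bucket/response.json"
    },
    "ContentType": "application/json",
    "Accept": "application/json"
  },
  "End": true
}
```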
How to get started with Amazon Bedrock and AWS Step Functions
Before you begin, make sure you create the state machine in an AWS Region where Amazon Bedrock is available. This example uses US East (N. Virginia).
Create a new state machine in the AWS Management Console. When you search for "bedrock", the two available API actions appear. Drag the InvokeModel action into the state machine.
You can now configure that state in the menu on the right. First, choose the foundation model you want to use. Either select a model from the list or get the model dynamically from the input.
Next, configure the model parameters. You can enter the inference parameters in the text box or load them from Amazon S3.
If you scroll down in the API action configuration, you can specify additional settings, such as the S3 destination bucket. If this field is set, the API action stores the API response in the specified bucket instead of returning it in the state output. Here you can also specify the content type for the request and response.
When you finish configuring your state machine, you can create and run it. While the state machine runs, you can select the Amazon Bedrock state, view the execution details, and inspect the state's input and output.
With Step Functions, you can build state machines as complex as you need, combining different services to solve many problems. For example, you can use Step Functions with Amazon Bedrock to build applications that use prompt chaining.
Prompt chaining is a technique for building complex generative AI applications by passing the FM several smaller, simpler prompts instead of one long, detailed one. To build a prompt chain, you can create a state machine that calls Amazon Bedrock repeatedly to get an inference for each of the smaller prompts. You can run these calls concurrently using the Parallel state, and then use an AWS Lambda function to merge the responses from the parallel branches into a single response and generate a result.
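A minimal sketch of the merging Lambda function follows. It assumes each parallel branch returns a Bedrock response whose Body contains a completion field (the shape used by Anthropic Claude models; other models use different field names), and that the Parallel state passes the list of branch outputs as the event:

```python
# Hypothetical Lambda handler that merges the outputs of the parallel
# Step Functions branches into one combined response. A Parallel state
# emits a list containing each branch's output, so `event` is assumed
# to be that list.

def handler(event, context):
    # Pull the model reply out of each branch result. The Body.completion
    # field is an assumption based on the Claude response format.
    completions = [branch["Body"]["completion"].strip() for branch in event]
    # Join the partial answers into a single result for the next state.
    return {"combined_response": "\n\n".join(completions)}
```

In a real workflow, this function would be invoked from a Lambda task state placed after the Parallel state.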