
Build Flexible AI Agents with AWS Strands Agents SDK


Overview

Strands Agents is a newly launched open-source SDK from AWS that simplifies building AI agents through a model-driven approach. It lets developers build powerful, flexible agents in just a few lines of code. With support for tools, multiple model providers, and multi-agent systems, Strands Agents is designed for everything from prototyping to full-scale production. This blog post dives into what Strands Agents is, why it matters, and how you can get started.


Introduction

AI agents are becoming a crucial part of modern applications, helping users write code, analyze documents, automate workflows, and troubleshoot infrastructure. However, building these agents can be time-consuming and complex. Traditional frameworks often require elaborate orchestration logic, detailed workflows, and months of tuning before an agent is production-ready.

To address these challenges, AWS has introduced Strands Agents, a flexible, open-source SDK designed to speed up and simplify AI agent development. Inspired by the progress of large language models (LLMs) and their improved reasoning capabilities, Strands lets developers focus on what matters: designing smarter agents without all the overhead.

Strands Agents

Strands Agents is an SDK that helps developers define and run AI agents by bringing together three core components:

  1. A Model – for example, a Claude model from Anthropic, a Llama model from Meta, or other models available through Amazon Bedrock.
  2. Tools – These allow the model to perform actions such as making API calls, searching documents, or executing code.
  3. A Prompt – The task or instruction that the agent will carry out.

Think of it like giving the AI a brain (the model), hands (tools), and a goal (the prompt). The agent then enters a loop where it thinks, plans, uses tools, and responds until the task is complete.
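To make the loop concrete, here is a minimal, framework-free sketch of that think–act–observe cycle. The model is a stub and all names are illustrative, not Strands APIs:

```python
# A toy agentic loop: the "model" plans a tool call, the loop executes it,
# feeds the observation back, and repeats until the model returns an answer.

def calculator(expression: str) -> str:
    """A tiny tool the 'model' can call (demo only; eval is restricted)."""
    return str(eval(expression, {"__builtins__": {}}))

def fake_model(prompt: str, history: list) -> dict:
    """Stand-in for an LLM: decides whether to call a tool or finish."""
    if not history:  # first pass: plan a tool call
        return {"action": "tool", "tool": "calculator", "input": "2 + 3"}
    return {"action": "final", "answer": f"The result is {history[-1]}"}

def run_agent(prompt: str, tools: dict) -> str:
    history = []
    while True:  # think -> act -> observe -> repeat
        step = fake_model(prompt, history)
        if step["action"] == "final":
            return step["answer"]
        result = tools[step["tool"]](step["input"])
        history.append(result)  # feed the observation back to the model

print(run_agent("What is 2 + 3?", {"calculator": calculator}))  # → The result is 5
```

In Strands, this loop is driven by a real LLM rather than a stub, but the shape is the same: the model, not hand-written control flow, decides which tool to use next.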

Why Strands?

While earlier agent frameworks required intricate scaffolding and manual tool selection logic, Strands leverages the native abilities of modern LLMs to reason and choose actions dynamically. This results in agents that are easier to build, faster to deploy, and more reliable in production.

Teams inside AWS, like those working on Amazon Q Developer, AWS Glue, and VPC Reachability Analyzer, are already using Strands in production environments.

How It Works

Here’s how you create an agent using Strands:

  • Step 1: Import Strands and define your tools using Python decorators or pre-built MCP (Model Context Protocol) tools.
  • Step 2: Choose a model provider (such as Meta, OpenAI, Bedrock, Anthropic, or Ollama).
  • Step 3: Write a system prompt that defines the behavior of your agent.
  • Step 4: Run the AI agent by passing it a user query.

For example, the Strands team built a naming assistant agent that checks domain name and GitHub organization availability for project naming suggestions, using just a few lines of code.
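The four steps above can be sketched in a few lines. The `strands` import and `Agent` wiring follow the SDK's documented shape but should be treated as illustrative; the helper function is plain Python and works standalone, and actually invoking the agent requires model access (e.g. Amazon Bedrock credentials):

```python
# Hedged sketch of a tiny naming assistant built with Strands.
import re

def looks_like_domain(name: str) -> bool:
    """Heuristic check: lowercase letters, digits, and internal hyphens only."""
    return re.fullmatch(r"[a-z0-9][a-z0-9-]{0,18}[a-z0-9]", name) is not None

try:
    from strands import Agent, tool  # pip install strands-agents

    name_checker = tool(looks_like_domain)  # expose the helper as a tool
    agent = Agent(
        system_prompt="You suggest short project names and validate them.",
        tools=[name_checker],
    )
    # agent("Suggest three names for a log-analysis CLI.")  # needs model access
except ImportError:
    pass  # SDK not installed; the helper above still runs on its own
```

The point to notice is what is absent: no routing logic, no state machine. The system prompt defines behavior, the tool defines capability, and the model plans the rest.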

Key Features of Strands Agents

  • Model-Driven Logic: Let the model plan and make decisions without complex control flows.
  • Customizable Tool Use: Easily add or remove tools depending on your use case.
  • Multi-Agent Collaboration: Build workflows and graph-based agents using sub-agents.
  • Flexible Deployment: Use it locally, behind an API, or in distributed environments with AWS Lambda or AWS Fargate.
  • Observability Built-in: Uses OpenTelemetry to track agent performance in production.
  • Extensible: Add support for more models or tools as needed.
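The multi-agent collaboration feature boils down to an "agents as tools" idea: an orchestrator delegates sub-tasks to specialist sub-agents just as it would call any other tool. A framework-free sketch (all names hypothetical, not Strands APIs):

```python
# Two stub "specialist" agents and an orchestrator that chains them.

def research_agent(task: str) -> str:
    """Sub-agent stub: would gather context with its own model and tools."""
    return f"research notes on: {task}"

def writer_agent(material: str) -> str:
    """Sub-agent stub: would turn gathered material into prose."""
    return f"draft based on ({material})"

def orchestrator(task: str) -> str:
    # The orchestrator "plans" a two-step workflow: research, then write.
    notes = research_agent(task)
    return writer_agent(notes)

print(orchestrator("observability best practices"))
```

In Strands, each sub-agent would be a real `Agent` wrapped as a tool, so the orchestrating model, rather than fixed code, decides when to delegate.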

Deployment Options

Strands is built to be production-ready. You can deploy agents in several ways:

  • Locally on a Client – Great for development and simple tasks.
  • Behind an API – Host agents via AWS Lambda, AWS Fargate, or Amazon EC2 for scalable apps.
  • Tool Isolation – Run tools separately in secure backends while the agent operates elsewhere.
  • Return-of-Control Model – Let clients decide how to run or trigger tools while the agent focuses on reasoning.
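The return-of-control model deserves a small sketch: the agent only *requests* a tool call, and the client decides how (and whether) to execute it before sending the result back. All names here are illustrative:

```python
# The reasoning side emits structured requests; the client owns execution.

def agent_step(observation=None) -> dict:
    """Stand-in for the reasoning side: request a tool, then answer."""
    if observation is None:
        return {"type": "tool_request", "tool": "get_time", "args": {}}
    return {"type": "answer", "text": f"The server time is {observation}"}

def client_loop() -> str:
    step = agent_step()
    if step["type"] == "tool_request":
        # Client-side execution: swap in sandboxing, auth, or a secure
        # backend here without touching the agent's reasoning at all.
        result = "12:00" if step["tool"] == "get_time" else "unsupported"
        step = agent_step(result)
    return step["text"]

print(client_loop())  # → The server time is 12:00
```

This separation is what makes the tool-isolation deployment option above possible: reasoning and execution can live in different processes, accounts, or networks.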

No matter how you deploy, Strands gives you the tools to monitor, trace, and debug agents easily.

Getting Started

Here’s what you need to start building:

  • A Python environment with the strands-agents and strands-agents-tools packages installed.
  • AWS credentials and access to Amazon Bedrock (especially Anthropic Claude 3.7 Sonnet).
  • A GitHub token if using GitHub tools.
  • Optionally, a local MCP server for faster development.
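Assuming a recent Python environment, the two packages listed above can be installed with pip (package names as published on PyPI):

```shell
# Install the SDK and its companion collection of pre-built tools
pip install strands-agents strands-agents-tools
```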

After that, you can start building agents in just a few minutes. The full source code and documentation for Strands Agents are available on GitHub.

Conclusion

Strands Agents marks a major step forward in making AI agent development more accessible and efficient. By simplifying orchestration and focusing on model-driven workflows, it empowers developers to build, test, and deploy agents in days rather than months.

Whether you’re building internal tools, developer assistants, or complex automation agents, Strands gives you the flexibility, scalability, and control you need.

Drop a query if you have any questions regarding Strands Agents and we will get back to you quickly.


About CloudThat

CloudThat is a leading provider of Cloud Training and Consulting services with a global presence in India, the USA, Asia, Europe, and Africa. Specializing in AWS, Microsoft Azure, GCP, VMware, Databricks, and more, the company serves mid-market and enterprise clients, offering comprehensive expertise in Cloud Migration, Data Platforms, DevOps, IoT, AI/ML, and more.

CloudThat is the first Indian Company to win the prestigious Microsoft Partner 2024 Award and is recognized as a top-tier partner with AWS and Microsoft, including the prestigious ‘Think Big’ partner award from AWS and the Microsoft Superstars FY 2023 award in Asia & India. Having trained 850k+ professionals in 600+ cloud certifications and completed 500+ consulting projects globally, CloudThat is an official AWS Advanced Consulting Partner, Microsoft Gold Partner, AWS Training Partner, AWS Migration Partner, AWS Data and Analytics Partner, AWS DevOps Competency Partner, AWS GenAI Competency Partner, Amazon QuickSight Service Delivery Partner, Amazon EKS Service Delivery Partner, AWS Microsoft Workload Partner, Amazon EC2 Service Delivery Partner, Amazon ECS Service Delivery Partner, AWS Glue Service Delivery Partner, Amazon Redshift Service Delivery Partner, AWS Control Tower Service Delivery Partner, AWS WAF Service Delivery Partner, Amazon CloudFront Service Delivery Partner, Amazon OpenSearch Service Delivery Partner, AWS DMS Service Delivery Partner, AWS Systems Manager Service Delivery Partner, Amazon RDS Service Delivery Partner, AWS CloudFormation Service Delivery Partner, AWS Config, Amazon EMR, and many more.

FAQs

1. What makes Strands different from other agent frameworks?

ANS: – Strands uses a model-driven approach, relying on the reasoning capabilities of modern LLMs rather than pre-defined workflows. This reduces complexity and speeds up development.

2. Can I use models from OpenAI or other LLM model providers?

ANS: – Yes! Strands supports models from many providers, including OpenAI (via LiteLLM), Anthropic, Meta, and Amazon Bedrock. You can even add your own custom provider.

WRITTEN BY Aditya Kumar

Aditya Kumar works as a Research Associate at CloudThat. His expertise lies in Data Analytics, and he is gaining practical experience in AWS and Data Analytics. Aditya is passionate about continuously expanding his skill set and is keen to learn new technologies.
