Overview
In today’s fast-evolving AI landscape, businesses increasingly rely on conversational interfaces to engage with users. Amazon Lex, an AI-powered service for building conversational interfaces, combined with AWS Lambda and Amazon Bedrock, offers a powerful solution for dynamic and context-aware interactions. This blog will explore how to integrate Amazon Lex with AWS Lambda, where the Amazon Bedrock code is written to generate and return dynamic responses to user prompts.
Introduction
In this blog, we will explore how to integrate Amazon Lex, AWS Lambda, and Amazon Bedrock to build intelligent conversational interfaces. Amazon Lex, a service for creating chatbots and voice assistants, enables businesses to interact with users using natural language. By integrating Amazon Lex with AWS Lambda, which handles backend logic, and Amazon Bedrock, which provides access to powerful pre-built AI models, we can create dynamic, context-aware conversations.
This setup is ideal for creating smart bots for customer support, personalized recommendations, and real-time data generation tasks.
Steps for Creating Bot and Integration
Step 1 – Create an Amazon Lex Bot and Intent.
Step 2 – Set Up the AWS Lambda Function
Create an AWS Lambda function and use the code below to invoke the Amazon Bedrock model and return dynamic responses to user input from the Amazon Lex bot.
import json
import boto3


def lambda_handler(event, context):
    # Fields supplied by Amazon Lex in the event payload
    slots = event['sessionState']['intent']['slots']
    intent = event['sessionState']['intent']['name']
    transcript = event['inputTranscript']

    input_agent_id = '12348'  # currently unused; kept for reference

    # Inference parameters for the Bedrock model
    maxTokens = 120
    temperature = 1
    topP = 0.5
    topK = 250

    if event['inputTranscript']:
        input_prompt = f"""
Act as an end user who is experiencing an issue with the shipment of a product ordered online.
Reply to the troubleshooting agent to resolve your problem.
When the support agent asks for your name, respond with a female person's name.
Order details are below; pick any one from each list.
Shipping ID / Tracking number: [694176859452, 854846775273, 345887299912, 423196321092, 584798753106, 163093876498, 450941415746, 293371977560, 20851103199, 873369788204]
Product Name: ["Fridge", "Television", "Fan", "Microwave", "Washing Machine", "Laptop", "Smartphone", "Air Conditioner", "Vacuum Cleaner", "Blender"]
"""
        final_transcript = f"\n\nHuman:{input_prompt} Sales-agent:{transcript}\n\nAssistant:"
        print("Input_prompt", input_prompt)
        customer_response = call_anthropic(final_transcript, maxTokens, temperature, topP, topK)
        return {
            "sessionState": {
                "dialogAction": {
                    "type": "ElicitIntent"
                },
                "intent": {
                    "name": intent,
                    "state": "InProgress"
                }
            },
            "messages": [
                {
                    "contentType": "PlainText",
                    "content": customer_response
                }
            ]
        }


def call_anthropic(final_transcript, s_maxTokens, s_temperature, s_topP, s_topK):
    # Invoke the Claude v2 text-completion model on Amazon Bedrock
    bedrock = boto3.client('bedrock-runtime', region_name="xx-xxxx-2")
    body = json.dumps({
        "prompt": final_transcript,
        "max_tokens_to_sample": s_maxTokens,
        "temperature": s_temperature,
        "top_p": s_topP,
        "top_k": s_topK,
    })
    modelId = 'anthropic.claude-v2'
    accept = 'application/json'
    contentType = 'application/json'
    response = bedrock.invoke_model(body=body, modelId=modelId, accept=accept, contentType=contentType)
    response_body = json.loads(response.get('body').read())
    claude_out = response_body["completion"]
    return claude_out
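For reference, Amazon Lex V2 delivers the Lambda an event shaped roughly like the following. This is a trimmed sketch with hypothetical values (the real event carries more keys, such as bot, sessionId, and invocationSource), showing the same field accesses the handler above relies on:

```python
# Trimmed sketch of a Lex V2 event payload; values are hypothetical.
sample_event = {
    "inputTranscript": "My washing machine has not arrived yet.",
    "sessionState": {
        "intent": {
            "name": "ShipmentIssueIntent",
            "slots": {}
        }
    }
}

# The same lookups performed at the top of lambda_handler:
intent_name = sample_event["sessionState"]["intent"]["name"]
transcript = sample_event["inputTranscript"]
print(intent_name, "-", transcript)
```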
Step 3 – Configure AWS Lambda Trigger in Amazon Lex.
Follow the steps below to enable the invocation of the AWS Lambda Function.
Click the Build button to build the Amazon Lex Bot with the changes.
Now click the Test button, select the AWS Lambda function and its version, and test the conversation.
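As an alternative to the console's Test window, you can send test utterances to the bot programmatically through the Lex V2 runtime API. This is a minimal sketch assuming boto3 is installed and AWS credentials are configured; the bot IDs below are placeholders you must replace with your own:

```python
# Placeholder identifiers -- substitute your bot's actual values.
params = {
    "botId": "EXAMPLEBOTID",
    "botAliasId": "TSTALIASID",
    "localeId": "en_US",
    "sessionId": "test-session-1",
    "text": "Hi, I have an issue with my order.",
}

# Uncomment to send the utterance to your built bot alias:
# import boto3
# client = boto3.client("lexv2-runtime")
# response = client.recognize_text(**params)
# for message in response.get("messages", []):
#     print(message["content"])
```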
Provide input in the Amazon Lex test window and observe the output generated by the Amazon Bedrock model invoked from the AWS Lambda function. The prompt is written for a customer-care conversation use case.
Step 4 – Invoking the Amazon Bedrock Model from AWS Lambda
The call_anthropic function in the Python script above invokes the Amazon Bedrock model, which generates conversation based on your prompts and use case.
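The request body for the anthropic.claude-v2 text-completion model must be JSON-serialized with the parameter names shown below. This small helper isolates that serialization step (the helper name is my own; the parameter names match the script above):

```python
import json

def build_claude_v2_body(prompt, max_tokens=120, temperature=1, top_p=0.5, top_k=250):
    """Serialize the request body expected by the anthropic.claude-v2
    text-completion API on Amazon Bedrock."""
    return json.dumps({
        "prompt": prompt,
        "max_tokens_to_sample": max_tokens,
        "temperature": temperature,
        "top_p": top_p,
        "top_k": top_k,
    })

body = build_claude_v2_body("\n\nHuman: Where is my order?\n\nAssistant:")
print(body)
```

Note that Claude v2's completion endpoint requires the prompt to follow the "\n\nHuman: ... \n\nAssistant:" turn format, which is why the handler wraps the transcript before calling Bedrock.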
Step 5 – Returning Amazon Bedrock Response as output to Amazon Lex Bot
Amazon Lex expects the return statement to follow the structure below so that it can display the Amazon Bedrock response as the bot's output.
return {
    "sessionState": {
        "dialogAction": {
            "type": "ElicitIntent"
        },
        "intent": {
            "name": intent,
            "state": "InProgress"
        }
    },
    "messages": [
        {
            "contentType": "PlainText",
            "content": customer_response
        }
    ]
}
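If you return this envelope from several places, it can help to wrap it in a small helper. This is a sketch of my own (the function name is not part of the original script); it builds the same Lex V2 ElicitIntent structure:

```python
def build_lex_response(intent_name, message, state="InProgress"):
    """Build the Lex V2 response envelope that elicits the next user
    utterance while echoing the model's reply back to the user."""
    return {
        "sessionState": {
            "dialogAction": {"type": "ElicitIntent"},
            "intent": {"name": intent_name, "state": state},
        },
        "messages": [
            {"contentType": "PlainText", "content": message}
        ],
    }

resp = build_lex_response("ShipmentIssueIntent",
                          "Sure, can you share your tracking number?")
print(resp)
```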
Conclusion
Integrating Amazon Lex with AWS Lambda, leveraging the capabilities of Amazon Bedrock, enables you to build dynamic, intelligent conversational agents. By using Amazon Lex to capture user input and AWS Lambda to process it through Amazon Bedrock's generative models, businesses can enhance the accuracy and effectiveness of their chatbot experiences. This integration opens up possibilities for more natural, context-aware interactions, ultimately improving user satisfaction and operational efficiency. With the flexibility and scalability provided by AWS services, this architecture can be adapted to a wide range of applications, from customer service to personalized recommendations, driving innovation and automation across industries.
Drop a query if you have any questions regarding Amazon Lex, AWS Lambda or Amazon Bedrock and we will get back to you quickly.
About CloudThat
CloudThat is an award-winning company and the first in India to offer cloud training and consulting services worldwide. As a Microsoft Solutions Partner, AWS Advanced Tier Training Partner, and Google Cloud Platform Partner, CloudThat has empowered over 850,000 professionals through 600+ cloud certifications, winning global recognition for its training excellence, including 20 MCT Trainers in Microsoft's Global Top 100 and 12 awards in the last 8 years. CloudThat specializes in Cloud Migration, Data Platforms, DevOps, IoT, and cutting-edge technologies like Gen AI & AI/ML. It has delivered over 500 consulting projects for 250+ organizations in 30+ countries as it continues to empower professionals and enterprises to thrive in the digital-first world.
FAQs
1. How does the AWS Lambda function receive input from Lex?
ANS: – The AWS Lambda function receives the user input from Lex in a JSON format, including parameters like the intentName, slots (user-provided data), and session attributes. The function can then process this input to determine the appropriate response or take action.
2. What are the costs involved in this integration?
ANS: – Costs can vary based on the usage of Amazon Lex, AWS Lambda, and Amazon Bedrock. You’ll be charged for Amazon Lex requests, the AWS Lambda compute time, and the API calls made to Bedrock. AWS provides detailed pricing for each service on its respective pages.

WRITTEN BY Sridhar Andavarapu
Sridhar Andavarapu is a Senior Research Associate at CloudThat, specializing in AWS, Python, SQL, data analytics, and Generative AI. With extensive experience in building scalable data pipelines, interactive dashboards, and AI-driven analytics solutions, he helps businesses transform complex datasets into actionable insights. Passionate about emerging technologies, Sridhar actively researches and shares insights on AI, cloud analytics, and business intelligence. Through his work, he aims to bridge the gap between data and strategy, helping enterprises unlock the full potential of their analytics infrastructure.