AI agents are rapidly transforming enterprise operations. Although a single agent can perform specific tasks effectively, complex business processes often span multiple systems, requiring data retrieval, analysis, decision-making, and action execution across them. With multi-agent collaboration, specialized AI agents can work together to automate intricate workflows.
This post explores a practical collaboration, integrating Salesforce Agentforce with Amazon Bedrock Agents and Amazon Redshift, to automate enterprise workflows.
Enterprise environments today are complex, featuring diverse technologies across multiple systems. Salesforce and AWS provide distinct advantages to customers. Many organizations already maintain significant infrastructure on AWS, including data, AI, and various business applications such as ERP, finance, supply chain, HRMS, and workforce management systems. Agentforce delivers powerful AI-driven agent capabilities that are grounded in enterprise context and data. While Salesforce provides a rich source of trusted business data, customers increasingly need agents that can access and act on information across multiple systems. By integrating AWS-powered AI services into Agentforce, organizations can orchestrate intelligent agents that operate across Salesforce and AWS, unlocking the strengths of both.
Agentforce and Amazon Bedrock Agents can work together in flexible ways, leveraging the unique strengths of both platforms to deliver smarter, more comprehensive AI workflows. Example collaboration models include:
This integration creates a more powerful solution that maximizes the benefits of both Salesforce and AWS, so you can achieve better business outcomes through enhanced AI capabilities and cross-system functionality.
Agentforce brings digital labor to every employee, department, and business process, augmenting teams and elevating customer experiences. It works seamlessly with your existing applications, data, and business logic to take meaningful action across the enterprise. And because it’s built on the trusted Salesforce platform, your data stays secure, governed, and in your control. With Agentforce, you can:
Amazon Bedrock is a fully managed AWS service offering access to high-performing foundation models (FMs) from various AI companies through a single API. In this post, we discuss the following features:
Agentforce can call Amazon Bedrock agents in different ways, allowing flexibility to build different architectures. The following diagram illustrates synchronous and asynchronous patterns.
For a synchronous or request-reply interaction, Agentforce uses custom agent actions facilitated by External Services, Apex Invocable Methods, or Flow to call an Amazon Bedrock agent. The authentication to AWS is facilitated using named credentials. Named credentials are designed to securely manage authentication details for external services integrated with Salesforce. They alleviate the need to hardcode sensitive information like user names and passwords, minimizing the risk of exposure and potential data breaches. This separation of credentials from the application code can significantly enhance security posture. Named credentials streamline integration by providing a centralized and consistent method for handling authentication, reducing complexity and potential errors. You can use Salesforce Private Connect to provide a secure private connection with AWS using AWS PrivateLink. Refer to Private Integration Between Salesforce and Amazon API Gateway for additional details.
For asynchronous calls, Agentforce uses Salesforce Event Relay and Flow with Amazon EventBridge to call an Amazon Bedrock agent.
In this post, we discuss the synchronous call pattern. We encourage you to explore Salesforce Event Relay with EventBridge to build event-driven agentic AI workflows. Agentforce also offers the Agent API, which makes it straightforward to call an Agentforce agent from an Amazon Bedrock agent, using EventBridge API destinations, for bi-directional agentic AI workflows.
To illustrate the multi-agent collaboration between Agentforce and AWS, we use the following architecture, which provides access to Internet of Things (IoT) sensor data to the Agentforce agent and handles potentially erroneous sensor readings using a multi-agent approach.
The example workflow consists of the following steps:
This workflow demonstrates how Amazon Bedrock Agents orchestrates tasks, using Amazon Bedrock Knowledge Bases for context and action groups (through Lambda) to interact with Agentforce to complete the end-to-end process.
Before building this architecture, make sure you have the following:
Make sure your data is structured and available in your Redshift instance. Note the database name, credentials, and table and column names.
For this post, we create two IAM roles:
custom_AmazonBedrockExecutionRoleForAgents:
Attach the AmazonBedrockFullAccess and AmazonRedshiftDataFullAccess managed policies, and use the following trust policy:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AmazonBedrockAgentBedrockFoundationModelPolicyProd",
      "Effect": "Allow",
      "Principal": {
        "Service": "bedrock.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "aws:SourceAccount": "YOUR_ACCOUNT_ID"
        }
      }
    }
  ]
}
```
custom_AWSLambdaExecutionRole:
Attach the AmazonBedrockFullAccess and AWSLambdaBasicExecutionRole managed policies, and use the following trust policy:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "aws:SourceAccount": "YOUR_ACCOUNT_ID"
        }
      }
    }
  ]
}
```
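The two trust policies differ only in the service principal that is allowed to assume the role. The following sketch generates both documents programmatically (the account ID is a placeholder; this is illustrative, not a required setup step):

```python
import json

def build_trust_policy(service: str, account_id: str) -> dict:
    """Build an IAM trust policy allowing the given AWS service to
    assume the role, scoped to a single account via aws:SourceAccount."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"Service": service},
                "Action": "sts:AssumeRole",
                "Condition": {
                    "StringEquals": {"aws:SourceAccount": account_id}
                },
            }
        ],
    }

# Placeholder account ID for illustration.
bedrock_trust = build_trust_policy("bedrock.amazonaws.com", "123456789012")
lambda_trust = build_trust_policy("lambda.amazonaws.com", "123456789012")
print(json.dumps(bedrock_trust, indent=2))
```

Scoping the trust policy with `aws:SourceAccount` prevents the confused-deputy problem, where a service could be tricked into assuming the role on behalf of another account.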
Complete the following steps to create an Amazon Bedrock knowledge base:
For the service role, choose custom_AmazonBedrockExecutionRoleForAgents.


```sql
CREATE USER "IAMR:custom_AmazonBedrockExecutionRoleForAgents" WITH PASSWORD DISABLE;
GRANT SELECT ON ALL TABLES IN SCHEMA dev.knowledgebase TO "IAMR:custom_AmazonBedrockExecutionRoleForAgents";
GRANT USAGE ON SCHEMA dev.knowledgebase TO "IAMR:custom_AmazonBedrockExecutionRoleForAgents";
```
For more information, refer to Set up your query engine and permissions for creating a knowledge base with a structured data store.
Make sure the status shows as Complete before moving to the next steps.
Complete the following steps to create an Amazon Bedrock agent:
For the IAM role, choose custom_AmazonBedrockExecutionRoleForAgents. Enter the following agent instructions:

```
You are an IoT device monitoring and alerting agent.
You have access to structured data containing reading, maintenance, and threshold data for IoT devices.
You answer questions about device readings, maintenance schedules, and thresholds.
You can also create a case via Agentforce.
When you receive comma-separated values, parse them as device_id, temperature, voltage, connectivity, and error_code.
First check whether the temperature is less than the minimum temperature, more than the maximum temperature, or the connectivity is more than the connectivity threshold for the product associated with the device ID.
If there is an error code, send the information to Agentforce to create a case. The information sent to Agentforce should include device readings such as the device ID and error code.
It should also include the threshold values for the product associated with the device, such as minimum temperature, maximum temperature, and connectivity.
In the response to your call to Agentforce, return a summary of the information provided with all the attributes included.
Do not omit any information in the response. Do not include the word escalated in the response.
```
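The parsing and threshold checks described in the agent instructions can be sketched in plain Python. The field order and threshold names come from the instructions above; the sample thresholds and reading are illustrative:

```python
def parse_reading(csv_line: str) -> dict:
    """Parse 'device_id,temperature,voltage,connectivity,error_code'."""
    device_id, temperature, voltage, connectivity, error_code = (
        part.strip() for part in csv_line.split(","))
    return {
        "device_id": device_id,
        "temperature": float(temperature),
        "voltage": float(voltage),
        "connectivity": float(connectivity),
        "error_code": error_code or None,
    }

def check_thresholds(reading: dict, thresholds: dict) -> list:
    """Return the list of threshold violations for a reading."""
    violations = []
    if reading["temperature"] < thresholds["min_temperature"]:
        violations.append("temperature below minimum")
    if reading["temperature"] > thresholds["max_temperature"]:
        violations.append("temperature above maximum")
    if reading["connectivity"] > thresholds["connectivity"]:
        violations.append("connectivity above threshold")
    return violations

# Illustrative reading and product thresholds.
reading = parse_reading("sensor-42, 91.5, 3.3, 12.0, E101")
violations = check_thresholds(
    reading,
    {"min_temperature": 10.0, "max_temperature": 85.0, "connectivity": 10.0})
# A non-empty violations list plus an error code is what prompts the
# agent to send the details to Agentforce for case creation.
```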

Complete the following steps to create a Lambda function to receive requests from Agentforce:
```python
import json
import logging
import pprint
import traceback
import uuid

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

bedrock_agent_runtime_client = boto3.client(
    service_name="bedrock-agent-runtime",
    region_name="REGION_NAME",  # replace with the Region from your account
)

def lambda_handler(event, context):
    logger.info(event)
    body = event['body']
    input_text = json.loads(body)['inputText']
    agent_id = 'XXXXXXXX'       # replace with the agent ID from your account
    agent_alias_id = 'XXXXXXX'  # replace with the alias ID from your account

    response = call_agent(input_text, agent_id, agent_alias_id)
    logger.info("response: %s", response)
    return {
        'headers': {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Headers': '*',
            'Access-Control-Allow-Origin': '*',
            'Access-Control-Allow-Methods': '*'
        },
        'statusCode': 200,
        'body': json.dumps({"outputText": response})
    }

def call_agent(input_text, agent_id, agent_alias_id):
    session_id = str(uuid.uuid1())  # random session identifier
    try:
        agent_response = bedrock_agent_runtime_client.invoke_agent(
            inputText=input_text,
            agentId=agent_id,
            agentAliasId=agent_alias_id,
            sessionId=session_id,
            enableTrace=False,
            endSession=False
        )
        logger.info("Agent raw response:")
        pprint.pprint(agent_response)
        if 'completion' not in agent_response:
            raise ValueError("Missing 'completion' in agent response")
        # The completion is a stream of events; return the first text chunk.
        for event in agent_response['completion']:
            chunk = event.get('chunk')
            if chunk:
                return chunk.get("bytes").decode()
    except Exception as e:
        print(traceback.format_exc())
        return f"Error: {str(e)}"
```
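The handler's request/response contract can be exercised locally with a stub standing in for the Bedrock agent call. The `inputText`/`outputText` field names match the Lambda code above; the stubbed handler below is a simplified sketch, not the deployed function:

```python
import json

def make_handler(call_agent):
    """Return a minimal handler wired to an injectable call_agent,
    mirroring the Lambda's request/response shape."""
    def handler(event, context=None):
        input_text = json.loads(event["body"])["inputText"]
        response = call_agent(input_text)
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"outputText": response}),
        }
    return handler

# Stub standing in for the Bedrock agent invocation.
handler = make_handler(lambda text: f"echo: {text}")
result = handler({"body": json.dumps({"inputText": "device readings for sensor-42"})})
output = json.loads(result["body"])["outputText"]
```

This mirrors what Agentforce sends through API Gateway (a JSON body with `inputText`) and what it expects back (a JSON body with `outputText`).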
For the execution role, choose custom_AWSLambdaExecutionRole.
Complete the following steps to create a REST API in API Gateway:
Now that you have created an Amazon Bedrock agent with an API Gateway endpoint and Lambda wrapper, let’s complete the configuration on the Salesforce side. Complete the following steps:
Now you can grant access to the agent user to access these credentials.
You can optionally use Salesforce Private Connect with PrivateLink to provide a secure private connection with AWS. This allows critical data to flow from the Salesforce environment to AWS without traversing the public internet.
Complete the following steps to add an external service in Salesforce:
```yaml
openapi: 3.0.0
info:
  title: Bedrock Agent Wrapper API
  version: 1.0.0
  description: Bedrock Agent Wrapper API
paths:
  /proxy:
    post:
      operationId: call-bedrock-agent
      summary: Call Bedrock Agent
      description: Call Bedrock Agent
      requestBody:
        description: input
        required: true
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/input'
      responses:
        '200':
          description: Successful response
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/output'
        '500':
          description: Server error
components:
  schemas:
    input:
      type: object
      properties:
        inputText:
          type: string
        agentId:
          type: string
        agentAlias:
          type: string
    output:
      type: object
      properties:
        outputText:
          type: string
```
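A request body matching the input schema above can be posted from any HTTP client. The following sketch builds the payload; the endpoint URL and IDs are placeholders, and the actual POST is left commented so the snippet runs without network access:

```python
import json

API_URL = "https://YOUR_API_ID.execute-api.REGION.amazonaws.com/prod/proxy"  # placeholder

# Build a body matching the 'input' schema from the OpenAPI definition.
payload = {
    "inputText": "sensor-42, 91.5, 3.3, 12.0, E101",
    "agentId": "YOUR_AGENT_ID",     # placeholder
    "agentAlias": "YOUR_ALIAS_ID",  # placeholder
}
body = json.dumps(payload)

# Uncomment to call the deployed endpoint:
# import urllib.request
# req = urllib.request.Request(
#     API_URL, data=body.encode(),
#     headers={"Content-Type": "application/json"})
# print(json.loads(urllib.request.urlopen(req).read())["outputText"])
```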
Complete the following steps to create an Agentforce agent action:
Complete the following steps to configure the Agentforce agent to use the agent action:
If a user asks for device readings or sensor readings, provide the information.
If a user asks for device maintenance or sensor maintenance, provide the information.
When searching for device information, include the device or sensor id and any relevant keywords in your search query.
The agent will indicate that it is calling the Amazon Bedrock agent, as shown in the following screenshot.
The agent fetches the information from the associated knowledge base through the Amazon Bedrock agent.

To avoid additional costs, delete the resources that you created when you no longer need them:
In this post, we described an architecture that demonstrates the power of combining AI services on AWS with Agentforce. By using Amazon Bedrock Agents and Amazon Bedrock Knowledge Bases for contextual understanding through Retrieval Augmented Generation (RAG), and Lambda functions and API Gateway to bridge interactions with Agentforce, businesses can build sophisticated, automated workflows. As AI capabilities continue to grow, such collaborative multi-agent systems will become increasingly central to enterprise automation strategies. In an upcoming post, we will show you how to build the asynchronous integration pattern from Agentforce to Amazon Bedrock using Salesforce Event Relay.
To get started, see Become an Agentblazer Innovator and refer to How Amazon Bedrock Agents works.
Yogesh Dhimate is a Sr. Partner Solutions Architect at AWS, leading the technology partnership with Salesforce. Prior to joining AWS, Yogesh worked with leading companies, including Salesforce, driving their industry solution initiatives. With over 20 years of experience in product management and solutions architecture, Yogesh brings a unique perspective in cloud computing and artificial intelligence.
Kranthi Pullagurla has over 20 years of experience in application integration and cloud migrations across multiple cloud providers. He works with AWS Partners to build solutions on AWS that our joint customers can use. Prior to joining AWS, Kranthi was a strategic advisor at MuleSoft (now Salesforce). Kranthi has experience advising C-level customer executives on their digital transformation journey in the cloud.
Shitij Agarwal is a Partner Solutions Architect at AWS. He creates joint solutions with strategic ISV partners to deliver value to customers. When not at work, he is busy exploring New York city and the hiking trails that surround it, and going on bike rides.
Ross Belmont is a Senior Director of Product Management at Salesforce covering Platform Data Services. He has more than 15 years of experience with the Salesforce ecosystem.
Sharda Rao is a Senior Director of Product Management at Salesforce covering Agentforce go-to-market strategy.
Hunter Reh is an AI Architect at Salesforce and a passionate builder who has developed over 100 agents since the launch of Agentforce. Outside of work, he enjoys exploring new trails on his bike or getting lost in a great book.