Unable to get streaming response from BedrockChat in a custom agent #5917
Comments
Hey there, @MrHertal! I'm here to help you with the issue you've encountered. I'm a bot designed to assist with bug-solving, answering questions, and guiding potential contributors. Let's work together to resolve this issue!

You are unable to get a streaming response from `BedrockChat` in a custom agent and instead receive a single token with the full response because streaming is not enabled on the model. To enable streaming, you need to set the `streaming` property to `true` in the `BedrockChat` configuration:

```ts
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { HumanMessage } from "@langchain/core/messages";

async function runStreaming() {
  // Instantiate the BedrockChat model with the desired configuration
  const model = new BedrockChat({
    model: "anthropic.claude-3-sonnet-20240229-v1:0",
    region: "us-east-1",
    credentials: {
      accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
      secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
    },
    maxTokens: 150,
    temperature: 0.7,
    stopSequences: ["\n", " Human:", " Assistant:"],
    streaming: true, // Enable streaming
    trace: "ENABLED",
    guardrailIdentifier: "your-guardrail-id",
    guardrailVersion: "1.0",
    guardrailConfig: {
      tagSuffix: "example",
      streamProcessingMode: "SYNCHRONOUS",
    },
  });

  // Prepare the message to be sent to the model
  const message = new HumanMessage({ content: "Tell me a joke" });

  // Stream the response from the model
  const stream = await model.stream([message]);
  for await (const chunk of stream) {
    // Output each chunk of the response
    console.log(chunk.content);
  }
}

runStreaming().catch(console.error);
```

This configuration ensures that the `streaming` option is enabled, so the model's response is delivered chunk by chunk.
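As a side note, the streamed chunks are `AIMessageChunk` instances and can be merged back into a single message with `concat`. A minimal sketch, reusing `model` and `message` from the example above:

```ts
import { AIMessageChunk } from "@langchain/core/messages";

// Print tokens as they arrive while accumulating the full message.
let full: AIMessageChunk | undefined;
for await (const chunk of await model.stream([message])) {
  process.stdout.write(String(chunk.content));
  full = full === undefined ? chunk : full.concat(chunk);
}
console.log("\nFull response:", full?.content);
```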
Sorry @dosu but this is not helping. I'm using BedrockChat in the context of a custom agent, so I don't run the model directly.
To enable proper token-by-token streaming responses from `BedrockChat` when it is driven by a custom agent, set `streaming: true` on the model, attach a `handleLLMNewToken` callback, and pass `streamRunnable: true` when creating the agent:

```ts
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { BufferMemory } from "langchain/memory";
import { DynamoDBChatMessageHistory } from "@langchain/community/stores/message/dynamodb";
import { createToolCallingAgent, AgentExecutor } from "langchain/agents";

// `memoryTable`, `sessionId`, `tools`, `prompt`, and `message` come from
// the surrounding application.
const llm = new BedrockChat({
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  modelKwargs: {
    temperature: 0,
  },
  streaming: true,
  callbacks: [
    {
      async handleLLMNewToken(token) {
        console.log({ token });
      },
    },
  ],
});

const memory = new BufferMemory({
  chatHistory: new DynamoDBChatMessageHistory({
    tableName: memoryTable,
    partitionKey: "SessionId",
    sessionId,
  }),
  returnMessages: true,
  memoryKey: "chat_history",
  outputKey: "output",
});

const agent = createToolCallingAgent({
  llm,
  tools,
  prompt,
  streamRunnable: true, // Enable streaming
});

const agentExecutor = new AgentExecutor({
  agent,
  tools,
  memory,
});

const result = await agentExecutor.invoke({
  input: message,
});
```

By setting `streamRunnable: true` on the agent and `streaming: true` on the model, the underlying LLM calls are streamed and each new token is surfaced through the `handleLLMNewToken` callback, even though `invoke` itself still resolves with the final result.
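If wiring callbacks into the model feels too indirect, newer versions of LangChain JS also expose `streamEvents` on runnables, which surfaces the model's token events from inside the executor. A minimal sketch, reusing `agentExecutor` and `message` from above (exact event payloads depend on your version):

```ts
const eventStream = agentExecutor.streamEvents(
  { input: message },
  { version: "v2" }
);
for await (const event of eventStream) {
  // Token chunks from the underlying chat model arrive as
  // "on_chat_model_stream" events.
  if (event.event === "on_chat_model_stream" && event.data?.chunk?.content) {
    process.stdout.write(String(event.data.chunk.content));
  }
}
```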
By default, `invoke` waits for the run to finish and resolves with the complete result. Can you try the following code?

```ts
const result = await agentExecutor.stream({
  input: message,
});
```
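For reference, `AgentExecutor.stream` yields step-level chunks (agent actions, tool observations, and finally the output) rather than individual tokens, so consuming it looks like:

```ts
const stream = await agentExecutor.stream({ input: message });
for await (const chunk of stream) {
  // Each chunk is e.g. { intermediateSteps: [...] } or { output: "..." }.
  console.log(chunk);
}
```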
@jeasonnow thanks but it was not working. I finally made it work by implementing this solution: https://github.com/aws-samples/langchain-agents/tree/main/bedrock/langchain-js-stream-agent
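For readers landing here: the linked sample streams tokens out of the agent through the `handleLLMNewToken` callback and an AWS Lambda response stream, rather than through the executor's return value. A rough sketch of that idea, assuming a Lambda handler with response streaming enabled (`buildAgentExecutor` and the handler wiring are illustrative assumptions, not the sample's exact code):

```ts
import type { AgentExecutor } from "langchain/agents";

// `awslambda.streamifyResponse` is a global provided by the Lambda
// Node.js runtime when response streaming is enabled.
declare const awslambda: any;
// Hypothetical helper that assembles the BedrockChat-backed executor
// shown earlier in this thread.
declare function buildAgentExecutor(): Promise<AgentExecutor>;

export const handler = awslambda.streamifyResponse(
  async (event: any, responseStream: any) => {
    const agentExecutor = await buildAgentExecutor();
    await agentExecutor.invoke(
      { input: JSON.parse(event.body).message },
      {
        callbacks: [
          {
            // Fires for each token from the underlying model, even though
            // invoke() itself resolves with the full final answer.
            handleLLMNewToken(token: string) {
              responseStream.write(token);
            },
          },
        ],
      }
    );
    responseStream.end();
  }
);
```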
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
No response
Description
Hi,
I'm following this example: https://github.com/langchain-ai/langchainjs/blob/main/examples/src/agents/streaming.ts
to enable streaming in an agent response.
I'm using BedrockChat as the model with the following options:
I'm expecting to see a response from the agent streaming in the console, token by token.
But I always get a single token with the full response:
Am I missing something?
System Info