
Issue with ChatBedrockConverse and Bedrock Claude Model: ValidationException on Blank Text Field #5960

Closed
5 tasks done
Jalmeida1994 opened this issue Jul 2, 2024 · 10 comments · Fixed by #6043
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature


@Jalmeida1994

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import { RunnableConfig } from '@langchain/core/runnables'
import { ChatBedrockConverse } from '@langchain/aws'
import { BaseMessage } from '@langchain/core/messages'
import { DocumentInterface } from '@langchain/core/documents'

interface ChatState {
  messages: BaseMessage[]
  sourceDocuments: DocumentInterface[]
}

// Excerpted from a class; this.aws_region, credentialsProvider,
// retrieverTool, and logger are defined elsewhere.
const model = new ChatBedrockConverse({
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
  region: this.aws_region,
  credentials: credentialsProvider,
  maxTokens: 2500,
  temperature: 0.05
})

private async agentNode(
  state: ChatState,
  config?: RunnableConfig
): Promise<Partial<ChatState>> {
  const tools = [retrieverTool]
  const modelWithTools = model.bindTools(tools)

  const response = await modelWithTools.invoke(state.messages, config)
  logger.debug(`Model response: ${JSON.stringify(response)}`)

  return { messages: [response] }
}

Any example from the LangGraph.js examples repository that uses tools should reproduce this, as long as it is run with ChatBedrockConverse and the model "anthropic.claude-3-sonnet-20240229-v1:0".

Error Message and Stack Trace (if applicable)

Error in CompletionController.generateCompletion():

ValidationException: The text field in the ContentBlock object at messages.1.content.0 is blank. Add text to the text field, and try again.
    at de_ValidationExceptionRes (/Users/test/code/bot/node_modules/.pnpm/@aws-sdk+client-bedrock-runtime@3.606.0/node_modules/@aws-sdk/client-bedrock-runtime/dist-cjs/index.js:1082:21)
    at de_CommandError (/Users/test/code/bot/node_modules/.pnpm/@aws-sdk+client-bedrock-runtime@3.606.0/node_modules/@aws-sdk/client-bedrock-runtime/dist-cjs/index.js:937:19)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+middleware-serde@3.0.3/node_modules/@smithy/middleware-serde/dist-cjs/index.js:35:20
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+core@2.2.4/node_modules/@smithy/core/dist-cjs/index.js:165:18
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+middleware-retry@3.0.7/node_modules/@smithy/middleware-retry/dist-cjs/index.js:320:38
    at async /Users/test/code/bot/node_modules/.pnpm/@aws-sdk+middleware-logger@3.598.0/node_modules/@aws-sdk/middleware-logger/dist-cjs/index.js:34:22
    at async ChatBedrockConverse._generateNonStreaming (/Users/test/code/bot/node_modules/.pnpm/@langchain+aws@0.0.1_@aws-sdk+client-sso-oidc@3.606.0_@aws-sdk+client-sts@3.606.0__@aws-sdk+c_wops3cctew2kpyygicqzgjk7su/node_modules/@langchain/aws/dist/chat_models.cjs:222:26)
    at async Promise.allSettled (index 0)
    at async ChatBedrockConverse._generateUncached (/Users/test/code/bot/node_modules/.pnpm/@langchain+core@0.2.12_langchain@0.2.8_@aws-sdk+credential-provider-node@3.600.0_@aws-sdk+cli_6wi7pugnjf7kzne7hkijzr5xv4/node_modules/@langchain/core/dist/language_models/chat_models.cjs:177:29)
    (timestamp: 2024-07-02T13:59:04.567Z)

Description

I'm encountering an issue when invoking the Bedrock Claude model via the Converse API using LangChain's ChatBedrockConverse. The error message is:

ValidationException: The text field in the ContentBlock object at messages.1.content.0 is blank. Add text to the text field, and try again.

This issue does not occur when using ChatOpenAI or BedrockChat. It seems specific to ChatBedrockConverse.

Steps to Reproduce:

  1. Use LangChain/LangGraph to create a conversation with the Bedrock Claude model via the Converse API.
  2. Implement a flow that involves tool usage, leading to empty content messages.
  3. Observe the error: ValidationException due to a blank text field in the ContentBlock object.

Expected Behavior:
The model should handle tool usage without producing empty-content messages, or the integration should handle such messages gracefully, which it currently does not.

Actual Behavior:
The process fails with a ValidationException when an empty content message is encountered.

Additional Information:

  • This issue started occurring after upgrading to ChatBedrockConverse.
  • Manually adding text to the empty content AI messages results in a different error: The model returned the following errors: Your API request included an 'assistant' message in the final position, which would pre-fill the 'assistant' response. When using tools, pre-filling the 'assistant' response is not supported.

Example AI message with empty content calling a tool:

{
  "AIMessage": {
    "lc_serializable": true,
    "lc_kwargs": {
      "content": "",
      "tool_calls": [{ "id": "tooluse_e68eeY31Rju83TWq5n-e1A", "name": "retrieve_internal_documents", "args": {}}],
      "response_metadata": {
        "$metadata": { "httpStatusCode": 200, "requestId": "fba07d1c-cfda-4609-a603-ad589da3374a", "attempts": 1, "totalRetryDelay": 0 },
        "metrics": { "latencyMs": 1718 },
        "stopReason": "tool_use",
        "usage": { "inputTokens": 446, "outputTokens": 56, "totalTokens": 502 }
      },
      "usage_metadata": { "input_tokens": 446, "output_tokens": 56, "total_tokens": 502 },
      "id": "fba07d1c-cfda-4609-a603-ad589da3374a",
      "invalid_tool_calls": [],
      "additional_kwargs": {}
    }
  }
}
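For anyone hitting this before a fix lands: the Converse API rejects a blank `text` field but accepts a content-block array that carries only a `toolUse` block, so one interim workaround is to rewrite empty-content assistant turns before they are sent. This is a minimal standalone sketch on plain objects; `PlainMessage` and `sanitizeForConverse` are hypothetical names for illustration, not LangChain APIs:

```typescript
// Hypothetical shape of a plain message just before serialization.
interface PlainMessage {
  role: "user" | "assistant" | "tool";
  content: string | Array<Record<string, unknown>>;
  tool_calls?: Array<{ id: string; name: string; args: Record<string, unknown> }>;
}

// Replace the blank text with an array containing only toolUse blocks,
// so no ContentBlock with an empty `text` field reaches the API.
function sanitizeForConverse(msg: PlainMessage): PlainMessage {
  if (msg.role === "assistant" && msg.content === "" && msg.tool_calls?.length) {
    return {
      ...msg,
      content: msg.tool_calls.map((tc) => ({
        toolUse: { toolUseId: tc.id, name: tc.name, input: tc.args },
      })),
    };
  }
  return msg;
}
```

Unlike dropping the message entirely, this keeps the assistant turn (and its tool call) in the history, so a subsequent ToolMessage still has a matching tool call to answer.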

PS: If you feel this issue might be better shared with the LangGraph JS maintainers or if it involves specific nuances of the AWS Bedrock Converse API, please let me know. I initially reported it here since ChatBedrockConverse is hosted within LangChain.

System Info

$ pnpm info langchain

langchain@0.2.8 | MIT | deps: 16 | versions: 280
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/

keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores

dist
.tarball: https://<internal-proxy>/langchain/-/langchain-0.2.8.tgz
.shasum: 9bd77f5c12071d0ccb637c04fc33415e5369e5aa
.integrity: sha512-kb2IOMA71xH8e6EXFg0l4S+QSMC/c796pj1+7mPBkR91HHwoyHZhFRrBaZv4tV+Td+Ba91J2uEDBmySklZLpNQ==

dependencies:
@langchain/core: >=0.2.9 <0.3.0   js-tiktoken: ^1.0.12              langsmith: ~0.1.30                uuid: ^9.0.0
@langchain/openai: >=0.1.0 <0.3.0 js-yaml: ^4.1.0                   ml-distance: ^4.0.0               yaml: ^2.2.1
@langchain/textsplitters: ~0.0.0  jsonpointer: ^5.0.1               openapi-types: ^12.1.3            zod-to-json-schema: ^3.22.3
binary-extensions: ^2.2.0         langchainhub: ~0.0.8              p-retry: 4                        zod: ^3.22.4

maintainers:
- nfcampos <nuno@boringbits.io>
- jacoblee93 <jacoblee93@gmail.com>
- andrewnguonly <andrewnguonly@gmail.com>
- davidduong <david@duong.cz>
- hwchase17 <hw.chase.17@gmail.com>
- basproul <braceasproul@gmail.com>

dist-tags:
latest: 0.2.8     next: 0.2.3-rc.0

published 4 days ago by basproul <braceasproul@gmail.com>
  • platform: Mac arm64
  • Node version: v20.15.0
  • yarn version: N/A (pnpm version: 9.4.0)
@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label Jul 2, 2024

dosubot bot commented Jul 2, 2024

Hey @Jalmeida1994! 👋 I'm here to help you out with any bugs, questions, or contributions while we wait for a human maintainer. Let's squash those bugs together!

To resolve the ValidationException error indicating that the text field in the ContentBlock object at messages.1.content.0 is blank, you need to ensure that all messages passed to the invoke method contain non-empty content.

Here's an example of how you can invoke the ChatBedrockConverse model with properly populated messages:

import { ChatBedrockConverse } from "@langchain/aws";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatBedrockConverse({
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  region: "us-east-1",
  credentials: {
    accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
  },
});

const messages = [
  new HumanMessage({ content: "Print hello world" }),
  new HumanMessage({ content: "What is the weather today?" })
];

const res = await model.invoke(messages);
console.log(res);

Ensure that each HumanMessage or any other message type you use has non-empty content. This should prevent the ValidationException error related to blank text fields.

Additionally, the convertToConverseMessages function in the libs/langchain-aws/src/common.ts file ensures that the content of each message is properly converted to the required format. Make sure that your messages are correctly formatted and contain valid content blocks, as shown in the example [1][2][3].

To continue talking to Dosu, mention @dosu.

@Jalmeida1994

Jalmeida1994 commented Jul 2, 2024

Hey @dosu, you friendly bot!

I understand the need to ensure that all messages passed to the invoke method contain non-empty content. However, the issue arises from the agent itself adding the empty content message. The content is empty because of the stopReason: "tool_use".

When the agent decides to use a tool, it stops and returns the tool it will use. The tool executes correctly, and in the message history we get a ToolMessage with the correct content. However, immediately before that ToolMessage there is an AIMessage indicating tool usage but with empty content.

Example of the message history:

"HumanMessage": {
  "lc_serializable": true,
  "lc_kwargs": {
    "content": "Use a tool please",
    "additional_kwargs": {},
    "response_metadata": {}
  },
  "lc_namespace": [ "langchain_core", "messages" ],
  "content": "Use a tool please",
  "name": "undefined",
  "additional_kwargs": {},
  "response_metadata": {},
  "id": "undefined"
},
{
  "AIMessage": {
    "lc_serializable": true,
    "lc_kwargs": {
      "content": "",
      "tool_calls": [{ "id": "tooluse_e68eeY31Rju83TWq5n-e1A", "name": "retrieve_internal_documents", "args": {}}],
      "response_metadata": {
        "$metadata": { "httpStatusCode": 200, "requestId": "fba07d1c-cfda-4609-a603-ad589da3374a", "attempts": 1, "totalRetryDelay": 0 },
        "metrics": { "latencyMs": 1718 },
        "stopReason": "tool_use",
        "usage": { "inputTokens": 446, "outputTokens": 56, "totalTokens": 502 }
      },
      "usage_metadata": { "input_tokens": 446, "output_tokens": 56, "total_tokens": 502 },
      "id": "fba07d1c-cfda-4609-a603-ad589da3374a",
      "invalid_tool_calls": [],
      "additional_kwargs": {}
    }
  }
},
"ToolMessage": {
  "lc_serializable": true,
  "lc_kwargs": {
    "content": "bla bla"
<rest of correct output from tool>

This issue didn't occur with ChatOpenAI or BedrockChat, and only started after upgrading to ChatBedrockConverse. I believe the agent is correctly performing the tool usage but somehow leaves an empty AIMessage behind and the Converse API does not like that.
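The pairing above matters: the ToolMessage answers the tool call issued by the empty-content AIMessage, so any workaround that filters that AIMessage out orphans the tool result. A standalone sketch that checks this invariant on plain objects (the `Msg` shape and `hasOrphanToolResults` are illustrative names, not LangChain APIs):

```typescript
interface Msg {
  role: "human" | "ai" | "tool";
  content: string;
  tool_calls?: Array<{ id: string; name: string }>;
  tool_call_id?: string; // set on tool messages, points back at the AI turn
}

// Every tool result must follow an AI message that issued the matching
// tool call; otherwise the history is inconsistent.
function hasOrphanToolResults(history: Msg[]): boolean {
  const issued = new Set<string>();
  for (const m of history) {
    if (m.role === "ai") m.tool_calls?.forEach((tc) => issued.add(tc.id));
    if (m.role === "tool" && (!m.tool_call_id || !issued.has(m.tool_call_id))) {
      return true;
    }
  }
  return false;
}
```

With the history shown above, removing the empty-content AIMessage makes this check fail, which is consistent with the follow-up error reported later in the thread.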


dosubot bot commented Jul 2, 2024

The issue you're encountering with ChatBedrockConverse seems to be related to how the agent handles tool usage, resulting in an empty AIMessage content. This behavior is specific to ChatBedrockConverse and wasn't present with ChatOpenAI or BedrockChat.

To address this, you can filter out any AIMessage objects with empty content before invoking the model. Here's an example of how you can modify your code to achieve this:

import { ChatBedrockConverse } from "@langchain/aws";
import { HumanMessage, AIMessage } from "@langchain/core/messages";

const model = new ChatBedrockConverse({
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  region: "us-east-1",
  credentials: {
    accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
  },
});

const filterEmptyMessages = (messages) => {
  return messages.filter(message => {
    if (message instanceof AIMessage && !message.content) {
      return false;
    }
    return true;
  });
};

const messages = [
  new HumanMessage({ content: "Use a tool please" }),
  new AIMessage({
    content: "",
    tool_calls: [{ id: "tooluse_e68eeY31Rju83TWq5n-e1A", name: "retrieve_internal_documents", args: {} }],
    response_metadata: {
      $metadata: { httpStatusCode: 200, requestId: "fba07d1c-cfda-4609-a603-ad589da3374a", attempts: 1, totalRetryDelay: 0 },
      metrics: { latencyMs: 1718 },
      stopReason: "tool_use",
      usage: { inputTokens: 446, outputTokens: 56, totalTokens: 502 }
    },
    usage_metadata: { input_tokens: 446, output_tokens: 56, total_tokens: 502 },
    id: "fba07d1c-cfda-4609-a603-ad589da3374a",
    invalid_tool_calls: [],
    additional_kwargs: {}
  }),
  new HumanMessage({ content: "What is the weather today?" })
];

const filteredMessages = filterEmptyMessages(messages);

const res = await model.invoke(filteredMessages);
console.log(res);

This code filters out any AIMessage objects with empty content before passing the messages to the invoke method, which should prevent the ValidationException error [1][2][3].


@Jalmeida1994

Jalmeida1994 commented Jul 2, 2024

We tried filtering out any AIMessage objects with empty content before invoking the model. However, we encounter another issue when we do this. Here’s the error message we receive:

ValidationException: The model returned the following errors: Your API request included an `assistant` message in the final position, which would pre-fill the `assistant` response. When using tools, pre-filling the `assistant` response is not supported.
    at de_ValidationExceptionRes (/Users/test/code/bot/node_modules/.pnpm/@aws-sdk+client-bedrock-runtime@3.606.0/node_modules/@aws-sdk/client-bedrock-runtime/dist-cjs/index.js:1082:21)
    at de_CommandError (/Users/test/code/bot/node_modules/.pnpm/@aws-sdk+client-bedrock-runtime@3.606.0/node_modules/@aws-sdk/client-bedrock-runtime/dist-cjs/index.js:937:19)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+middleware-serde@3.0.3/node_modules/@smithy/middleware-serde/dist-cjs/index.js:35:20
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+core@2.2.4/node_modules/@smithy/core/dist-cjs/index.js:165:18
    at async /Users/test/code/bot/node_modules/.pnpm/@smithy+middleware-retry@3.0.7/node_modules/@smithy/middleware-retry/dist-cjs/index.js:320:38
    at async /Users/test/code/bot/node_modules/.pnpm/@aws-sdk+middleware-logger@3.598.0/node_modules/@aws-sdk/middleware-logger/dist-cjs/index.js:34:22
    at async ChatBedrockConverse._generateNonStreaming (/Users/test/code/bot/node_modules/.pnpm/@langchain+aws@0.0.1_@aws-sdk+client-sso-oidc@3.606.0_@aws-sdk+client-sts@3.606.0__@aws-sdk+c_wops3cctew2kpyygicqzgjk7su/node_modules/@langchain/aws/dist/chat_models.cjs:222:26)
    at async Promise.allSettled (index 0)
    at async ChatBedrockConverse._generateUncached (/Users/test/code/bot/node_modules/.pnpm/@langchain+core@0.2.12_langchain@0.2.8_@aws-sdk+credential-provider-node@3.600.0_@aws-sdk+cli_6wi7pugnjf7kzne7hkijzr5xv4/node_modules/@langchain/core/dist/language_models/chat_models.cjs:177:29)

It seems that when a ToolMessage is the last message, the Converse API does not generate a completion, perhaps because the final message needs to be a human message (or this may be a completely unrelated issue). This behavior didn't occur with ChatOpenAI or BedrockChat, and only started after upgrading to ChatBedrockConverse.
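For reference, the Converse API only accepts "user" and "assistant" roles, and tool results travel as toolResult content blocks inside a user-role message, so a history ending in a ToolMessage should serialize to a trailing user turn, while a trailing assistant turn is what triggers the pre-fill error above. A minimal sketch of that mapping (hypothetical helpers, not the library's actual conversion code):

```typescript
type Role = "human" | "ai" | "tool";

// Converse only has "user" and "assistant" roles; tool results are
// carried inside user-role messages as toolResult content blocks.
function toConverseRole(role: Role): "user" | "assistant" {
  return role === "ai" ? "assistant" : "user";
}

// When tools are bound, Converse rejects a request whose final message is
// an assistant turn (it would pre-fill the response), so check the tail.
function endsWithAssistant(roles: Role[]): boolean {
  return roles.length > 0 && toConverseRole(roles[roles.length - 1]) === "assistant";
}
```

Under this mapping, a history ending in a ToolMessage is fine (it serializes to a user turn), but a history ending in an AIMessage, including an empty tool-use one, is not.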

@crcorbett

+1

@dsunjka

dsunjka commented Jul 9, 2024

+1

@danny-avila

Same issue as @Jalmeida1994 when a ToolMessage is present as the last message

@dsunjka

dsunjka commented Jul 9, 2024

FYI - not related to LangChain, but I hit this problem when using the Bedrock Converse API directly. I switched to the raw InvokeModel API and didn't see it anymore.

@MarcoLooy

@dsunjka if you don't mind, would you please be so kind as to share the code snippet that solved this problem?

@dsaks9

dsaks9 commented Jul 16, 2024

Was this fix also made to Python client?

7 participants