
fix(tools): implement dynamic tool output truncation to prevent LLM context overflow#317

Open
Adar5 wants to merge 1 commit into jenkinsci:main from Adar5:fix/tool-output-final

Conversation

Contributor

@Adar5 Adar5 commented Mar 28, 2026

The Problem

By default, LangChain injects the raw string output of any executed tool directly into the context window. If a tool returns a massive, unchunked payload (e.g., a large plugin document or a future Jenkins log fetcher), the request to the LLM provider immediately fails with a TokenLimitExceeded or out-of-memory error, freezing the chat session.

The Solution

This PR implements a lightweight, dynamically configured defensive wrapper (truncate_tool_output decorator) around the TOOL_REGISTRY.

Key Changes:

Configuration: Added max_tool_output_length to config.yaml (default: 4000), allowing server admins to control memory limits without touching Python code.

Interception: The decorator intercepts all tool executions before they reach the LangChain agent, hard-capping string outputs at the configured limit.

Observability: Added standard server-side logging (logger.warning) so DevOps is notified when a truncation event occurs.

Context Awareness: Appends a clean [SYSTEM WARNING] so the LLM context is explicitly aware that the data was truncated and can inform the user.
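Taken together, the changes above amount to a small decorator. The following is a minimal sketch of how such a wrapper could look, not the exact code in this PR; the names `truncate_tool_output` and `max_tool_output_length` come from the description, while the module-level constant and notice text are illustrative assumptions:

```python
import functools
import logging

logger = logging.getLogger(__name__)

# Assumed to be loaded from config.yaml's max_tool_output_length (default: 4000).
MAX_TOOL_OUTPUT_LENGTH = 4000

# Illustrative wording; appended so the LLM knows the data is incomplete.
TRUNCATION_NOTICE = (
    "\n[SYSTEM WARNING] Tool output exceeded {limit} characters and was "
    "truncated. Inform the user that the data shown is incomplete."
)


def truncate_tool_output(func):
    """Hard-cap string outputs before they reach the LangChain agent."""

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        if isinstance(result, str) and len(result) > MAX_TOOL_OUTPUT_LENGTH:
            # Server-side observability: flag every truncation event.
            logger.warning(
                "Tool '%s' output truncated from %d to %d characters",
                func.__name__, len(result), MAX_TOOL_OUTPUT_LENGTH,
            )
            return (
                result[:MAX_TOOL_OUTPUT_LENGTH]
                + TRUNCATION_NOTICE.format(limit=MAX_TOOL_OUTPUT_LENGTH)
            )
        return result

    return wrapper
```

Applying the decorator while building TOOL_REGISTRY means no individual tool has to know about the limit.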

Testing

[x] Added test_tool_overflow.py to assert massive 50,000+ character strings are safely truncated based on the config.yaml limit without blocking the event loop.
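A self-contained sketch of the kind of assertion test_tool_overflow.py makes; the inline `truncate` helper and the 4000-character limit stand in for the real module and config.yaml value:

```python
def truncate(text: str, limit: int = 4000) -> str:
    # Stand-in for the decorated tool path; limit mirrors max_tool_output_length.
    if len(text) > limit:
        return text[:limit] + "\n[SYSTEM WARNING] Output truncated."
    return text


def test_massive_output_is_truncated():
    huge = "A" * 50_000  # well past the configured limit
    result = truncate(huge)
    assert result.startswith("A" * 4000)
    assert "[SYSTEM WARNING]" in result
    assert len(result) < len(huge)
```

Because the truncation is a plain synchronous string slice, it returns immediately and never blocks the event loop.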

@Adar5 Adar5 requested a review from a team as a code owner March 28, 2026 06:40
@Adar5 Adar5 force-pushed the fix/tool-output-final branch from 0e37ac6 to 882c508 Compare March 28, 2026 06:51
