# LangChain
Adds instrumentation for LangChain.
This integration works in the Node.js, Cloudflare Workers, Vercel Edge Functions, and browser runtimes. It requires SDK version 10.22.0 or higher.
*Import name: `Sentry.langChainIntegration`*

The `langChainIntegration` adds instrumentation for LangChain, automatically wrapping LangChain operations to capture spans and record AI agent interactions, with configurable input/output recording.
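In runtimes where the SDK instruments LangChain automatically (such as Node.js), enabling the integration in `Sentry.init` is enough. A minimal setup sketch (the DSN placeholder and sampling rate are illustrative):

```javascript
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: "__YOUR_DSN__", // placeholder: use your project's DSN
  // Sending prompts and responses by default requires PII consent
  sendDefaultPii: true,
  // Tracing must be enabled for spans to be captured
  tracesSampleRate: 1.0,
  integrations: [Sentry.langChainIntegration()],
});
```

With this in place, LangChain calls made anywhere in the application are traced without further code changes.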
For Cloudflare Workers, you need to manually instrument LangChain operations using the `createLangChainCallbackHandler` helper:
```javascript
import * as Sentry from "@sentry/cloudflare";
import { ChatAnthropic } from "@langchain/anthropic";

// Create a LangChain callback handler
const callbackHandler = Sentry.createLangChainCallbackHandler({
  recordInputs: true, // Optional: record input prompts/messages
  recordOutputs: true, // Optional: record output responses
});

// Use with chat models
const model = new ChatAnthropic({
  model: "claude-3-5-sonnet-20241022",
  apiKey: process.env.ANTHROPIC_API_KEY,
});

await model.invoke("Tell me a joke", {
  callbacks: [callbackHandler],
});
```
## Options

### `recordInputs`

*Type: `boolean`*

Records inputs to LangChain operations (such as prompts and messages).

Defaults to `true` if `sendDefaultPii` is `true`.

```javascript
Sentry.init({
  integrations: [Sentry.langChainIntegration({ recordInputs: true })],
});
```
### `recordOutputs`

*Type: `boolean`*

Records outputs from LangChain operations (such as generated text and responses).

Defaults to `true` if `sendDefaultPii` is `true`.

```javascript
Sentry.init({
  integrations: [Sentry.langChainIntegration({ recordOutputs: true })],
});
```
By default, this integration adds tracing support for LangChain operations, including:

- Chat model invocations (`gen_ai.chat`) - Captures spans for chat model calls
- LLM invocations (`gen_ai.pipeline`) - Captures spans for LLM pipeline executions
- Chain executions (`gen_ai.invoke_agent`) - Captures spans for chain invocations
- Tool executions (`gen_ai.execute_tool`) - Captures spans for tool calls
The integration automatically instruments the following LangChain runnable methods:

- `invoke()` - Single execution
- `stream()` - Streaming execution
- `batch()` - Batch execution
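All three methods are traced the same way. A hedged sketch of streaming and batch calls using the manual callback-handler pattern shown above (the model name and prompts are illustrative):

```javascript
import * as Sentry from "@sentry/cloudflare";
import { ChatAnthropic } from "@langchain/anthropic";

const callbackHandler = Sentry.createLangChainCallbackHandler({
  recordInputs: true,
  recordOutputs: true,
});

const model = new ChatAnthropic({
  model: "claude-3-5-sonnet-20241022",
  apiKey: process.env.ANTHROPIC_API_KEY,
});

// stream() - the streamed chunks are captured within a single span
const stream = await model.stream("Write a haiku about tracing", {
  callbacks: [callbackHandler],
});
for await (const chunk of stream) {
  console.log(chunk.content);
}

// batch() - multiple inputs in one call, each execution is traced
await model.batch(["Hello", "Goodbye"], {
  callbacks: [callbackHandler],
});
```

In automatically instrumented runtimes such as Node.js, the `callbacks` option is not needed; the same `invoke()`, `stream()`, and `batch()` calls are traced out of the box.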
## Supported Versions

- `langchain`: `>=0.1.0 <1.0.0`