OpenAI
Adds instrumentation for OpenAI API.
This integration works in the Node.js, Cloudflare Workers, and Vercel Edge Functions runtimes. It requires SDK version 10.2.0 or higher.
Import name: Sentry.openAIIntegration
The openAIIntegration adds instrumentation for the openai API to capture spans, automatically wrapping OpenAI client calls and recording LLM interactions with configurable input/output recording.
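In runtimes with automatic instrumentation, such as Node.js, the integration is enabled through Sentry.init. A minimal sketch, with a placeholder DSN and sample rate that you would replace with your own configuration:
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: "https://examplePublicKey@o0.ingest.sentry.io/0", // placeholder DSN
  tracesSampleRate: 1.0, // enable tracing so OpenAI spans are sent
  integrations: [Sentry.openAIIntegration()],
});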
For Cloudflare Workers, you need to manually instrument the OpenAI client using the instrumentOpenAiClient
helper:
import * as Sentry from "@sentry/cloudflare";
import OpenAI from "openai";
const openai = new OpenAI();
const client = Sentry.instrumentOpenAiClient(openai, {
recordInputs: true,
recordOutputs: true,
});
// Use the wrapped client instead of the original openai instance
const response = await client.chat.completions.create({
model: "gpt-4o",
messages: [{ role: "user", content: "Hello!" }],
});
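Inside a Worker, the wrapped client is then used from your handler. A minimal sketch, assuming the Sentry.withSentry wrapper from @sentry/cloudflare, a placeholder DSN, and an OPENAI_API_KEY secret (both names are illustrative):
import * as Sentry from "@sentry/cloudflare";
import OpenAI from "openai";

export default Sentry.withSentry(
  (env) => ({
    dsn: "https://examplePublicKey@o0.ingest.sentry.io/0", // placeholder DSN
    tracesSampleRate: 1.0, // enable tracing so OpenAI spans are sent
  }),
  {
    async fetch(request, env, ctx) {
      // Wrap the OpenAI client so its calls are captured as spans.
      const client = Sentry.instrumentOpenAiClient(
        new OpenAI({ apiKey: env.OPENAI_API_KEY }), // assumed secret binding
        { recordInputs: true, recordOutputs: true }
      );

      const response = await client.chat.completions.create({
        model: "gpt-4o",
        messages: [{ role: "user", content: "Hello!" }],
      });

      return new Response(response.choices[0]?.message?.content ?? "");
    },
  }
);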
recordInputs
Type: boolean
Records inputs to OpenAI API method calls (such as prompts and messages).
Defaults to true if sendDefaultPii is true.
Sentry.init({
integrations: [Sentry.openAIIntegration({ recordInputs: true })],
});
recordOutputs
Type: boolean
Records outputs from OpenAI API method calls (such as generated text and responses).
Defaults to true if sendDefaultPii is true.
Sentry.init({
integrations: [Sentry.openAIIntegration({ recordOutputs: true })],
});
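Because both options default to true when sendDefaultPii is true, you can also opt in globally instead of setting them per integration. A sketch, assuming you are comfortable sending this data to Sentry:
Sentry.init({
  sendDefaultPii: true, // makes recordInputs and recordOutputs default to true
  integrations: [Sentry.openAIIntegration()],
});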
By default, this integration adds tracing support to OpenAI API method calls, including:
chat.completions.create() - Chat completion requests
responses.create() - Responses API requests
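For example, a Responses API request uses the standard openai SDK call shape; the model and input below are placeholders, and the client is assumed to be instrumented as described above:
import OpenAI from "openai";

const client = new OpenAI(); // instrumented automatically in Node.js once Sentry is initialized

const response = await client.responses.create({
  model: "gpt-4o",
  input: "Summarize the latest deploy notes.",
});

console.log(response.output_text); // convenience accessor for the generated text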
The integration will automatically detect streaming vs non-streaming requests and handle them appropriately.
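Streaming requests need no extra configuration. Continuing with the same client, a short sketch in a Node.js context, again with placeholder model and prompt:
const stream = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Write a haiku about tracing." }],
  stream: true, // the integration detects the streaming request automatically
});

for await (const chunk of stream) {
  // Each chunk carries a partial delta of the generated text.
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}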
Supported versions:
openai: >=4.0.0 <6