Vercel AI

Adds instrumentation for the Vercel AI SDK.

This integration only works in the Node.js, Cloudflare Workers, Vercel Edge Functions, and Bun runtimes. It requires Sentry SDK version 9.33.0 or higher.

Import name: Sentry.vercelAIIntegration
The vercelAIIntegration adds instrumentation for the ai SDK by Vercel, capturing spans via the AI SDK's built-in telemetry.

This integration is not enabled by default. You need to enable it manually by passing Sentry.vercelAIIntegration() to Sentry.init:
Sentry.init({
  dsn: "",
  tracesSampleRate: 1.0,
  integrations: [Sentry.vercelAIIntegration()],
});
To correctly capture spans, pass the experimental_telemetry object with isEnabled: true to every generateText, generateObject, and streamText function call. For more details, see the AI SDK Telemetry Metadata docs.
const result = await generateText({
  model: openai("gpt-4o"),
  experimental_telemetry: {
    isEnabled: true,
    recordInputs: true,
    recordOutputs: true,
  },
});
To make it easier to correlate captured spans with specific function calls, we recommend setting functionId in experimental_telemetry in all generation function calls:
const result = await generateText({
  model: openai("gpt-4o"),
  experimental_telemetry: {
    isEnabled: true,
    functionId: "my-awesome-function",
  },
});
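If many call sites need the same settings, a small helper can keep the functionId-tagged telemetry config consistent. This helper is our own sketch, not part of the AI SDK or Sentry's API, and the functionId values are illustrative:

```javascript
// Hypothetical helper (not part of the AI SDK): builds the
// experimental_telemetry object with telemetry enabled and a functionId,
// so every generation call site is tagged the same way.
function telemetry(functionId) {
  return { isEnabled: true, functionId };
}

// Usage sketch (model and functionId are illustrative):
// const result = await generateText({
//   model: openai("gpt-4o"),
//   experimental_telemetry: telemetry("summarize-article"),
// });
```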
By default, this integration adds tracing support to all ai function call sites. If you need to disable span collection for a specific call, set experimental_telemetry.isEnabled to false in the first argument of that function call.
const result = await generateText({
  model: openai("gpt-4o"),
  experimental_telemetry: { isEnabled: false },
});
If you set experimental_telemetry.recordInputs and experimental_telemetry.recordOutputs, those values override the default input and output collection behavior for that function call.
const result = await generateText({
  model: openai("gpt-4o"),
  experimental_telemetry: {
    isEnabled: true,
    recordInputs: true,
    recordOutputs: true,
  },
});
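The override also works in the other direction: you can keep the span but drop the recorded data. A sketch of disabling input capture for a call with potentially sensitive prompts (the model and prompt are illustrative, and this snippet assumes the same ai SDK imports as the examples above):

```javascript
const result = await generateText({
  model: openai("gpt-4o"),
  prompt: "Summarize this customer support ticket", // illustrative
  experimental_telemetry: {
    isEnabled: true,
    recordInputs: false, // span is still captured, but the prompt is not attached
    recordOutputs: true,
  },
});
```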
Supported versions:

ai: >=3.0.0 <5