---
title: "Vercel AI"
description: "Adds instrumentation for Vercel AI SDK."
url: https://docs.sentry.io/platforms/javascript/guides/cloudflare/configuration/integrations/vercelai/
---

# Vercel AI | Sentry for Cloudflare

Requires SDK version `10.6.0` or higher on Node.js, Cloudflare Workers, Vercel Edge Functions, and Bun, and SDK version `10.12.0` or higher on Deno.

*Import name: `Sentry.vercelAIIntegration`*

The `vercelAIIntegration` adds instrumentation for the [`ai`](https://www.npmjs.com/package/ai) SDK by Vercel to capture spans using the [AI SDK's built-in telemetry](https://sdk.vercel.ai/docs/ai-sdk-core/telemetry).

This integration is not enabled by default. You need to manually enable it by passing `Sentry.vercelAIIntegration()` to `Sentry.init`:

```javascript
Sentry.init({
  dsn: "____PUBLIC_DSN____",
  tracesSampleRate: 1.0,
  integrations: [Sentry.vercelAIIntegration()],
});
```

To correctly capture spans, pass the `experimental_telemetry` object with `isEnabled: true` to every `generateText`, `generateObject`, and `streamText` function call. For more details, see the [AI SDK Telemetry Metadata docs](https://sdk.vercel.ai/docs/ai-sdk-core/telemetry#telemetry-metadata).

```javascript
const result = await generateText({
  model: openai("gpt-4o"),
  experimental_telemetry: {
    isEnabled: true,
    recordInputs: true,
    recordOutputs: true,
  },
});
```
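The same telemetry option applies to streaming calls. A sketch for `streamText` (the `prompt` value is illustrative; `openai` comes from `@ai-sdk/openai`):

```javascript
const result = await streamText({
  model: openai("gpt-4o"),
  prompt: "Summarize the release notes.", // illustrative prompt
  experimental_telemetry: {
    isEnabled: true,
    recordInputs: true,
    recordOutputs: true,
  },
});

// Consume the stream; the captured span finishes when streaming completes.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```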

## Identifying functions

To make it easier to correlate captured spans with the calls that produced them, we recommend setting `functionId` in `experimental_telemetry` on every generation call:

```javascript
const result = await generateText({
  model: openai("gpt-4o"),
  experimental_telemetry: {
    isEnabled: true,
    functionId: "my-awesome-function",
  },
});
```
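If you make many generation calls, repeating the telemetry object gets tedious. One option is a small helper that builds it for you. `withTelemetry` below is not part of the AI SDK or Sentry; it is just a convenience sketch you can spread into any call's options:

```javascript
// Hypothetical helper: returns a consistent experimental_telemetry
// object so every generation call tags its spans the same way.
function withTelemetry(functionId, overrides = {}) {
  return {
    experimental_telemetry: {
      isEnabled: true,
      functionId,
      ...overrides, // e.g. { recordInputs: false }
    },
  };
}

// Usage (spread into the call's options):
// const result = await generateText({
//   model: openai("gpt-4o"),
//   prompt: "...",
//   ...withTelemetry("my-awesome-function"),
// });
```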

## Configuration

By default, this integration adds tracing to all `ai` function call sites. To disable span collection for a specific call, set `experimental_telemetry.isEnabled` to `false` in that call's options.

```javascript
const result = await generateText({
  model: openai("gpt-4o"),
  experimental_telemetry: { isEnabled: false },
});
```

Setting `experimental_telemetry.recordInputs` and `experimental_telemetry.recordOutputs` overrides the default behavior for collecting inputs and outputs on that function call.

```javascript
const result = await generateText({
  model: openai("gpt-4o"),
  experimental_telemetry: {
    isEnabled: true,
    recordInputs: true,
    recordOutputs: true,
  },
});
```

## Supported Versions

* `ai`: `>=3.0.0 <=6`

## Troubleshooting
