---
title: "OpenAI"
description: "Manually instrument the OpenAI SDK in React Native to capture spans and LLM interactions."
url: https://docs.sentry.io/platforms/react-native/integrations/openai/
---

# OpenAI | Sentry for React Native

*Import name: `Sentry.instrumentOpenAiClient`*

The `instrumentOpenAiClient` helper adds instrumentation for the [`openai`](https://www.npmjs.com/package/openai) SDK by wrapping an OpenAI client instance and recording LLM interactions with configurable input/output capture. The OpenTelemetry-based automatic integration available for Node.js does not work in React Native, so wrapping the client manually is the only supported path.

## [Usage](https://docs.sentry.io/platforms/react-native/integrations/openai.md#usage)

Spans produced by this integration are only captured when [tracing is enabled](https://docs.sentry.io/platforms/react-native/tracing.md), so make sure tracing is configured before wrapping the client.
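A minimal sketch of wrapping the client is shown below. The DSN, API key, and model name are placeholders, and the exact `OpenAI` constructor options depend on your `openai` version; the key point is that you call `Sentry.instrumentOpenAiClient` once and use the returned instance for all requests.

```typescript
import * as Sentry from "@sentry/react-native";
import OpenAI from "openai";

Sentry.init({
  dsn: "___DSN___",
  // Tracing must be enabled for this integration's spans to be captured.
  tracesSampleRate: 1.0,
});

// Wrap the client once, then use the returned instance everywhere.
const openai = Sentry.instrumentOpenAiClient(
  new OpenAI({ apiKey: "___API_KEY___" })
);

async function ask(prompt: string): Promise<string | null> {
  // This call is recorded as a span with gen_ai.* attributes.
  const response = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: prompt }],
  });
  return response.choices[0]?.message?.content ?? null;
}
```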

## [Configuration](https://docs.sentry.io/platforms/react-native/integrations/openai.md#configuration)

### [Options](https://docs.sentry.io/platforms/react-native/integrations/openai.md#options)

The following options control what data is captured from OpenAI SDK calls:

#### [`recordInputs`](https://docs.sentry.io/platforms/react-native/integrations/openai.md#recordinputs)

*Type: `boolean` (optional)*

Records inputs to OpenAI SDK calls (such as prompts and messages).

Defaults to `true` if `sendDefaultPii` is `true`, and `false` otherwise.

#### [`recordOutputs`](https://docs.sentry.io/platforms/react-native/integrations/openai.md#recordoutputs)

*Type: `boolean` (optional)*

Records outputs from OpenAI SDK calls (such as generated text and responses).

Defaults to `true` if `sendDefaultPii` is `true`, and `false` otherwise.
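Both options can also be set explicitly when wrapping the client, overriding the `sendDefaultPii`-based defaults. A sketch, assuming the options object is the second argument to `instrumentOpenAiClient`:

```typescript
import * as Sentry from "@sentry/react-native";
import OpenAI from "openai";

// Capture prompts/messages, but keep generated outputs out of Sentry.
const openai = Sentry.instrumentOpenAiClient(new OpenAI(), {
  recordInputs: true,
  recordOutputs: false,
});
```

This split is useful when prompts are safe to log but model outputs may contain user-derived PII.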

## [Supported Operations](https://docs.sentry.io/platforms/react-native/integrations/openai.md#supported-operations)

By default, tracing support is added to the following OpenAI SDK calls:

* `chat.completions.create()` - Chat Completions API requests
* `responses.create()` - Responses API requests

Streaming and non-streaming requests are automatically detected and handled appropriately.

When using OpenAI's streaming API, you must also pass `stream_options: { include_usage: true }` to receive token usage data. Without this option, OpenAI does not include `prompt_tokens` or `completion_tokens` in streamed responses, and Sentry will be unable to capture `gen_ai.usage.input_tokens` / `gen_ai.usage.output_tokens` on the resulting span. This is an OpenAI API behavior, not a Sentry limitation. See [OpenAI docs on stream options](https://platform.openai.com/docs/api-reference/chat/create#chat-create-stream_options).
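For example, a streaming request that keeps token usage visible to Sentry might look like the following sketch (`openai` is assumed to be a client already wrapped with `instrumentOpenAiClient`; the model name is a placeholder):

```typescript
const stream = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Summarize this release." }],
  stream: true,
  // Required for token counts: without this, streamed responses omit
  // usage data and Sentry cannot record gen_ai.usage.* on the span.
  stream_options: { include_usage: true },
});

for await (const chunk of stream) {
  // Content deltas arrive per chunk; the final chunk carries `usage`
  // when include_usage is set.
  const delta = chunk.choices[0]?.delta?.content;
  if (delta) {
    console.log(delta);
  }
}
```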

## [Supported Versions](https://docs.sentry.io/platforms/react-native/integrations/openai.md#supported-versions)

* `openai`: `>=4.0.0 <7`
