---
title: "OpenAI Agents"
description: "Learn about using Sentry for OpenAI Agents SDK."
url: https://docs.sentry.io/platforms/python/integrations/openai-agents/
---

# OpenAI Agents | Sentry for Python

##### Beta

Support for the **OpenAI Agents SDK** is currently in beta. Please test locally before using it in production.

This integration connects Sentry with the [OpenAI Agents SDK](https://openai.github.io/openai-agents-python/). The integration has been confirmed to work with OpenAI Agents version 0.0.19.

Once you've installed this SDK, you can use [Sentry AI Agents Insights](https://sentry.io/orgredirect/organizations/:orgslug/insights/ai/agents/), a Sentry dashboard that helps you understand what's going on with your AI agents.

Sentry AI Agents monitoring will automatically collect information about agents, tools, prompts, tokens, and models.

## [Install](https://docs.sentry.io/platforms/python/integrations/openai-agents.md#install)

Install `sentry-sdk` from PyPI:

```bash
pip install "sentry-sdk"
```

## [Configure](https://docs.sentry.io/platforms/python/integrations/openai-agents.md#configure)

If you have the `agents` package in your dependencies, the OpenAI Agents integration will be enabled automatically when you initialize the Sentry SDK.
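To confirm that the `agents` package is importable in your environment (and therefore that the integration can activate), you can use a quick standard-library check. This snippet is illustrative and not part of the Sentry SDK; the helper name is our own:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if the given top-level module can be imported."""
    return importlib.util.find_spec(name) is not None

# The OpenAI Agents SDK installs the top-level "agents" module.
# If it can't be found, Sentry's integration stays disabled.
if module_available("agents"):
    print("openai-agents found; Sentry will enable the integration automatically")
else:
    print("openai-agents not installed; the integration stays disabled")
```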


```python
import sentry_sdk

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    # Add data like request headers and IP for users, if applicable;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    # ___PRODUCT_OPTION_START___ performance
    # Set traces_sample_rate to 1.0 to capture 100%
    # of transactions for tracing.
    traces_sample_rate=1.0,
    # ___PRODUCT_OPTION_END___ performance
    # ___PRODUCT_OPTION_START___ profiling
    # To collect profiles for all profile sessions,
    # set `profile_session_sample_rate` to 1.0.
    profile_session_sample_rate=1.0,
    # Profiles will be automatically collected while
    # there is an active span.
    profile_lifecycle="trace",
    # ___PRODUCT_OPTION_END___ profiling
    # ___PRODUCT_OPTION_START___ logs

    # Enable logs to be sent to Sentry
    enable_logs=True,
    # ___PRODUCT_OPTION_END___ logs
)
```

## [Verify](https://docs.sentry.io/platforms/python/integrations/openai-agents.md#verify)

Verify that the integration works by running an AI agent. The resulting data should show up in your AI Agents Insights dashboard.

```python
import asyncio
import random

import sentry_sdk
import agents
from pydantic import BaseModel  # installed by openai-agents

@agents.function_tool
def random_number(max: int) -> int:
    return random.randint(0, max)

class FinalResult(BaseModel):
    number: int

random_number_agent = agents.Agent(
    name="Random Number Agent",
    instructions="Generate a random number.",
    tools=[random_number],
    output_type=FinalResult,
    model="gpt-4o-mini",
)

async def main() -> None:
    sentry_sdk.init(
        dsn="___PUBLIC_DSN___",
        traces_sample_rate=1.0,
        # Add data like LLM and tool inputs/outputs;
        # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
        send_default_pii=True,
    )

    await agents.Runner.run(
        random_number_agent,
        input="Generate a random number between 0 and 10.",
    )

if __name__ == "__main__":
    asyncio.run(main())
```

It may take a couple of moments for the data to appear in [sentry.io](https://sentry.io).

## [Behavior](https://docs.sentry.io/platforms/python/integrations/openai-agents.md#behavior)

Data on the following will be collected:

* AI agent invocations
* tool executions
* the number of input and output tokens used
* LLM model usage

Sentry considers LLM and tool inputs and outputs to be personally identifiable information (PII) and doesn't include them by default. If you want to include this data, set `send_default_pii=True` in the `sentry_sdk.init()` call. To explicitly exclude prompts and outputs despite `send_default_pii=True`, configure the integration with `include_prompts=False` as shown in the [Options section](https://docs.sentry.io/platforms/python/integrations/openai-agents.md#options) below.

## [Options](https://docs.sentry.io/platforms/python/integrations/openai-agents.md#options)

By adding `OpenAIAgentsIntegration` explicitly to your `sentry_sdk.init()` call, you can set options to change its behavior:

```python
import sentry_sdk
from sentry_sdk.integrations.openai_agents import OpenAIAgentsIntegration

sentry_sdk.init(
    # ...
    # Add data like inputs and responses;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        OpenAIAgentsIntegration(
            include_prompts=False,  # LLM and tool inputs/outputs will not be sent to Sentry, despite send_default_pii=True
        ),
    ],
)
```

You can pass the following keyword arguments to `OpenAIAgentsIntegration()`:

* `include_prompts`:

  Whether LLM and tool inputs and outputs should be sent to Sentry. Sentry considers this data personally identifiable information (PII) and excludes it by default. If you want to include the data, set `send_default_pii=True` in the `sentry_sdk.init()` call. To explicitly exclude prompts and outputs despite `send_default_pii=True`, configure the integration with `include_prompts=False`.

  The default is `True`.

## [Supported Versions](https://docs.sentry.io/platforms/python/integrations/openai-agents.md#supported-versions)

* OpenAI Agents SDK: 0.0.19+
* Python: 3.9+
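
To confirm that your installed versions meet these requirements, you can query package metadata from the standard library. This is a sketch using the package names as published on PyPI; the helper function is our own:

```python
from importlib.metadata import PackageNotFoundError, version
from typing import Optional  # Optional[...] keeps this compatible with Python 3.9

def installed_version(package: str) -> Optional[str]:
    """Return the installed version of a distribution, or None if it isn't installed."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

for package in ("sentry-sdk", "openai-agents"):
    print(package, installed_version(package) or "not installed")
```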
