Celery

The Celery integration adds support for the Celery Task Queue System.

Install sentry-sdk from PyPI with the celery extra:

pip install --upgrade 'sentry-sdk[celery]'

If you have the celery package in your dependencies, the Celery integration will be enabled automatically when you initialize the Sentry SDK.

Make sure that the call to sentry_sdk.init() runs on worker startup, not only in the module where your tasks are defined. Otherwise, initialization happens too late and events might end up not being reported.

import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    enable_tracing=True,
)

If you're using Celery standalone, there are two ways to set this up:

  • Initializing the SDK in the configuration file loaded with Celery's --config parameter (see the sketch after this list)

  • Initializing the SDK by hooking it to either the celeryd_init or worker_init signals

    import sentry_sdk
    from celery import Celery, signals
    
    app = Celery("myapp")
    
    # Alternatively, connect to the worker_init signal instead:
    # @signals.worker_init.connect
    @signals.celeryd_init.connect
    def init_sentry(**_kwargs):
        sentry_sdk.init(...)  # same as above
    

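For the --config approach, here's a minimal sketch of such a configuration module (the module name celeryconfig, the broker URL, and the exact worker command are placeholders; the worker would be started with something like celery --app=myapp --config=celeryconfig worker):

import sentry_sdk

sentry_sdk.init(...)  # same as above

# Regular Celery settings can live in the same module; this broker URL is a placeholder.
broker_url = "redis://localhost:6379/0"
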
If you're using Celery with Django in a conventional setup, have already initialized the SDK in your settings.py file, and have Celery using the same settings with config_from_object, you don't need to initialize the SDK separately for Celery.
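
For reference, a minimal sketch of that conventional setup, assuming a project named myproject (the module layout and names are placeholders):

# myproject/settings.py
import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    enable_tracing=True,
)

# myproject/celery.py
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
# Celery reuses the Django settings, so the SDK initialized in settings.py
# is also active in the worker.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()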

To verify that the SDK is initialized on worker startup, pass debug=True to sentry_sdk.init() to get extra output when the SDK initializes. If that output appears during worker startup, and not only after a task has started, everything is working properly.
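
For example, with the same placeholder DSN as above:

import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    enable_tracing=True,
    debug=True,  # prints SDK initialization and event output to the console
)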

To set options on CeleryIntegration to change its behavior, add it explicitly to your sentry_sdk.init():

import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    enable_tracing=True,
    # ...
    integrations=[
        CeleryIntegration(
            monitor_beat_tasks=True,
            exclude_beat_tasks=[
                "unimportant-task",
                "payment-check-.*"
            ],
        ),
    ],
)

You can pass the following keyword arguments to CeleryIntegration():

  • propagate_traces:

    Propagate Sentry tracing information to the Celery task. This makes it possible to link Celery task errors to the function that triggered the task.

    If this is set to False:

      • errors in Celery tasks won't be matched to the triggering function.
      • your Celery tasks will start a new trace and won't be connected to the trace in the calling function.

    The default is True.

  • monitor_beat_tasks:

    Turn auto-instrumentation on or off for Celery Beat tasks using Sentry Crons.

    See Celery Beat Auto Discovery to learn more.

    The default is False.

  • exclude_beat_tasks:

    A list of Celery Beat tasks that should be excluded from auto-instrumentation using Sentry Crons. Only applied if monitor_beat_tasks is set to True.

    The list can contain strings with the names of tasks in the Celery Beat schedule to be excluded. It can also include regular expressions to match multiple tasks. For example, if you include "payment-check-.*" every task starting with payment-check- will be excluded from auto-instrumentation.

    See Celery Beat Auto Discovery to learn more.

    The default is None.
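
To make the matching concrete, here's a hypothetical Celery Beat schedule; with the exclude_beat_tasks value from the example above, the first two entries would be excluded and the third would still be monitored (all task paths and intervals are placeholders):

app.conf.beat_schedule = {
    "unimportant-task": {          # excluded: exact name match
        "task": "myapp.tasks.cleanup",
        "schedule": 300.0,
    },
    "payment-check-gold": {        # excluded: matches "payment-check-.*"
        "task": "myapp.tasks.check_payments",
        "schedule": 60.0,
    },
    "send-reports": {              # still monitored by Sentry Crons
        "task": "myapp.tasks.send_reports",
        "schedule": 3600.0,
    },
}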

Distributed tracing extends the trace from the code that's running your Celery task so that it includes the code that initiated the task.

You can disable this globally with the propagate_traces parameter, documented above. If you set propagate_traces to False, all Celery tasks will start their own trace.
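
For example, disabling propagation globally might look like this (same placeholder DSN as above):

import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    enable_tracing=True,
    integrations=[
        CeleryIntegration(propagate_traces=False),
    ],
)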

If you want to have more fine-grained control over trace distribution, you can override the propagate_traces option by passing the sentry-propagate-traces header when starting the Celery task:

import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration

# Enable global distributed traces (this is the default, just to be explicit.)
sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    enable_tracing=True,
    integrations=[
        CeleryIntegration(
            propagate_traces=True
        ),
    ],
)

# This will propagate the trace:
my_task_a.delay("some parameter")

# This will propagate the trace:
my_task_b.apply_async(
    args=("some_parameter", )
)

# This will NOT propagate the trace. (The task will start its own trace):
my_task_b.apply_async(
    args=("some_parameter", ),
    headers={"sentry-propagate-traces": False},
)

# Note: overriding the tracing behavior using `task_x.delay()` is not possible.
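
To see what propagation buys you on the calling side, here's a hypothetical caller (the function, transaction name, and my_task_a are placeholders); because propagation is enabled, the task's spans and errors end up linked to this transaction:

import sentry_sdk

def process_order(order_id):
    with sentry_sdk.start_transaction(op="function", name="process_order"):
        # The trace headers are attached to the Celery message automatically,
        # so the task run is connected to this transaction in Sentry.
        my_task_a.delay(order_id)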

Supported versions:

  • Celery: 3.0+
  • Python: 2.7+ (Celery 3+), 3.6+ (Celery 5.0+), 3.7+ (Celery 5.1+)