
Celery: What Is It and How to Use It with Django?

5 min read · Updated 05 Apr 2026

Definition

Celery is an open-source distributed task queue for Python, enabling the execution of asynchronous and scheduled background operations. Widely used with Django, it delegates long-running tasks (email sending, report generation, data processing) to independent worker processes, preserving web application responsiveness.

What is Celery?

Celery is an asynchronous, distributed task queue written in Python. It enables background execution of operations outside the normal HTTP request-response cycle of a web application. When a user triggers a time-consuming action — sending an email, generating a PDF report, processing a 50,000-line CSV file, or calling an external API — Celery handles this operation in a separate process, allowing the web server to respond immediately to the user.

Celery operates on a producer-consumer model. The web application (the producer) places messages in a queue managed by a message broker, typically Redis or RabbitMQ. Worker processes (the consumers) retrieve these messages and execute the corresponding tasks. This decoupling between task submission and execution is fundamental to the scalability and resilience of modern web applications.
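The producer-consumer model described above can be illustrated with a toy sketch using only the Python standard library. This is not Celery itself: in a real deployment the queue is Redis or RabbitMQ and the consumer is a separate worker process, not a thread.

```python
import queue
import threading

# Toy illustration of the producer-consumer model Celery builds on.
task_queue = queue.Queue()
results = []

def worker():
    # The consumer: pulls messages off the queue and executes the work.
    while True:
        message = task_queue.get()
        if message is None:  # sentinel value telling the worker to stop
            break
        results.append(f"processed {message}")
        task_queue.task_done()

consumer = threading.Thread(target=worker)
consumer.start()

# The producer (the web app) enqueues work and returns immediately,
# without waiting for the work to finish.
task_queue.put("report-123")
task_queue.put(None)  # shut the worker down
consumer.join()
print(results)  # ['processed report-123']
```

With Celery, the same decoupling holds, but the producer and consumer live in different processes, possibly on different machines, connected only by the broker.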

At KERN-IT, Celery is a standard component of our Django stack. We use it in the majority of our projects to handle asynchronous tasks: sending transactional emails, generating documents, synchronising data with third-party systems, image processing, and scheduled maintenance tasks. Celery, combined with Redis as the broker and Docker Compose for orchestration, forms a proven and reliable architecture.

Why Celery Matters

Without Celery, any long-running operation would block the web server, degrading user experience and limiting the application's ability to handle concurrent traffic. Celery solves this fundamental problem and brings additional benefits.

  • Application responsiveness: by delegating long operations to workers, the web server (Gunicorn) remains available to respond to HTTP requests. The user does not wait 30 seconds for a report to generate: they receive instant confirmation and will be notified when the report is ready.
  • Horizontal scalability: the number of Celery workers can be increased independently of the number of web processes. If background processing becomes a bottleneck, simply add workers on additional machines.
  • Scheduled tasks: Celery Beat, the built-in scheduler, enables periodic task execution (the Python equivalent of cron jobs): data cleanup, daily report generation, nightly synchronisation with external APIs, weekly newsletter dispatch.
  • Resilience: if a worker crashes during task execution, Celery can automatically retry the task on another worker. Retry mechanisms with exponential backoff gracefully handle temporary errors (API timeouts, temporarily unavailable database).
  • Monitoring: Flower, Celery's monitoring tool, provides a real-time web dashboard for tracking worker status, running tasks, failed tasks, and performance metrics.

How It Works

The Celery architecture comprises three main components: the Django application (the producer), the message broker, and the workers (the consumers). The Django application submits tasks by calling my_task.delay(arguments). This method serialises the task arguments into a message (typically JSON) and places it in the broker's queue (Redis). A Celery worker, an independent process running alongside the web server, retrieves the message, deserialises the arguments, and executes the associated Python function.

Redis is the most commonly used broker with Celery in the Django ecosystem. It serves as the message queue for tasks and can also store execution results (result backend). Redis is fast, lightweight, and easy to deploy, making it an ideal choice for the majority of projects.

Celery tasks are defined as Python functions decorated with @shared_task or @app.task. They accept serialisable arguments (strings, numbers, lists, dictionaries) and can return results. It is important never to pass Django objects (such as model instances) as task arguments: pass the identifier (ID) and retrieve the object from the database within the task itself.

Concrete Example

In a management platform project developed by KERN-IT, Celery handles several types of tasks. When an administrator imports a CSV file of 10,000 contacts, the Django view validates the file and submits a Celery task that processes each row in the background, creating or updating contacts in the database. The administrator sees a real-time progress bar via WebSocket updates.

Celery Beat also runs scheduled tasks: every night, a task synchronises data with the client's CRM via its REST API. Every Monday morning, a task generates a weekly PDF report and sends it by email to managers. Every hour, a task checks deadlines and sends automatic reminders. All these tasks run on a Celery worker in a dedicated Docker container, orchestrated by Docker Compose alongside the Django application, Redis, and PostgreSQL.

Implementation

  1. Install Celery and Redis: add celery[redis] to your Python dependencies. Install Redis locally or use it via Docker Compose. Create the celery.py file in your Django project to configure the Celery application.
  2. Configure Django: add Celery settings to settings.py: CELERY_BROKER_URL (Redis URL), CELERY_RESULT_BACKEND, JSON serialiser, timezone. Import the Celery configuration in the project's __init__.py.
  3. Write tasks: decorate your functions with @shared_task. Keep tasks idempotent (running them several times produces the same result as running them once) and only pass serialisable arguments.
  4. Configure Docker Compose: add a Celery worker service and a Redis service to your docker-compose.yml. The worker uses the same image as the Django application but with the command celery -A myproject worker.
  5. Configure scheduled tasks: if you need periodic tasks, add a Celery Beat service in Docker Compose and define the schedule in CELERY_BEAT_SCHEDULE in Django settings.
  6. Monitor: deploy Flower (celery -A myproject flower) for a web dashboard. In production, configure alerts on failed tasks and the number of pending tasks in the queue.

Associated Technologies and Tools

  • Redis: the most commonly used message broker and result backend with Celery.
  • Django: Python web framework with which Celery integrates natively.
  • Docker Compose: orchestration of Django, Celery worker, Celery Beat, and Redis services.
  • Flower: real-time web monitoring tool for Celery.
  • RabbitMQ: alternative message broker to Redis, more robust for very high volumes.
  • Celery Beat: periodic task scheduler built into Celery.

Conclusion

Celery is an indispensable component of any Django application that needs to handle long-running, asynchronous, or scheduled tasks. By delegating these operations to independent workers, Celery preserves web server responsiveness and enables horizontal scalability. At KERN-IT, we systematically integrate Celery with Redis in our Django projects, orchestrated via Docker Compose for reproducible deployment. Whether for email sending, report generation, data synchronisation, or scheduled maintenance tasks, Celery provides a reliable and proven solution for background processing.

Pro Tip

Always make your Celery tasks idempotent: a task must be executable multiple times without undesirable side effects. Only pass identifiers (IDs) as arguments, never Django objects — the worker will fetch the fresh object from the database. This prevents stale data issues and facilitates automatic retries on errors.
