# Self-Hosting Open Chat Studio
This section covers deploying Open Chat Studio in production for third-party hosters.
## Architecture Overview
A production deployment requires three process types and two backing services:
```mermaid
flowchart TD
    LB[Load Balancer / TLS]
    WEB["web<br>gunicorn"]
    CW["celery_worker<br>Background tasks"]
    CB["celery_beat<br>Scheduled tasks"]
    PG[("PostgreSQL<br>+ pgvector")]
    RD[("Redis<br>Broker / Cache")]
    LB --> WEB
    WEB --> RD
    WEB --> PG
    RD --> CW
    RD --> CB
    CW --> PG
    CB --> PG
```
## Infrastructure Requirements
| Component | Minimum | Notes |
|---|---|---|
| PostgreSQL | 14+ | Must have the pgvector extension (≥ 0.7.0). Use the `pgvector/pgvector:pg16` Docker image or enable the extension on a managed database. |
| Redis | 6+ | Used as Celery broker, result backend, and Django cache. |
| Object Storage | Optional | AWS S3 (or compatible) for user media uploads and WhatsApp audio files. Without S3, files are stored on the local filesystem — not suitable for multi-instance deployments. |
| Email | Required | Mailgun or Amazon SES via `django-anymail`. |
| HTTPS / TLS | Required | Terminate TLS at a reverse proxy or load balancer. The app redirects HTTP → HTTPS in production. |
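On a managed database where you cannot choose the Docker image, the extension can usually be enabled with a single statement, provided the pgvector binaries are installed on the server and your role has the needed privileges. A sketch:

```sql
-- Enable pgvector on the application database (no-op if already enabled).
CREATE EXTENSION IF NOT EXISTS vector;

-- Verify the installed version meets the >= 0.7.0 requirement.
SELECT extversion FROM pg_extension WHERE extname = 'vector';
```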
## Process Types
| Process | Command | Notes |
|---|---|---|
| `web` | `gunicorn --bind 0.0.0.0:$PORT --workers 2 --threads 8 --timeout 0 config.wsgi:application` | Scale horizontally. |
| `celery_worker` | `celery -A config worker -l INFO --pool gevent --concurrency 100` | Handles all async tasks (LLM calls, messaging, evaluations). |
| `celery_beat` | `celery -A config beat -l INFO` | Scheduled/periodic tasks. Run exactly one instance. |
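For a single-server setup, the process types and backing services can be wired together in one Compose file. The sketch below is illustrative only, not the project's official compose file; the image name, environment variables, and port are assumptions to adapt to your deployment:

```yaml
# Illustrative sketch; adjust image, env vars, and volumes for your setup.
services:
  db:
    image: pgvector/pgvector:pg16
    environment:
      POSTGRES_PASSWORD: change-me
    volumes: ["pgdata:/var/lib/postgresql/data"]
  redis:
    image: redis:7
  web:
    image: open-chat-studio:latest
    command: gunicorn --bind 0.0.0.0:8000 --workers 2 --threads 8 --timeout 0 config.wsgi:application
    depends_on: [db, redis]
  celery_worker:
    image: open-chat-studio:latest
    command: celery -A config worker -l INFO --pool gevent --concurrency 100
    depends_on: [db, redis]
  celery_beat:
    image: open-chat-studio:latest
    command: celery -A config beat -l INFO  # exactly one instance
    depends_on: [db, redis]
volumes:
  pgdata:
```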
## Docker Image
The production Dockerfile is a multi-stage build:

- **Python stage**: installs dependencies with `uv` into `/code/.venv`
- **Node stage**: compiles JS and CSS assets
- **Runtime stage**: `python:3.13-slim-bullseye` with the pre-built assets baked in

The image runs as a non-root `django` user.

```shell
docker build -t open-chat-studio:latest .
```
## Health Check
The app exposes a `/status` endpoint. Secure it by setting `HEALTH_CHECK_TOKENS` to a comma-separated list of secret tokens. Requests must include a token as a query parameter (`?token=...`).
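An external monitoring probe for this endpoint can be a few lines of Python. This is a sketch, not part of the app: the base URL and token are placeholders, and `status_url`/`is_healthy` are helpers defined here for illustration:

```python
import urllib.error
import urllib.parse
import urllib.request


def status_url(base_url: str, token: str) -> str:
    """Build the /status URL with the secret token as a query parameter."""
    return f"{base_url.rstrip('/')}/status?" + urllib.parse.urlencode({"token": token})


def is_healthy(base_url: str, token: str, timeout: float = 5.0) -> bool:
    """Return True if the app answers /status with HTTP 200."""
    try:
        with urllib.request.urlopen(status_url(base_url, token), timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Point this at your deployment from a cron job or uptime monitor; a non-200 response or a connection error both report unhealthy.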
## Deployment Options
- **Docker Compose**: simplest path for a single-server or small-scale deployment
- **Kamal**: deploy Docker containers to any server via SSH with zero-downtime deploys
- **Heroku**: Platform-as-a-Service with minimal infrastructure management
- **AWS Fargate**: container-native deployment on AWS, with full automation via `ocs-deploy`
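For the Kamal option, deployment is driven by a `config/deploy.yml` file. The fragment below is a minimal sketch under assumptions (the service name, image path, host IP, and registry credentials are all placeholders), not a verified configuration for this project:

```yaml
# Illustrative Kamal config; replace hosts, registry, and secrets with your own.
service: open-chat-studio
image: your-registry/open-chat-studio
servers:
  web:
    - 203.0.113.10
registry:
  username: your-registry-user
  password:
    - KAMAL_REGISTRY_PASSWORD  # read from the environment at deploy time
```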
## First-time Setup
After deploying the database and running migrations, create a superuser:

```shell
python manage.py createsuperuser
```
You will then need to create a Team in the Django admin before the app is usable.