A production-ready Django Ninja API boilerplate with Celery, Channels, and Docker support.
- Django 5 with Django Ninja for fast API development
- API Key Authentication using Django 5's GeneratedField
- Real-time WebSockets with Django Channels
- Background Tasks with Celery (Redis broker)
- Containerized with Docker and Docker Compose
- Admin Interface with Django Jazzmin
- Database Logging with django-db-logger
- Task Monitoring with Celery Flower
- Hot Reload for development
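The API-key feature above is built on Django 5's `GeneratedField`. As a rough standalone sketch (not the project's actual model), the key derivation could work like hashing a rotating UUID, so regenerating the UUID rotates the key:

```python
import hashlib
import uuid

class ApiKeyMixin:
    """Standalone sketch of API-key rotation (hypothetical, not the project's model).

    Mimics a Django GeneratedField that derives the key as MD5(uuid):
    rotating the stored UUID rotates the derived key.
    """

    def __init__(self):
        self.key_uuid = uuid.uuid4()
        self.is_api_key_active = True

    @property
    def api_key(self) -> str:
        # In the real model a GeneratedField could compute this in the database.
        return hashlib.md5(self.key_uuid.bytes).hexdigest()

    def regenerate_api_key(self):
        # New UUID -> new derived key on the next access
        self.key_uuid = uuid.uuid4()

user = ApiKeyMixin()
old_key = user.api_key
user.regenerate_api_key()
print(user.api_key != old_key)  # True: a fresh UUID yields a fresh key
```

The class name and MD5-of-UUID scheme here are assumptions for illustration; the boilerplate's model may derive its keys differently.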
```
api-boilerplate/
├── core/                 # Django project
│   ├── apps/             # Custom apps
│   │   ├── auth/         # Authentication with API keys
│   │   └── example/      # Example app with all features
│   ├── core/             # Django settings
│   ├── static/           # Static files
│   └── manage.py
├── docker/               # Docker configuration
└── pyproject.toml        # Poetry dependencies
```
Create a .env file in the project root with the following content:
```env
# Django Core
DJANGO_SETTINGS_MODULE=core.settings.development
DJANGO_SECRET_KEY=development-secret-key-change-in-production
DEBUG=True
ALLOWED_HOSTS=localhost,127.0.0.1,0.0.0.0

# Database
DATABASE_URL=postgresql://django:secure_postgres_password_123@postgres:5432/postgres
POSTGRES_PASSWORD=secure_postgres_password_123

# Redis & Celery
REDIS_URL=redis://:redis_secure_password_456@redis:6379/0
REDIS_PASSWORD=redis_secure_password_456
CELERY_BROKER_URL=redis://:redis_secure_password_456@redis:6379/0
CELERY_RESULT_BACKEND=redis://:redis_secure_password_456@redis:6379/0

# Celery Workers Configuration
# (comments live on their own lines: inline comments break some .env parsers)
# Gevent workers for I/O tasks
CELERY_IO_WORKERS=1000
# Process workers for CPU tasks
CELERY_CPU_WORKERS=4

# Logging
DJANGO_DB_LOGGER_ADMIN_LIST_PER_PAGE=50
DJANGO_DB_LOGGER_ENABLE_FORMATTER=True

# Flower dashboard credentials
ADMIN_USERNAME=admin
ADMIN_PASSWORD=change_me_in_production

# Security
SECURE_SSL_REDIRECT=False
SECURE_BROWSER_XSS_FILTER=True
SECURE_CONTENT_TYPE_NOSNIFF=True
X_FRAME_OPTIONS=DENY
```

Build and start the stack:

```bash
# Build and start all services
docker-compose -f docker/docker-compose.yml up --build

# Or run in the background
docker-compose -f docker/docker-compose.yml up -d --build
```

Once the stack is up, the services are available at:

- API Documentation: http://localhost:8000/api/docs
- Admin Panel: http://localhost:8000/admin/
- Flower (Task Monitor): http://localhost:5555 (requires authentication)
- WebSocket: ws://localhost:8000/ws/test/
Check the Docker logs to get the admin user's API key:
```bash
docker-compose -f docker/docker-compose.yml logs web | grep "API key"
```

All API endpoints require an API key in the `X-API-Key` header:
```bash
curl -H "X-API-Key: your_api_key_here" http://localhost:8000/api/example/test
```

Available endpoints:

```
# Get user profile
GET /api/auth/me

# Test endpoint
GET /api/example/test

# Trigger streaming task
POST /api/example/trigger-task

# Get user info
GET /api/example/user-info

# Health check (no auth required)
GET /api/health
```

Connect to the WebSocket and authenticate:
```javascript
const ws = new WebSocket('ws://localhost:8000/ws/test/');

ws.onopen = function() {
  // Authenticate with API key
  ws.send(JSON.stringify({
    type: 'auth',
    api_key: 'your_api_key_here'
  }));
};

ws.onmessage = function(event) {
  const data = JSON.parse(event.data);
  console.log('Received:', data);
};

// Send ping
ws.send(JSON.stringify({
  type: 'ping',
  timestamp: Date.now()
}));
```
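On the server side, the consumer presumably validates the auth message before accepting pings. A minimal pure-Python sketch of that message flow (the message shapes and reply fields are assumptions, not the actual Channels consumer):

```python
import json

VALID_KEYS = {"your_api_key_here"}  # stand-in for a database lookup

def handle_message(raw: str, authenticated: bool) -> tuple[dict, bool]:
    """Sketch of the consumer's dispatch: returns (reply, new_auth_state)."""
    msg = json.loads(raw)
    if msg.get("type") == "auth":
        ok = msg.get("api_key") in VALID_KEYS
        return {"type": "auth", "status": "ok" if ok else "error"}, ok
    if not authenticated:
        # Reject everything else until the client has authenticated
        return {"type": "error", "detail": "authenticate first"}, False
    if msg.get("type") == "ping":
        return {"type": "pong", "timestamp": msg.get("timestamp")}, True
    return {"type": "error", "detail": "unknown message type"}, True

# Simulate the client flow from the JavaScript example
reply, authed = handle_message(
    json.dumps({"type": "auth", "api_key": "your_api_key_here"}), False)
reply2, _ = handle_message(json.dumps({"type": "ping", "timestamp": 0}), authed)
print(reply, reply2)
```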
To run the stack locally without Docker:

- Install dependencies:

  ```bash
  poetry install
  ```

- Set up environment variables in your shell or in a `.env` file.

- Run the services:

  ```bash
  # Terminal 1: Django server
  cd core
  poetry run python manage.py runserver

  # Terminal 2: Celery worker
  cd core
  poetry run celery -A core worker -Q io_queue,cpu_queue --loglevel=info

  # Terminal 3: Celery beat
  cd core
  poetry run celery -A core beat --scheduler django_celery_beat.schedulers:DatabaseScheduler

  # Terminal 4: Flower
  cd core
  poetry run celery -A core flower
  ```
```bash
# Create migrations
docker-compose -f docker/docker-compose.yml exec web python manage.py makemigrations

# Apply migrations
docker-compose -f docker/docker-compose.yml exec web python manage.py migrate
```

Create a superuser:

```bash
docker-compose -f docker/docker-compose.yml exec web python manage.py createsuperuser
```

Key modules:

- `apps/auth/`: User authentication with API key generation
- `apps/example/`: Example app demonstrating all features
- `core/settings/`: Environment-based settings (dev/prod)
- `core/celery.py`: Celery configuration with queue routing
- `core/asgi.py`: ASGI application for HTTP + WebSocket
- `core/urls.py`: Main URL configuration with API routes

Services:

- `web`: Django + Channels ASGI server
- `celery-worker`: Background task processing
- `celery-beat`: Periodic task scheduling
- `celery-flower`: Task monitoring dashboard
- `postgres`: PostgreSQL database
- `redis`: Redis (broker + channel layer)
Regenerate a user's API key:

```python
# In Django shell
from django.contrib.auth import get_user_model

User = get_user_model()
user = User.objects.get(username='admin')
user.regenerate_api_key()
print(f"New API key: {user.api_key}")
```

Deactivate an API key:

```python
user.is_api_key_active = False
user.save()
```

Task queues:

- `io_queue`: I/O-bound tasks (API calls, file operations)
- `cpu_queue`: CPU-intensive tasks (data processing)
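With Celery, the mapping of tasks onto these queues typically lives in the `task_routes` setting, which also accepts a router callable. A hedged sketch of one such rule (the naming convention and module paths are assumptions, not the project's actual `core/celery.py`):

```python
# Hypothetical routing rule: tasks whose names end in "_io" go to io_queue,
# everything else to cpu_queue. Celery would consume this via
#     app.conf.task_routes = (route_task,)
def route_task(name, args=None, kwargs=None, options=None, task=None, **kw):
    queue = "io_queue" if name.endswith("_io") else "cpu_queue"
    return {"queue": queue}

print(route_task("apps.example.tasks.fetch_url_io"))   # {'queue': 'io_queue'}
print(route_task("apps.example.tasks.crunch_numbers")) # {'queue': 'cpu_queue'}
```

Explicit per-task dicts (`{"apps.example.tasks.fetch_url_io": {"queue": "io_queue"}}`) work equally well when the task list is small.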
Trigger a task from code:

```python
from apps.example.tasks import streaming_task

# Trigger task
task = streaming_task.delay()
print(f"Task ID: {task.id}")
```

Access Flower at http://localhost:5555 to monitor:
- Active tasks
- Worker status
- Queue lengths
- Task history
Flower requires authentication using the credentials defined in your .env file (ADMIN_USERNAME and ADMIN_PASSWORD).
View application logs in the Django admin at `/admin/django_db_logger/statuslog/`.
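In application code, writing to that log table goes through a standard Python logger; django-db-logger's documented pattern is a logger named `'db'`. Outside Django this behaves like any stdlib logger, so the snippet below is runnable on its own:

```python
import logging

# django-db-logger routes records from the 'db' logger into the database table
# shown in the admin; the messages here are illustrative.
db_logger = logging.getLogger('db')
db_logger.setLevel(logging.INFO)

db_logger.info('Task finished')
try:
    1 / 0
except ZeroDivisionError:
    db_logger.exception('Division failed')  # stores the traceback too
```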
- Update environment variables for production
- Use production settings:

  ```env
  DJANGO_SETTINGS_MODULE=core.settings.production
  ```

- Enable HTTPS and update security settings
- Scale services as needed:

  ```bash
  docker-compose -f docker/docker-compose.yml up -d --scale celery-worker=4
  ```
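A production settings module along these lines would flip the security flags that the development `.env` example leaves relaxed. This is a hedged sketch with assumed module names (`core/settings/base.py`), not the boilerplate's actual `core/settings/production.py`:

```python
# core/settings/production.py (illustrative sketch, assumes a shared base module)
from .base import *  # noqa: F401,F403

DEBUG = False

# Tighten the security settings that development disables
SECURE_SSL_REDIRECT = True
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
SECURE_HSTS_SECONDS = 31536000  # one year
```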
- Port conflicts: Ensure ports 8000, 5432, 6379, 5555 are available
- Permission errors: Check Docker user permissions
- Database connection: Verify PostgreSQL is running and accessible
- Redis connection: Check Redis authentication and connectivity
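For the port-conflict case, a quick stdlib check tells you whether something is already listening on the ports the stack needs (the helper name is mine, not part of the project):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

for port in (8000, 5432, 6379, 5555):
    print(port, "in use" if port_in_use(port) else "free")
```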
```bash
# View all logs
docker-compose -f docker/docker-compose.yml logs

# View specific service logs
docker-compose -f docker/docker-compose.yml logs web
docker-compose -f docker/docker-compose.yml logs celery-worker
```

This project is open source and available under the MIT License.