**1. Install Celery and Redis**:
```bash
pip install "celery[redis]"
```
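To confirm the install and that a Redis server is reachable (this assumes Redis is already running locally on its default port, 6379), you can run:

```bash
celery --version   # prints the installed Celery version
redis-cli ping     # a running Redis server replies with PONG
```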
**2. Configure Celery**:
```python
# your_project/celery.py (alongside settings.py)
import os
from celery import Celery
# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'your_project.settings')
app = Celery('your_project')
# Load task modules from all registered Django app configs.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Set concurrency to 1 for sequential execution
app.conf.worker_concurrency = 1
# Discover tasks in all installed apps (inside tasks.py files).
app.autodiscover_tasks()
```
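For Django to load the Celery app whenever the project starts, the Celery documentation recommends importing it in the project package's `__init__.py` (assuming the layout above, with `celery.py` next to `settings.py`):

```python
# your_project/__init__.py
# Ensures the Celery app is loaded when Django starts, so that
# @shared_task decorators bind to it.
from .celery import app as celery_app

__all__ = ('celery_app',)
```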
**3. Configure Celery in Django Settings**:
```python
# settings.py
# Celery settings
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_TIMEZONE = 'UTC'
# Example periodic task schedule (runs every 10 minutes)
CELERY_BEAT_SCHEDULE = {
    'my_periodic_task': {
        'task': 'myapp.tasks.my_task_function',
        'schedule': 600,  # 10 minutes (in seconds)
    },
}
```
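Besides a plain interval in seconds, the schedule entry also accepts a `crontab` expression from `celery.schedules`. A sketch of the same 10-minute cadence written that way:

```python
# settings.py (alternative: crontab-style schedule)
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'my_periodic_task': {
        'task': 'myapp.tasks.my_task_function',
        # Fires at minutes 0, 10, 20, 30, 40, 50 of every hour
        'schedule': crontab(minute='*/10'),
    },
}
```

The crontab form is preferable when the task should run at fixed clock times rather than a fixed interval after startup.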
**4. Create a Task Function**:
```python
# myapp/tasks.py
from celery import shared_task
from datetime import datetime
@shared_task
def my_task_function():
    # Your periodic task logic here
    now = datetime.now()
    print(f"My periodic task ran at {now}")
```
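Before wiring up Beat, you can exercise the task by hand from `python manage.py shell`. This sketch assumes the broker and a worker from the steps below are running for the `.delay()` call:

```python
from myapp.tasks import my_task_function

# Runs synchronously in the current process (no broker needed):
my_task_function()

# Queues the task on the broker for a worker to execute:
result = my_task_function.delay()
print(result.id)  # the task's id on the broker
```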
**5. Start Celery Beat**:
```bash
celery -A your_project beat
```
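For local development, Beat can also be embedded in the worker process with the `-B` flag, so one command runs both (the Celery docs advise against this in production):

```bash
celery -A your_project worker -B --loglevel=info --concurrency=1
```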
**6. Start Celery Worker**:
In a separate terminal, start the Celery worker to execute the tasks:
```bash
celery -A your_project worker --loglevel=info --concurrency=1
```
To keep both processes running in the background, use `nohup`:
```bash
nohup celery -A your_project beat --loglevel=info &
nohup celery -A your_project worker --loglevel=info --concurrency=1 &

# Check the logs in nohup.out, e.g. with tail:
tail -f nohup.out
```
**7. Stop the Processes**:
```bash
pkill -f 'celery -A your_project worker'
pkill -f 'celery -A your_project beat'
```
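To verify the worker is up before (or after) stopping it, Celery ships inspection commands (substitute your own project name):

```bash
celery -A your_project status          # lists online workers
celery -A your_project inspect active  # shows tasks currently executing
```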