Development Guide¶
This guide covers setting up a development environment and contributing to Squirrel Backend.
Project Structure¶
squirrel-backend/
├── app/
│   ├── main.py            # API entry point
│   ├── monitor_main.py    # PV Monitor entry point
│   ├── worker.py          # Arq worker configuration
│   ├── config.py          # Configuration settings
│   ├── api/v1/            # API endpoints
│   ├── models/            # SQLAlchemy models
│   ├── schemas/           # Pydantic schemas (DTOs)
│   ├── services/          # Business logic layer
│   ├── repositories/      # Data access layer
│   ├── tasks/             # Arq task definitions
│   └── db/                # Database session management
├── alembic/               # Database migrations
├── tests/                 # Test suite
├── docker/                # Docker configuration
└── scripts/               # Utility scripts
Setting Up Development Environment¶
1. Start Infrastructure¶
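The repository keeps its Docker configuration in docker/, so the backing services (database, Redis for Arq) are typically brought up with Docker Compose. A minimal sketch, assuming a compose file in docker/ defines those services (exact file and service names may differ):

```shell
# Start infrastructure containers in the background
cd docker
docker compose up -d

# Check that the containers are healthy
docker compose ps
```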
2. Set Up Python Environment¶
cd ..
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
pip install -e ".[dev]"
Or use the setup script:
3. Configure Environment¶
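The available settings are defined in app/config.py. A common pattern is to copy an example environment file and edit it; this is a sketch assuming a .env.example exists at the repo root (adjust names to what the repository actually ships):

```shell
# Copy the example environment file and edit it for your local setup
cp .env.example .env
# Point the database/Redis URLs at the containers started above
```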
4. Run Migrations¶
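Migrations live in alembic/, so the standard Alembic command applies (run from the repo root, where alembic.ini is conventionally located):

```shell
# Apply all pending migrations up to the latest revision
alembic upgrade head
```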
5. Load Test Data¶
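Test data comes from the seed_pvs.py utility script documented below:

```shell
# Create 1000 test PVs with tags
python -m scripts.seed_pvs --count 1000
```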
6. Start Services¶
Run each in a separate terminal:
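A sketch of the three processes, assuming uvicorn serves the app defined in app/main.py, the monitor runs as a module, and the Arq settings class in app/worker.py follows the conventional WorkerSettings name (verify these against the actual files):

```shell
# Terminal 1: API server with hot reload
uvicorn app.main:app --reload

# Terminal 2: PV Monitor
python -m app.monitor_main

# Terminal 3: Arq worker (class name assumed; check app/worker.py)
arq app.worker.WorkerSettings
```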
Development Workflow¶
Making Changes¶
- Create a feature branch
- Make your changes
- Run tests: pytest
- Run linting: ruff check .
- Run type checking: mypy app/
- Submit a pull request
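The steps above, end to end (the branch name is a placeholder):

```shell
git checkout -b feature/my-change   # hypothetical branch name
# ... make your changes ...
pytest          # run tests
ruff check .    # run linting
mypy app/       # run type checking
```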
Hot Reload¶
The API server supports hot reload with --reload:
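Assuming the server is started with uvicorn (the app path follows from app/main.py):

```shell
uvicorn app.main:app --reload
```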
Changes to Python files will automatically restart the server.
Debugging¶
Enable debug logging:
Performance Benchmarking¶
# Start the backend first, then run:
python -m scripts.benchmark
# With more iterations
python -m scripts.benchmark --iterations 10
# Skip restore benchmark (no EPICS writes)
python -m scripts.benchmark --skip-restore
Utility Scripts¶
seed_pvs.py¶
Generate test PV data:
# Create 1000 test PVs with tags
python -m scripts.seed_pvs --count 1000
# Create 50K PVs for performance testing
python -m scripts.seed_pvs --count 50000 --batch-size 5000
# Clear existing data first
python -m scripts.seed_pvs --count 1000 --clear
upload_csv.py¶
Import PVs from CSV:
# Dry run
python -m scripts.upload_csv your_pvs.csv --dry-run
# Full upload
python -m scripts.upload_csv your_pvs.csv
benchmark.py¶
Performance testing:
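This is the same module shown under Performance Benchmarking above; a typical invocation:

```shell
# Requires the backend to be running
python -m scripts.benchmark
```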
IDE Setup¶
VS Code¶
Recommended extensions:
- Python
- Pylance
- Ruff
Settings (.vscode/settings.json):
{
"python.defaultInterpreterPath": "./venv/bin/python",
"python.analysis.typeCheckingMode": "basic",
"[python]": {
"editor.formatOnSave": true,
"editor.codeActionsOnSave": {
"source.fixAll": "explicit",
"source.organizeImports": "explicit"
},
"editor.defaultFormatter": "charliermarsh.ruff"
}
}
PyCharm¶
- Set the project interpreter to ./venv/bin/python
- Enable the Ruff plugin
- Configure pytest as the default test runner
Pre-commit Hooks¶
The project uses pre-commit hooks for code quality:
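The standard pre-commit install flow, assuming a .pre-commit-config.yaml at the repo root:

```shell
pip install pre-commit
pre-commit install           # register the git hook
pre-commit run --all-files   # run every hook once against the whole repo
```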
Next Steps¶
- Testing - Running and writing tests
- Database Migrations - Managing schema changes
- Code Quality - Linting and formatting