Testing Guide
Comprehensive testing strategy for Brief Bench FastAPI, covering unit tests, integration tests, and end-to-end tests.
Test Pyramid
        /\
       /  \         E2E Tests (Slow, Full Stack)
      /____\
     /      \
    / Integ. \      Integration Tests (Medium, DB API)
   /__________\
  /            \
 /     Unit     \   Unit Tests (Fast, Isolated)
/________________\
Three Test Levels
1. Unit Tests (119 tests, 99% coverage)
   - Fast, isolated tests
   - Mock all external dependencies
   - Test business logic in isolation
   - Run during development
2. Integration Tests (DB API integration)
   - Test integration with the DB API
   - Require the DB API service running
   - Validate data flow and API contracts
   - Run before commits
3. End-to-End Tests (Full stack)
   - Test complete user workflows
   - Require all services (FastAPI, DB API, RAG backends)
   - Validate entire system integration
   - Run before deployment
Quick Start
Run All Tests (Unit + Integration)
# Windows
.\run_all_tests.bat
# Linux/Mac
./run_all_tests.sh
Run Test Category
# Unit tests only (fast)
.\run_unit_tests.bat
# Integration tests (requires DB API)
.\run_integration_tests.bat
# E2E tests (requires all services)
.\run_e2e_tests.bat
Run Specific Tests
# Activate virtual environment first
.venv\Scripts\activate # Windows
source .venv/bin/activate # Linux/Mac
# Run by marker
pytest -m unit # Unit tests only
pytest -m integration # Integration tests only
pytest -m e2e # E2E tests only
# Run by file
pytest tests/unit/test_auth_service.py
pytest tests/integration/test_auth_integration.py
pytest tests/e2e/test_full_flow_e2e.py
# Run specific test function
pytest tests/unit/test_auth_service.py::TestAuthService::test_generate_token
Unit Tests
Location: tests/unit/
Coverage: 99% of application code
Speed: Very fast (< 1 second)
Dependencies: None (all mocked)
What They Test
- Service layer business logic
- Model validation
- Utility functions
- Error handling
- Edge cases
Key Features
- All external APIs are mocked (httpx, JWT, etc.); see the sketch after this list
- No real network calls
- No database required
- Deterministic and repeatable
- Run in parallel
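To make the mocking concrete, here is a minimal sketch of what such a unit test might look like. AuthService, its constructor, its import path, and the DB client methods are hypothetical names for illustration, and pytest-asyncio is assumed for async test support.

# Sketch of a mocked unit test. AuthService and the db_client methods are
# hypothetical names; pytest-asyncio is assumed for the async marker.
from unittest.mock import AsyncMock

import pytest

from app.services.auth_service import AuthService  # hypothetical import path


@pytest.mark.unit
@pytest.mark.asyncio
async def test_login_returns_jwt_for_valid_user():
    # Replace the real DB API client with a mock: no network call happens.
    db_client = AsyncMock()
    db_client.get_user.return_value = {"id": 1, "login": "99999999"}

    service = AuthService(db_client=db_client)
    token = await service.login("99999999")

    assert token  # a signed JWT string is expected
    db_client.get_user.assert_awaited_once_with("99999999")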
Running Unit Tests
# Run with coverage report
.\run_unit_tests.bat
# Or manually
pytest tests/unit/ -v --cov=app --cov-report=html
# View coverage report
start htmlcov/index.html # Windows
open htmlcov/index.html # Mac
Coverage Report
After running unit tests, the coverage report is available in three formats:
- Terminal: printed to the console
- HTML: htmlcov/index.html
- XML: coverage.xml (for CI/CD)
Test Files
tests/unit/
├── conftest.py # Unit test fixtures
├── test_auth_service.py # Authentication logic
├── test_settings_service.py # Settings management
├── test_analysis_service.py # Analysis sessions
├── test_query_service.py # Query processing
├── test_rag_service.py # RAG backend communication
├── test_db_api_client.py # DB API client
├── test_jwt_utils.py # JWT utilities
├── test_models.py # Pydantic models
└── test_dependencies.py # Dependency injection
Integration Tests
Location: tests/integration/
Coverage: DB API integration
Speed: Medium (few seconds)
Dependencies: DB API service
What They Test
- FastAPI endpoints with real DB API
- Authentication flow
- Settings CRUD operations
- Analysis session management
- Error handling from external service
Prerequisites
DB API must be running at http://localhost:8081
Check health:
curl http://localhost:8081/health
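The test fixtures can probe this same health endpoint and skip the run cleanly instead of failing hard when the service is down. A sketch of such a conftest.py fixture follows; the fixture name and the 2-second probe timeout are assumptions.

# tests/integration/conftest.py sketch: skip the integration run when the
# DB API health check fails. Fixture name and probe timeout are assumptions.
import httpx
import pytest


@pytest.fixture(scope="session", autouse=True)
def require_db_api():
    try:
        httpx.get("http://localhost:8081/health", timeout=2).raise_for_status()
    except httpx.HTTPError:
        pytest.skip("DB API is not reachable at http://localhost:8081")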
Configuration
Create tests/integration/.env.integration:
cp tests/integration/.env.integration.example tests/integration/.env.integration
Edit with your test credentials:
TEST_DB_API_URL=http://localhost:8081/api/v1
TEST_LOGIN=99999999 # 8-digit test user
Running Integration Tests
# Check prerequisites and run
.\run_integration_tests.bat
# Or manually
pytest tests/integration/ -v -m integration
Test Files
tests/integration/
├── conftest.py # Integration fixtures
├── .env.integration.example # Config template
├── .env.integration # Your config (not in git)
├── README.md # Integration test docs
├── test_auth_integration.py # Auth with DB API
├── test_settings_integration.py # Settings with DB API
├── test_analysis_integration.py # Sessions with DB API
└── test_query_integration.py # Query endpoints (DB API part)
What's NOT Tested
Integration tests do not call RAG backends:
- RAG queries are mocked (see the conftest sketch after this list)
- Only DB API integration is tested
- Use E2E tests for full RAG testing
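One way to enforce this boundary is an autouse fixture that stubs the RAG call for every integration test. The sketch below assumes a hypothetical RagService class, module path, and query signature.

# Integration conftest.py sketch: stub the RAG service so only the DB API
# path is exercised. RagService path and query signature are hypothetical.
import pytest


@pytest.fixture(autouse=True)
def mock_rag_backend(monkeypatch):
    async def fake_query(self, question, **kwargs):
        return {"answer": "stubbed response", "sources": []}

    # Patch at the service boundary; no HTTP request reaches a RAG backend.
    monkeypatch.setattr("app.services.rag_service.RagService.query", fake_query)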
End-to-End Tests
Location: tests/e2e/
Coverage: Complete user workflows
Speed: Slow (minutes)
Dependencies: All services (FastAPI + DB API + RAG backends)
What They Test
- Complete authentication → query → save → retrieve flow
- Real RAG backend calls (IFT, PSI, PROD)
- Cross-environment functionality
- Data persistence end-to-end
- User isolation and security
- Error scenarios with real services
Prerequisites
All services must be running:
- ✅ DB API at http://localhost:8081
- ✅ IFT RAG backend (configured host)
- ✅ PSI RAG backend (configured host)
- ✅ PROD RAG backend (configured host)
- ✅ Test user exists in DB API
- ✅ Valid bearer tokens for RAG backends
Configuration
Create tests/e2e/.env.e2e:
cp tests/e2e/.env.e2e.example tests/e2e/.env.e2e
Edit with your configuration (see .env.e2e.example for all variables):
E2E_DB_API_URL=http://localhost:8081/api/v1
E2E_TEST_LOGIN=99999999
E2E_IFT_RAG_HOST=ift-rag.example.com
E2E_IFT_BEARER_TOKEN=your_token_here
# ... more config
⚠️ Security: .env.e2e contains real credentials - never commit to git!
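In the test fixtures, these variables can be loaded once per session from the file's own directory. A sketch, assuming python-dotenv is available as a development dependency:

# tests/e2e/conftest.py sketch: load settings from .env.e2e next to this
# file. Assumes python-dotenv is installed as a development dependency.
from pathlib import Path

from dotenv import load_dotenv

load_dotenv(Path(__file__).parent / ".env.e2e")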
Running E2E Tests
# Check prerequisites and run all E2E
.\run_e2e_tests.bat
# Or manually run all E2E tests
pytest tests/e2e/ -v -m e2e
# Run environment-specific tests
pytest tests/e2e/ -v -m e2e_ift # IFT only
pytest tests/e2e/ -v -m e2e_psi # PSI only
pytest tests/e2e/ -v -m e2e_prod # PROD only
# Run specific test suite
pytest tests/e2e/test_full_flow_e2e.py -v # Workflows
pytest tests/e2e/test_rag_backends_e2e.py -v # RAG backends
pytest tests/e2e/test_error_scenarios_e2e.py -v # Error cases
Test Files
tests/e2e/
├── conftest.py # E2E fixtures
├── .env.e2e.example # Config template
├── .env.e2e # Your config (not in git)
├── README.md # E2E test docs (detailed)
├── test_full_flow_e2e.py # Complete workflows
├── test_rag_backends_e2e.py # RAG integration
└── test_error_scenarios_e2e.py # Error handling
What Each Suite Covers
Complete Workflows:
- User login → JWT token
- Get/update settings
- Bench mode queries to RAG
- Backend mode queries with sessions
- Save analysis sessions
- Retrieve and delete sessions
RAG Backend Integration:
- IFT RAG (bench mode)
- PSI RAG (backend mode)
- PROD RAG (bench mode)
- Session reset functionality
- Cross-environment queries
Error Scenarios:
- Authentication failures
- Validation errors
- Mode compatibility issues
- Resource not found
- Edge cases (long questions, special chars, etc.)
Timeouts
E2E tests use realistic timeouts:
- RAG queries: 120 seconds (2 minutes; see the sketch after this list)
- Large batches: 180 seconds (3 minutes)
- DB API calls: 30 seconds
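A sketch of how the 2-minute RAG budget might be expressed with httpx; the endpoint, payload shape, and 10-second connect limit are assumptions, not the project's actual client code.

# Sketch: applying the 120-second RAG budget with httpx. The endpoint,
# payload shape, and the 10-second connect limit are assumptions.
import httpx

RAG_TIMEOUT = httpx.Timeout(120.0, connect=10.0)


async def ask_rag(base_url: str, payload: dict) -> dict:
    async with httpx.AsyncClient(base_url=base_url, timeout=RAG_TIMEOUT) as client:
        response = await client.post("/query", json=payload)
        response.raise_for_status()
        return response.json()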
Cleanup
E2E tests automatically clean up after themselves. However, if tests fail catastrophically, you may need to manually delete test sessions.
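A fixture pattern that makes this cleanup automatic is to track every session a test creates and delete them in teardown, which runs even when assertions fail. In the sketch below, api_client, create_session, and delete_session are hypothetical names.

# Sketch of a self-cleaning E2E fixture. api_client, create_session, and
# delete_session are hypothetical names; teardown runs even on failure.
import pytest


@pytest.fixture
def analysis_session(api_client):
    created = []

    def _create(**kwargs):
        session = api_client.create_session(**kwargs)
        created.append(session["id"])
        return session

    yield _create
    for session_id in created:
        api_client.delete_session(session_id)  # best-effort teardown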
Test Markers
Use pytest markers to run specific test categories:
# Unit tests only
pytest -m unit
# Integration tests only
pytest -m integration
# All E2E tests
pytest -m e2e
# E2E tests for specific environment
pytest -m e2e_ift
pytest -m e2e_psi
pytest -m e2e_prod
# Slow tests only
pytest -m slow
# Everything except E2E
pytest -m "not e2e"
# Unit and integration (no E2E)
pytest -m "unit or integration"
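Custom markers must be registered, or pytest emits PytestUnknownMarkWarning for each one. A sketch of what the registration might look like in pytest.ini; the descriptions are illustrative.

# pytest.ini sketch: register the custom markers used above.
[pytest]
markers =
    unit: fast, isolated tests with all dependencies mocked
    integration: tests that require the DB API service
    e2e: full-stack tests that require all services
    e2e_ift: E2E tests against the IFT RAG backend
    e2e_psi: E2E tests against the PSI RAG backend
    e2e_prod: E2E tests against the PROD RAG backend
    slow: long-running tests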
CI/CD Integration
Recommended CI/CD Pipeline
# .github/workflows/test.yml example
name: Tests

on: [push, pull_request]

jobs:
  unit-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run unit tests
        run: pytest tests/unit/ -v -m unit --cov=app
      - name: Upload coverage
        uses: codecov/codecov-action@v3

  integration-tests:
    runs-on: ubuntu-latest
    services:
      db-api:
        image: your-db-api:latest
        ports:
          - 8081:8081
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Create .env.integration
        run: |
          echo "TEST_DB_API_URL=http://localhost:8081/api/v1" > tests/integration/.env.integration
          echo "TEST_LOGIN=${{ secrets.TEST_LOGIN }}" >> tests/integration/.env.integration
      - name: Run integration tests
        run: pytest tests/integration/ -v -m integration

  e2e-tests:
    runs-on: ubuntu-latest
    # Only run E2E on main branch or releases
    if: github.ref == 'refs/heads/main'
    services:
      db-api:
        image: your-db-api:latest
        ports:
          - 8081:8081
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Create .env.e2e
        run: |
          echo "E2E_DB_API_URL=${{ secrets.E2E_DB_API_URL }}" > tests/e2e/.env.e2e
          echo "E2E_TEST_LOGIN=${{ secrets.E2E_TEST_LOGIN }}" >> tests/e2e/.env.e2e
          # ... add all other secrets
      - name: Run E2E tests
        run: pytest tests/e2e/ -v -m e2e
CI/CD Best Practices
- Always run unit tests (fast, no dependencies)
- Run integration tests if DB API is available
- Run E2E tests only on main branch or before deployment
- Use secrets for test credentials
- Cache dependencies to speed up builds
- Parallel execution for unit tests (e.g. pytest -n auto with pytest-xdist)
- Generate coverage reports for unit tests only
Troubleshooting
Unit Tests Failing
Check:
- Virtual environment activated
- All dependencies installed: pip install -r requirements.txt
- No syntax errors in test files
Integration Tests Skipped
Cause: DB API not running or not configured
Fix:
- Start the DB API: docker-compose up db-api
- Check health: curl http://localhost:8081/health
- Verify .env.integration exists and is correct
E2E Tests Skipped
Cause: Required services not running
Fix:
- Check DB API health
- Verify RAG backends are accessible
- Confirm .env.e2e is configured
- Ensure the test user exists in the DB API
Timeout Errors
Cause: RAG backends slow or unavailable
Fix:
- Check RAG backend health
- Verify bearer tokens are valid
- Check network connectivity
- Increase timeout if needed (in test files)
Authentication Failures
Cause: Invalid credentials or test user doesn't exist
Fix:
- Verify the test user exists: curl -X POST "http://localhost:8081/api/v1/users/login?login=99999999"
- Check that bearer tokens are valid
- Ensure JWT_SECRET_KEY matches between environments
Best Practices
During Development
- Write unit tests first (TDD approach)
- Run unit tests frequently (on every change)
- Use the --lf flag to re-run the last failed tests: pytest --lf
- Use the -x flag to stop on the first failure: pytest -x
- Use the -k flag to run tests matching an expression: pytest -k "test_auth"
Before Committing
- ✅ Run all unit tests
- ✅ Check coverage (should be > 90%; see the command after this list)
- ✅ Run integration tests if DB API available
- ✅ Fix any failing tests
- ✅ Review coverage report for gaps
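The coverage bar can be enforced directly with pytest-cov's threshold flag; adjust the number to the team's current bar.

# Fail the run if unit test coverage drops below 90% (pytest-cov flag)
pytest tests/unit/ -v --cov=app --cov-fail-under=90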
Before Deploying
- ✅ Run all unit tests
- ✅ Run all integration tests
- ✅ Run all E2E tests
- ✅ Verify all tests pass
- ✅ Check for any warnings
- ✅ Review test results summary
Writing New Tests
For new features:
- Start with unit tests (business logic)
- Add integration tests (DB API interaction)
- Add E2E test (complete workflow)
For bug fixes:
- Write a failing test that reproduces the bug (see the sketch after this list)
- Fix the bug
- Verify the test passes
- Add tests for related edge cases
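A sketch of such a regression test; validate_login and its import path are hypothetical illustration names, and the comment records the bug being pinned down so it never silently returns.

# Sketch of a regression test for a fixed bug. validate_login and its
# import path are hypothetical illustration names.
import pytest

from app.utils.validation import validate_login  # hypothetical


@pytest.mark.unit
def test_login_with_leading_zero_is_accepted():
    # Regression: logins such as "01234567" were once rejected because the
    # validator coerced the value to int and dropped the leading zero.
    assert validate_login("01234567") is True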
Test Coverage Goals
- Unit Test Coverage: > 95%
- Integration Test Coverage: All DB API endpoints
- E2E Test Coverage: All critical user workflows
Current coverage:
- ✅ Unit: 99%
- ✅ Integration: All DB API integration points
- ✅ E2E: Complete workflows + error scenarios
Related Documentation
- Unit Tests - Fast isolated tests
- Integration Tests - DB API integration
- E2E Tests - Full stack testing
- DB API Contract - External API spec
- PROJECT_STATUS.md - Implementation status
Summary
┌─────────────┬───────┬──────────────┬───────────────┐
│ Test Type   │ Speed │ Dependencies │ When to Run   │
├─────────────┼───────┼──────────────┼───────────────┤
│ Unit        │ Fast  │ None         │ Always        │
│ Integration │ Med   │ DB API       │ Before commit │
│ E2E         │ Slow  │ All services │ Before deploy │
└─────────────┴───────┴──────────────┴───────────────┘
Run unit tests constantly ⚡
Run integration tests regularly 🔄
Run E2E tests before deployment 🚀