Shadow Testing
This story demonstrates the Shadow Testing pattern: how to perform final verification before deployment by mirroring production traffic.
Why We Needed This
Tests as Contracts passed. But there are areas tests can’t cover:
- Bugs that only occur with specific data combinations
- Performance issues dependent on traffic patterns
- Issues that only surface at production data scale
- Clients using behaviors not specified in contracts
These are conditions you can’t create in test environments.
How It Works
Nginx Ingress in front of production Actionbase logs all requests and responses as Access Logs. These logs go to Kafka, and the same requests are replayed to the test environment. Since the capture is log-based, there is no impact on live service traffic.
```mermaid
flowchart LR
Client[Client] --> Nginx[Nginx Ingress]
Nginx --> Prod[Production Actionbase]
Nginx -->|Access Log| Kafka[Kafka]
Kafka -->|Replay| Test[Test Actionbase]
Prod -.->|Prod Response| Compare{Match?}
Test -.->|Test Response| Compare
Compare -->|Yes| Deploy[Deploy]
Compare -->|No| Block[Blocked]
```
- Capture: Nginx Ingress logs requests and responses to Kafka as Access Logs
- Replay: Same requests replayed to test environment
- Compare: Compare production response with test response
- Gate: Block deployment on mismatch
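The replay-and-compare steps above can be sketched in a few lines. This is a minimal illustration, not the actual pipeline: the log-entry fields (`method`, `uri`, `prod_status`, `prod_body`), the stubbed test response, and the `IGNORED_FIELDS` normalization are all assumptions; a real implementation would consume entries from Kafka and replay each request against the test environment over HTTP.

```python
import json

# Hypothetical JSON access-log entry, as Nginx Ingress might emit it to Kafka.
# The field names here are assumptions for illustration.
log_line = json.dumps({
    "method": "GET",
    "uri": "/v1/items/42",
    "prod_status": 200,
    "prod_body": {"id": 42, "name": "widget"},
})

# Volatile fields that legitimately differ between environments and
# should not count as a mismatch.
IGNORED_FIELDS = {"timestamp", "request_id"}

def normalize(body: dict) -> dict:
    """Drop fields excluded from the comparison."""
    return {k: v for k, v in body.items() if k not in IGNORED_FIELDS}

def compare(prod_status, prod_body, test_status, test_body):
    """Gate decision: pass only if status and normalized body both match."""
    if prod_status != test_status:
        return False, f"status mismatch: {prod_status} != {test_status}"
    if normalize(prod_body) != normalize(test_body):
        return False, "body mismatch"
    return True, "match"

entry = json.loads(log_line)
# In the real pipeline this response comes from replaying the request
# against the test environment; here it is stubbed.
test_status, test_body = 200, {"id": 42, "name": "widget", "request_id": "abc"}
ok, reason = compare(entry["prod_status"], entry["prod_body"], test_status, test_body)
print(ok, reason)  # the stubbed responses match once volatile fields are dropped
```

Normalization is the subtle part in practice: without an explicit ignore-list for volatile fields, every replay would flag a false mismatch and block deployment.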
Deployment Process
```mermaid
flowchart LR
Dev[Development] --> Contract[Tests as Contracts]
Contract --> Shadow[Shadow Testing]
Shadow --> Deploy[Deploy]
```
Both stages must pass before deployment proceeds.
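As a sketch, the gate reduces to an ordered pair of checks; the stage functions below are hypothetical stand-ins for the real suites, not the actual pipeline API:

```python
def run_contract_tests() -> bool:
    # Stub: in reality this runs the Tests as Contracts suite.
    return True

def run_shadow_testing() -> bool:
    # Stub: in reality this replays production traffic and compares responses.
    return True

def can_deploy() -> bool:
    """Deployment proceeds only when both stages pass, in order."""
    if not run_contract_tests():
        return False  # fail fast: no point replaying traffic for a broken build
    return run_shadow_testing()

print("deploy" if can_deploy() else "blocked")
```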
What We Learned
- Real traffic is irreplaceable. No matter how sophisticated test scenarios are, they can’t perfectly reproduce production traffic patterns.
- Final gate before deployment. Both Tests as Contracts and Shadow Testing must pass before deployment.