Quality Gates & Release Policy

Why quality gates matter

ML systems fail not only because of code bugs, but also because of data issues, silent regressions, and configuration drift.

Quality gates prevent unsafe changes from reaching production.


Implemented quality gates

Code quality

  • linting and formatting checks,
  • static analysis.
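A code-quality gate like this can be sketched as a small runner that executes each tool and passes only if all of them exit cleanly. This is a minimal sketch; the specific tools (ruff, black, mypy) and paths are assumptions, not necessarily what this project uses.

```python
import subprocess

# Hypothetical tool invocations; the concrete linters and paths are assumptions.
CHECKS = [
    ["ruff", "check", "."],      # linting
    ["black", "--check", "."],   # formatting
    ["mypy", "src/"],            # static analysis
]

def code_quality_gate(checks=CHECKS, runner=subprocess.run):
    """Run each tool; the gate passes only if every one exits with code 0."""
    codes = {" ".join(cmd): runner(cmd).returncode for cmd in checks}
    return all(rc == 0 for rc in codes.values()), codes
```

The `runner` parameter makes the gate testable without invoking real tools, which is also how the gate itself can be unit tested in CI.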

Testing

  • unit tests for data and ML logic,
  • integration tests for pipelines and API.
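A unit test for data logic might look like the following. The function under test (`drop_invalid_rows`) and its rules are hypothetical, chosen only to show the shape of such a test.

```python
# Hypothetical data-cleaning function under test; names and rules are illustrative.
def drop_invalid_rows(rows):
    """Keep rows with a non-empty 'user_id' and a non-negative 'amount'."""
    return [r for r in rows if r.get("user_id") and r.get("amount", -1) >= 0]

def test_drop_invalid_rows():
    rows = [
        {"user_id": "a", "amount": 10},
        {"user_id": "", "amount": 5},    # empty user_id -> dropped
        {"user_id": "b", "amount": -1},  # negative amount -> dropped
    ]
    assert drop_invalid_rows(rows) == [{"user_id": "a", "amount": 10}]
```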

Data contracts

  • Great Expectations checks,
  • blocking on schema or critical constraint violations.
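The blocking behavior can be illustrated with a minimal pure-Python stand-in. This is not the Great Expectations API; it is a sketch of the contract semantics, with an assumed example schema, where any schema or type violation raises and therefore stops the pipeline.

```python
# Minimal stand-in for a data contract; the real checks in this project run
# through Great Expectations, so the schema and names here are illustrative.
REQUIRED_COLUMNS = {"user_id": str, "amount": float}

class ContractViolation(Exception):
    """Raised on any schema or critical constraint violation (blocks the run)."""

def enforce_contract(record, schema=REQUIRED_COLUMNS):
    for column, expected_type in schema.items():
        if column not in record:
            raise ContractViolation(f"missing column: {column}")
        if not isinstance(record[column], expected_type):
            raise ContractViolation(f"bad type for column: {column}")
    return True
```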

ML sanity checks

  • pipeline smoke runs on reduced datasets,
  • metric sanity thresholds.
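A metric sanity threshold is simply a floor per metric that a smoke run must clear. The thresholds below are illustrative assumptions, not this project's actual values.

```python
# Illustrative floors; real values are project-specific assumptions.
SANITY_THRESHOLDS = {"accuracy": 0.70, "auc": 0.60}

def metrics_are_sane(metrics, thresholds=SANITY_THRESHOLDS):
    """A smoke run passes only if every tracked metric clears its floor."""
    return all(metrics.get(name, 0.0) >= floor
               for name, floor in thresholds.items())
```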

Release policy

A release is created only if:

  • all quality gates pass,
  • images are successfully built and pushed,
  • deployment verification succeeds.

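The release decision reduces to a conjunction of these conditions. A minimal sketch, with assumed condition names, that also reports which conditions block the release:

```python
def release_decision(conditions):
    """Release only when every policy condition holds; otherwise name blockers."""
    blockers = [name for name, ok in conditions.items() if not ok]
    return len(blockers) == 0, blockers
```

For example, `release_decision({"gates": True, "images": False, "verify": True})` would block the release and name `images` as the reason.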

Artifact traceability

Every production deployment can be traced to:

  • git commit,
  • Docker image digest,
  • dataset version,
  • model version.

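These four identifiers can be pinned together in an immutable manifest attached to the release. A minimal sketch; the field values shown are placeholders, not real artifacts.

```python
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class ReleaseManifest:
    """Immutable record pinning a deployment to the exact inputs it was built from."""
    git_commit: str
    image_digest: str
    dataset_version: str
    model_version: str
```

`frozen=True` prevents mutation after the manifest is written, and `asdict` serializes it for storage alongside the release.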

Failure handling

If a quality gate fails:

  • the pipeline stops,
  • no partial deployment occurs,
  • the failure is visible and actionable.
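The fail-fast semantics can be sketched as a gate runner that raises on the first failure, naming the failed gate so the error is actionable; gate names and the exception type are illustrative assumptions.

```python
class GateFailure(Exception):
    """Names the failed gate so the failure is visible and actionable."""
    def __init__(self, gate):
        super().__init__(f"quality gate failed: {gate}")
        self.gate = gate

def run_gates(gates):
    """Run gates in order; stop at the first failure so nothing is deployed."""
    for name, check in gates:
        if not check():
            raise GateFailure(name)
    return True
```

Because the runner raises instead of continuing, no step after the failed gate executes, which is what guarantees there is no partial deployment.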