This checklist is part of the Agentic SDLC — How to Deliver with Confidence guide, but it stands on its own; no prior reading is required.
It provides step-by-step actions for teams establishing governance over AI-generated code for the first time. Each item is a concrete action; complete them in order.
1. Capture Existing Decisions
- Identify the architecture decisions your team already follows implicitly (patterns, conventions, boundaries)
- Write each decision as a short record: what was decided, why, and what it constrains
- Store decision records in a single, known location accessible to all engineers
- Identify the team members who made or own each decision (for future review and updates)
- Review each decision with at least one other engineer to confirm it reflects current practice, not aspirational intent
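As a sketch, one possible shape for a decision record — the field names are illustrative, not prescribed by the guide, and the placeholders should be filled in per decision:

```markdown
## Decision: <short title>
- What was decided: <one or two sentences>
- Why: <the problem or risk this addresses>
- What it constrains: <the code, layers, or workflows affected>
- Owner: <person or role>
- Status: active | superseded
```

Keeping the record this small lowers the cost of capturing a decision, which matters more than completeness when you are recording implicit decisions for the first time.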
2. Define Invariants
- List the hard constraints that must never be violated (e.g., “no direct database access from controllers,” “all API endpoints require authentication”)
- Write each invariant as an unambiguous, testable statement
- Confirm each invariant with the team — invariants are not suggestions, they are rules
- Separate invariants from conventions: invariants are non-negotiable, conventions are defaults that may be overridden with justification
- Store invariants alongside decision records, clearly labeled as invariants
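To show what "unambiguous and testable" can mean in practice, here is a minimal sketch that turns the example invariant "no direct database access from controllers" into an automated check. The forbidden module names and the sample sources are assumptions for illustration; a real check would use your own module list.

```python
import ast

# Hypothetical module list for the invariant "no direct database
# access from controllers" -- substitute your own database libraries.
FORBIDDEN_MODULES = {"sqlalchemy", "psycopg2"}

def violates_db_invariant(source: str) -> bool:
    """Return True if the controller source imports a forbidden database module."""
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            names = {alias.name.split(".")[0] for alias in node.names}
        elif isinstance(node, ast.ImportFrom):
            names = {(node.module or "").split(".")[0]}
        else:
            continue
        if names & FORBIDDEN_MODULES:
            return True
    return False

# Illustrative sources: the first goes through a service layer, the second does not.
print(violates_db_invariant("from app.services import user_service"))  # False
print(violates_db_invariant("import sqlalchemy"))                      # True
```

An invariant that can be expressed this way is enforceable by a machine; one that cannot is probably still a convention.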
3. Document Conventions
- List the coding conventions the team follows: naming patterns, file structure, error handling approach, logging standards, test organization
- Write each convention clearly enough that an engineer new to the team could follow it without asking questions
- Include the “why” for each convention — conventions without rationale are harder to enforce and easier to ignore
- Organize conventions by scope: project-wide conventions, language-specific conventions, and framework-specific conventions
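For illustration, a single convention entry in this shape carries the rule, the rationale, and the override policy together — all file names here are hypothetical:

```markdown
## Convention (project-wide): tests mirror the source layout
- Rule: tests for `src/orders/pricing.py` live in `tests/orders/test_pricing.py`
- Why: reviewers and the generator can locate a module's tests without searching
- Override: allowed with a one-line justification in the pull request
```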
4. Compile Decisions into Enforcement
- Select the 5-10 highest-impact decisions and invariants to compile first (do not attempt to compile everything at once)
- For each selected decision, determine the enforcement mechanism: rule file for the generator, pre-merge gate, post-generation scan, or a combination
- Create rule files that express the selected decisions in a format the AI generator can consume at prompt time
- Configure pre-merge checks that verify compliance with the selected invariants
- Test each enforcement mechanism: intentionally violate a constraint and confirm it is detected
- Document which decisions are compiled and which are documentation-only — make the gap visible
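A pre-merge gate for a compiled invariant can be as small as the sketch below. The directory layout (`app/controllers/`) and the plain-string patterns are assumptions; a real gate would use an AST- or linter-based rule rather than substring matching, but the shape — scan, report, non-zero exit on violation — is the same.

```python
import pathlib

# Assumed patterns for the "no direct database access from controllers"
# invariant; deliberately simple stand-ins for a real linter rule.
FORBIDDEN = ("import sqlalchemy", "import psycopg2")

def scan(root: pathlib.Path) -> list[str]:
    """Return the paths of Python files under root that match a forbidden pattern."""
    return [
        str(path)
        for path in sorted(root.glob("**/*.py"))
        if any(pattern in path.read_text() for pattern in FORBIDDEN)
    ]

def run_gate(root: pathlib.Path) -> int:
    """Exit-code style result: 0 when compliant, 1 when any violation is found."""
    violations = scan(root)
    for path in violations:
        print(f"invariant violation: {path}")
    return 1 if violations else 0

# In CI, the assumed controllers directory would be wired up as:
# raise SystemExit(run_gate(pathlib.Path("app/controllers")))
```

Note that the gate prints every violation before failing, which makes the "intentionally violate a constraint and confirm it is detected" test in this section easy to run.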
5. Establish the Review Cadence
- Define when governance is reviewed: after every major feature, monthly, or quarterly (pick one and commit)
- Assign ownership: name the person or role responsible for maintaining the decision records, invariants, and compiled rules
- Create a lightweight template for new decisions so that capturing them is fast, not bureaucratic
- Run the first governance review: are all existing decisions still accurate? Are any decisions missing? Are compiled rules still aligned with the records they enforce?
6. Validate the Setup
- Generate a small, well-understood piece of code under the compiled governance constraints
- Verify that the generator followed the compiled rules
- Run the pre-merge checks and confirm they pass for compliant code and fail for non-compliant code
- Have one engineer who was not involved in governance setup review the compiled rules and confirm they are understandable
- Document any gaps found during validation and add them to the governance backlog