

Rod Claar

Step 1: Set Up Your AI-Assisted Workflow

By the end of this step, you will have a repeatable AI workflow that produces consistent, reviewable outputs and slots cleanly into your existing development practices (branching, PRs, CI, code review).

1.1 Define the “contract” for AI use

Treat AI like a service with a clear interface.

  • Allowed work (good fits)

    • Drafting code scaffolds and tests

    • Refactoring suggestions

    • Generating acceptance criteria, edge cases, and test data

    • Explaining unfamiliar code paths

  • Disallowed work (requires human ownership)

    • Final security decisions

    • Anything involving secrets, keys, customer data

    • Unreviewed direct commits to main

Deliverable: a short “AI Use Policy” section in your repo README or engineering handbook.
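One way to make such a policy enforceable rather than aspirational is to express it as data that tooling (a PR bot or pre-commit hook, say) can consult. A minimal Python sketch; the task labels are illustrative assumptions, not a fixed taxonomy from the policy above:

```python
# Hypothetical "AI Use Policy" expressed as data so a bot or hook can
# consult it. Task labels here are illustrative, not prescribed.
AI_USE_POLICY = {
    "allowed": {
        "draft_scaffold", "draft_tests", "refactor_suggestion",
        "generate_acceptance_criteria", "explain_code",
    },
    "human_owned": {
        "security_decision", "handle_secrets_or_customer_data",
        "direct_commit_to_main",
    },
}

def classify_task(task: str) -> str:
    """Return 'allowed', 'human_owned', or 'unknown' for a task label."""
    for verdict, tasks in AI_USE_POLICY.items():
        if task in tasks:
            return verdict
    return "unknown"  # anything unlisted defaults to human review
```

Routing unknown tasks to "unknown" (and therefore to human review) keeps the policy fail-safe as new uses of AI appear.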

1.2 Create a standard prompt structure (your “prompt template”)

Use the same headings every time so outputs are predictable and comparable.

Prompt Template

  1. Goal: what you want (single sentence)

  2. Context: relevant code/design constraints, definitions, domain rules

  3. Inputs: files/snippets/data (only what’s needed)

  4. Constraints: libraries, style guides, performance/security requirements

  5. Output format: exact structure (diff, checklist, test plan, ADR, etc.)

  6. Quality bar: tests required, linting, complexity limits, edge cases

  7. Assumptions & questions: what to do if information is missing

Guardrail rule: If missing info prevents correctness, the AI must list assumptions explicitly instead of guessing.
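The template above is easy to mechanize. A small sketch, assuming Python and the seven headings exactly as listed, that refuses to render a prompt with a missing section, so gaps surface before the AI can guess:

```python
# Render the seven-part prompt template; heading names mirror the
# template above. A missing section is an error, not something to
# paper over silently.
PROMPT_SECTIONS = [
    "Goal", "Context", "Inputs", "Constraints",
    "Output format", "Quality bar", "Assumptions & questions",
]

def build_prompt(fields: dict) -> str:
    """Assemble a prompt from a {heading: text} dict in template order."""
    missing = [s for s in PROMPT_SECTIONS if s not in fields]
    if missing:
        raise ValueError(f"missing sections: {missing}")
    return "\n\n".join(f"## {s}\n{fields[s]}" for s in PROMPT_SECTIONS)
```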

 

1.3 Add “reviewability” guardrails

Make every response easy to inspect.

Require the AI to produce:

  • A small, bounded change set (no “rewrite everything”)

  • Rationale per change (1–2 lines each)

  • Risk notes (what might break)

  • Test impact (new/updated tests, how to run)

  • Checklist for reviewers

Example output formats

  • “Provide a unified diff”

  • “Return a PR description: Summary / Changes / Tests / Risks”

  • “Return an acceptance test plan in Gherkin”

  • “Return a table: Edge case | Expected behavior | Test approach”
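Fixed output formats also make mechanical checks possible. As one sketch, a check that an AI-drafted PR description actually contains the Summary / Changes / Tests / Risks sections before a human spends review time on it:

```python
import re

# Verify that a PR description follows the "Summary / Changes /
# Tests / Risks" structure requested from the AI.
REQUIRED_SECTIONS = ["Summary", "Changes", "Tests", "Risks"]

def missing_sections(pr_body: str) -> list:
    """Return the required headings absent from the PR body."""
    return [
        s for s in REQUIRED_SECTIONS
        if not re.search(rf"^#*\s*{s}\b", pr_body,
                         re.MULTILINE | re.IGNORECASE)
    ]
```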

1.4 Integrate into the normal dev flow (PR-first)

Keep AI outputs inside the same governance you already trust.

Recommended workflow:

  1. Create a branch (human-owned)

  2. Use AI to draft code/tests/docs

  3. Run tests and linters locally

  4. Open PR with AI-generated summary + your review notes

  5. CI gates + human review

  6. Merge

Key principle: AI can propose; humans approve.

1.5 Build your “context pack” (reusable, minimal)

A context pack is the small, reusable set of material you feed the AI alongside every request.

Include:

  • Architecture summary (1 page)

  • Coding standards (lint rules, formatting)

  • Domain glossary (terms, invariants)

  • Test conventions (naming, fixtures, patterns)

  • Security constraints (red lines)

Keep it short enough to paste or reference reliably.
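If the pack lives as a handful of small files in the repo, assembling it for pasting can be automated. A sketch, where the file names and size budget are assumptions for illustration:

```python
from pathlib import Path

# Concatenate context-pack files into one paste-ready block and warn
# when it outgrows a rough size budget. File names are illustrative.
PACK_FILES = [
    "architecture.md", "coding-standards.md", "glossary.md",
    "test-conventions.md", "security-constraints.md",
]

def build_context_pack(root: str, budget_chars: int = 12_000) -> str:
    """Join whichever pack files exist under root; flag oversize packs."""
    parts = []
    for name in PACK_FILES:
        path = Path(root) / name
        if path.exists():
            parts.append(f"### {name}\n{path.read_text()}")
    pack = "\n\n".join(parts)
    if len(pack) > budget_chars:
        print(f"warning: context pack is {len(pack)} chars; trim it")
    return pack
```

The character budget is a stand-in for whatever limit your AI tool imposes; the point is that "short enough to paste reliably" becomes a measured property, not a feeling.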

1.6 Step completion checklist

You’re done with Step 1 when you have:

  • A written AI use policy (what’s allowed/not allowed)

  • A prompt template used by the team

  • Standard output formats (diff, PR summary, test plan)

  • A PR-first integration workflow

  • A reusable context pack


Step 1 “artifact” you can reuse (copy/paste)

Definition of Done for AI outputs

  • Must list assumptions explicitly

  • Must provide bounded changes (no unscoped rewrites)

  • Must include rationale + risks

  • Must include tests and how to run them

  • Must be suitable for PR review
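The same Definition of Done can be screened automatically before review reaches a human. A deliberately crude keyword sketch; the keywords are assumptions, and a real gate would check against your actual output format:

```python
# Rough Definition-of-Done screen for AI output text. The keyword
# heuristics are illustrative; tighten them to your own format.
DOD_CHECKS = {
    "lists assumptions": "assumption",
    "includes rationale": "rationale",
    "notes risks": "risk",
    "includes tests": "test",
}

def dod_failures(ai_output: str) -> list:
    """Return DoD items the output appears to miss."""
    text = ai_output.lower()
    return [item for item, keyword in DOD_CHECKS.items()
            if keyword not in text]
```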
