What Is a Definition of Done?
A Definition of Done (DoD) is a shared checklist that defines what "complete" means for your team. It's the quality standard that every user story, task, or feature must meet before being considered done. Without a DoD, "done" means different things to different people—leading to incomplete work, technical debt, and friction during sprint reviews.
Think of it as your team's quality contract. If a story doesn't meet the DoD, it's not done—period.
Is This Story Done?
Before marking any story complete, verify it passes your team's Definition of Done.
Definition of Done vs Acceptance Criteria
These terms often get confused, but they serve different purposes. Here's how they differ:
Definition of Done
- Universal: Applies to every story
- Process-focused: Quality gates and standards
- Team-owned: Entire team agrees
- Stable: Changes rarely
Example: "All code must be reviewed, tested, and deployed to staging"
Acceptance Criteria
- Specific: Unique to each story
- Feature-focused: What the feature must do
- PO-owned: Product Owner defines
- Variable: Different per story
Example: "Users can reset password via email and receive confirmation within 5 minutes"
| Aspect | Definition of Done | Acceptance Criteria |
|---|---|---|
| Scope | Applies to ALL work items | Specific to ONE user story |
| Changes | Stable across sprints | Different for each story |
| Owner | Entire team agrees | Product Owner defines |
| Focus | Quality and process | Functionality and behavior |
Sample Definition of Done by Team Type
Your DoD should reflect your team's context and workflow. Here are examples for different types of teams:
Software Development
- ✓ Code reviewed by at least one peer
- ✓ All unit tests pass (>80% coverage; see the coverage-gate sketch after this list)
- ✓ Integration tests pass
- ✓ No critical security vulnerabilities
- ✓ Documentation updated
- ✓ Deployed to staging environment
- ✓ Product Owner accepts functionality
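If your team uses CI, several of these gates can be enforced automatically rather than checked from memory. Below is a minimal sketch of the coverage gate, assuming your test runner writes a Cobertura-style coverage.xml (as coverage.py's `coverage xml` command does); the report path and the 80% threshold are assumptions to adapt to your own setup.

```python
# dod_coverage_gate.py -- minimal sketch of one automated DoD item:
# "All unit tests pass (>80% coverage)".
# Assumes a Cobertura-style coverage.xml in the working directory;
# the path and the threshold are assumptions, not a prescribed setup.
import sys
import xml.etree.ElementTree as ET

THRESHOLD = 0.80  # assumed DoD threshold: 80% line coverage


def coverage_rate(report_path: str = "coverage.xml") -> float:
    """Read the overall line-rate attribute from a Cobertura report."""
    root = ET.parse(report_path).getroot()
    return float(root.attrib["line-rate"])


if __name__ == "__main__":
    rate = coverage_rate()
    print(f"Line coverage: {rate:.1%} (DoD threshold: {THRESHOLD:.0%})")
    # A non-zero exit fails the CI job, so the story can't be marked done.
    sys.exit(0 if rate >= THRESHOLD else 1)
```

Wiring a check like this into your pipeline turns one DoD line item into a gate the build enforces instead of a box someone remembers to tick.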
Marketing Team
- ✓ Content reviewed by editor
- ✓ Brand guidelines followed
- ✓ SEO metadata added
- ✓ Images optimized and alt text added
- ✓ Legal/compliance review complete
- ✓ Published to production
- ✓ Analytics tracking implemented
Design Team
- ✓ Design critique completed
- ✓ Accessibility standards met (WCAG)
- ✓ Design system components used
- ✓ Responsive designs reviewed
- ✓ Developer handoff documentation complete
- ✓ Stakeholder approval received
- ✓ Design files organized in shared library
Common DoD Items by Category
Use these categories as a starting point when creating your team's Definition of Done (a sketch of automating a few of these checks follows the lists):
✓ Code Quality
- Code follows team style guide
- No new linter warnings
- Code reviewed and approved
- Technical debt documented
- Performance benchmarks met
✓ Testing
- Unit tests written and passing
- Integration tests pass
- Regression tests pass
- Edge cases tested
- Tested in multiple browsers/devices
✓ Documentation
- README updated if needed
- API documentation current
- Inline code comments added
- User-facing docs updated
- Change log entry added
✓ Deployment
- Deployed to staging successfully
- Database migrations tested
- Feature flags configured
- Monitoring/alerts set up
- Rollback plan documented
✓ Review & Acceptance
- Product Owner reviewed and accepted
- UX/Design sign-off received
- Security review completed
- Meets acceptance criteria
- Demo-ready
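Some of the Code Quality and Testing items above can be verified automatically before a human reviewer ever looks at the work. The sketch below bundles a couple of them into a single pre-review script; the tool choices (ruff for linting, pytest for tests) are assumptions, so substitute whatever commands your team actually runs.

```python
# dod_gates.py -- sketch of a pre-review script that runs the automatable
# DoD checks (code quality, testing) in one pass.
# Tool choices (ruff, pytest) are assumptions; use your team's commands.
import subprocess

# Each gate pairs a DoD item with the command that verifies it.
GATES = {
    "No new linter warnings": ["ruff", "check", "."],
    "Unit tests written and passing": ["pytest", "-q"],
}


def run_gates() -> bool:
    all_passed = True
    for item, command in GATES.items():
        result = subprocess.run(command)
        status = "PASS" if result.returncode == 0 else "FAIL"
        print(f"[{status}] {item}")
        all_passed = all_passed and result.returncode == 0
    return all_passed


if __name__ == "__main__":
    raise SystemExit(0 if run_gates() else 1)
```

Items that can't be scripted (Product Owner review, design sign-off, rollback plans) stay on the human checklist; automation just shrinks the part people have to remember.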
How to Create Your Team's Definition of Done
Start with a Workshop
Gather your whole team (developers, testers, product owner, designers) for a dedicated session. Ask: "What does done mean to us?"
- Block 90 minutes for initial creation
- Use a whiteboard or collaborative doc
- Start with brainstorming; no filtering yet
Identify Common Quality Gates
What quality checks should EVERY story pass? Think code review, testing, documentation, deployment.
- Look at your last 5 completed stories
- What steps were consistently needed?
- What problems could have been caught earlier?
Keep It Practical
Your DoD should be achievable within a sprint. If items can't be done every sprint, they don't belong in the DoD.
- Aim for 5-10 items maximum initially
- Make each item verifiable (yes/no; see the sketch after this list)
- Avoid vague terms like "sufficient" or "adequate"
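A quick test of whether an item is verifiable: could a reviewer answer it with a plain yes or no? The sketch below models a DoD that way; the item wording and the example answers are illustrative assumptions, not a recommended checklist.

```python
# Sketch: a DoD works only if every item can be answered yes or no.
# The items and example answers below are illustrative assumptions.
definition_of_done = [
    "Code reviewed and approved",
    "Unit tests written and passing",
    "Documentation updated",
    "Deployed to staging",
    "Product Owner accepted",
]


def story_is_done(answers: dict[str, bool]) -> bool:
    """A story is done only when every DoD item is answered 'yes'."""
    missing = [item for item in definition_of_done if not answers.get(item, False)]
    for item in missing:
        print(f"Not done yet: {item}")
    return not missing


# One unanswered item means the story is not done, period.
answers = {item: True for item in definition_of_done}
answers["Deployed to staging"] = False
print("Done?", story_is_done(answers))
```

An item like "adequate documentation" has no honest place in a structure like this, which is exactly why it doesn't belong in your DoD.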
Get Explicit Agreement
Everyone on the team must commit to the DoD. It's a shared quality standard, not a suggestion.
- Have each person verbally agree
- Post it visibly (physical or digital board)
- Make it part of your sprint review checklist
Evolve Over Time
Your DoD isn't permanent. As your team matures, raise the bar. Review and refine quarterly.
- Add items as your process improves
- Remove items that become automatic
- Adjust based on retrospective insights
Common DoD Mistakes to Avoid
Making it too long
Problem: A 30-item DoD becomes a burden, not a standard. Teams skip it or check boxes without thinking.
Solution: Start with 5-7 essential items. Add more only when current items become second nature.
Confusing with acceptance criteria
Problem: DoD is universal; AC is story-specific. Mixing them creates confusion about what applies when.
Solution: DoD = "Every story must..." / AC = "This story must..."
Setting unrealistic standards
Problem: If your DoD requires 100% test coverage and production deployment every sprint, but your team can't achieve that, it becomes aspirational fiction.
Solution: Be honest about current capabilities. Improve incrementally.
Never updating it
Problem: A DoD created two years ago may no longer reflect your process, tools, or quality standards.
Solution: Review DoD quarterly. Add items as you mature, remove what's redundant.
✓ Key Takeaways
1. DoD is a universal quality checklist that applies to every story your team completes
2. DoD focuses on process and quality gates; acceptance criteria focuses on functionality
3. Start with 5-7 essential items and evolve your DoD as your team matures
4. Make each DoD item verifiable with a clear yes/no answer; no vague criteria
5. Review and refine your DoD quarterly based on retrospective insights
Ready to Improve Your Team's Quality Standards?
Use our free planning poker tool to estimate stories and ensure they meet your Definition of Done.
Start Free Planning Poker Session



