Done - Do you go beyond 'Done' and follow a 'Definition of Done'?

Last updated by Tiago Araújo [SSW] on 10 Jan 2022 11:49 pm

Having a clear Definition of Done for your team is critical to your success and quality management in Scrum.

Every team is different, but all need to agree on which items are in their "Definition of Done".

There are 3 levels of 'Done' in communication

Level 1

Level 2

  • Sending a "Done" email
  • Screenshots
  • Code

Level 3

  • Sending a "Done" email
  • Recording a quick and dirty "Done Video"
  • Code (showing a full scenario e.g. a user story)

Figure: Coded UI Test passes in Visual Studio

There are 8 levels of 'Done' in software quality

Start with these examples showing typical "Definitions of Done" from beginner teams to more mature teams:

Team - Level 1

  • The code compiles
  • All tasks are updated and closed
  • No high priority defects/bugs are on that user story

Team - Level 2

  • All of the above, plus
  • All unit tests passed
  • Greater than 1% code coverage (not earth-shattering, but you need to start somewhere)
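The Level 2 bar (passing unit tests plus some code coverage) can be sketched with a minimal test suite. The `calculate_discount` function and its business rule are hypothetical, used only to make the tests concrete:

```python
# Hypothetical business rule used only to illustrate a unit test suite
def calculate_discount(total: float, is_member: bool) -> float:
    """Members get 10% off orders over $100 (illustrative rule)."""
    if is_member and total > 100:
        return round(total * 0.9, 2)
    return total

# Tests a build can gate on (run with: python -m pytest)
def test_member_over_threshold_gets_discount():
    assert calculate_discount(200, True) == 180.0

def test_non_member_pays_full_price():
    assert calculate_discount(200, False) == 200
```

Measuring the coverage side is one extra command, e.g. `python -m coverage run -m pytest` followed by `python -m coverage report`.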

Team - Level 3

  • All of the above, plus
  • Successful build on the Build Server
  • Git branch policies
  • Azure DevOps check-in policy - Changeset Comments Policy (all check-ins must have a comment)
  • Azure DevOps check-in policy - Work Items (all check-ins must be associated with a work item)
  • Code reviewed by one other team member (e.g. Checked by Bill)
  • Sending a Done email with screenshots

Figure: Good example - Add check-in policies to enforce your Definition of Done
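The two check-in policies above can also be enforced client-side. Below is a minimal sketch of a Git `commit-msg` hook in Python; the `#1234` work item pattern is an assumption, so adjust it to whatever format your tracker uses.

```python
# Sketch of a client-side commit-msg hook enforcing the two policies above.
# The "#1234" work item pattern is an assumption; adjust to your tracker.
import re
import sys

WORK_ITEM_PATTERN = re.compile(r"#\d+")

def check_commit_message(message: str) -> list:
    """Return a list of policy violations (empty means the commit passes)."""
    errors = []
    if not message.strip():
        errors.append("Changeset Comments Policy: commit message is empty")
    if not WORK_ITEM_PATTERN.search(message):
        errors.append("Work Items policy: no work item reference (e.g. #1234)")
    return errors

if __name__ == "__main__" and len(sys.argv) > 1:
    # Git passes the path of the commit message file as the first argument
    with open(sys.argv[1]) as f:
        problems = check_commit_message(f.read())
    for problem in problems:
        print(problem, file=sys.stderr)
    sys.exit(1 if problems else 0)
```

Saved as `.git/hooks/commit-msg` (and made executable), this rejects commits locally for the same reasons the server-side policy would.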

Team - Level 4

  • All of the above, plus
  • All acceptance criteria have been met
  • All acceptance criteria have an associated passing test (i.e. automated functional testing with Web Tests (Selenium), Coded UI Tests, or Telerik Tests)
  • Tip: Use Azure Test Plans
  • Sending a Done email (with video recording using SnagIt)
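The mapping from acceptance criteria to automated tests can be sketched as below. A real Level 4 suite would drive the UI through Selenium WebDriver or Coded UI; the in-process `register_user` function here is a hypothetical stand-in so the criterion-to-test mapping stays visible:

```python
# Hypothetical stand-in for the feature under test; a real acceptance test
# would exercise the same behaviour through the UI (e.g. Selenium WebDriver).
def register_user(email: str, password: str) -> dict:
    if "@" not in email:
        return {"ok": False, "error": "invalid email"}
    if len(password) < 8:
        return {"ok": False, "error": "password too short"}
    return {"ok": True, "error": None}

# Acceptance criterion 1: a valid email and strong password create an account
def test_valid_registration_succeeds():
    assert register_user("ann@example.com", "s3cretpass")["ok"]

# Acceptance criterion 2: weak passwords are rejected with a reason
def test_short_password_rejected():
    result = register_user("ann@example.com", "abc")
    assert not result["ok"]
    assert result["error"] == "password too short"
```

Each acceptance criterion on the user story gets at least one test like these, so "all criteria met" becomes a green build rather than a manual claim.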

Figure: Organize tests in suites with built-in E2E traceability across requirements, test artifacts and defects

Figure: Use the Microsoft Test Manager client to run tests and capture not just the pass/fail of steps, comments/attachments, and bugs, but also diagnostic data during execution, such as screen recordings, system info, and an image action log

Figure: Explore your web applications, find and submit bugs directly from your Chrome browser – no need for predefined test cases or test steps

Figure: Good example - Done video showing the features worked on

Team - Level 5

  • All of the above, plus
  • Deployed to UAT (ideally using Continuous Deployment)
  • Complex code is documented (removing technical debt)
  • Product Owner acceptance

Team - Level 6

  • All of the above, plus
  • Multiple environments automatically tested using Lab Management

Figure: Good example - A tester uses Lab Management to create VMs for testing the application, then defines a test plan for that application with Test Case Management
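The idea behind Lab Management (running the same checks against every environment automatically) can be sketched as a test matrix. The environment names, URLs, and the smoke-test body below are hypothetical placeholders:

```python
# Hypothetical environment matrix; Lab Management provisions these as VMs
ENVIRONMENTS = {
    "dev": {"base_url": "https://dev.example.com", "db": "SQL Server 2019"},
    "uat": {"base_url": "https://uat.example.com", "db": "SQL Server 2019"},
    "staging": {"base_url": "https://staging.example.com", "db": "SQL Server 2022"},
}

def run_smoke_tests(config: dict) -> dict:
    """Placeholder: real code would deploy to and exercise this environment."""
    # Here we only sanity-check the configuration itself
    passed = config["base_url"].startswith("https://") and bool(config["db"])
    return {"passed": passed}

def test_all_environments():
    # The Definition of Done requires every environment to pass, not just one
    results = {name: run_smoke_tests(cfg) for name, cfg in ENVIRONMENTS.items()}
    assert all(r["passed"] for r in results.values()), results
```

The point is that the matrix is data: adding an environment means adding one entry, not writing a new test run by hand.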

Team - Level 7

  • All of the above, plus
  • Automated Load Testing
  • Continuous Deployment

Figure: Good example - Load testing involves multiple test agents running Web Performance Tests and pounding the application (simulating the behavior of many simultaneous users)
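The load-testing setup described above (many agents simulating simultaneous users) can be sketched in miniature with a thread pool. The `handle_request` target is a hypothetical stand-in for a real HTTP request to the deployed application:

```python
# Miniature sketch of a load test: many simulated users at once, with
# latency percentiles collected. handle_request is a hypothetical stand-in
# for one real HTTP request against the application under test.
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(user_id: int) -> float:
    """Returns the latency of one simulated request, in seconds."""
    start = time.perf_counter()
    sum(i * i for i in range(10_000))  # simulated server-side work
    return time.perf_counter() - start

def run_load_test(simultaneous_users: int = 50, requests_each: int = 4) -> dict:
    with ThreadPoolExecutor(max_workers=simultaneous_users) as pool:
        latencies = list(pool.map(handle_request,
                                  range(simultaneous_users * requests_each)))
    latencies.sort()
    return {
        "requests": len(latencies),
        "p95_seconds": latencies[int(len(latencies) * 0.95)],
        "max_seconds": latencies[-1],
    }

if __name__ == "__main__":
    print(run_load_test())
```

A real run would point dedicated test agents at the application and compare the p95 latency against an agreed target, failing the build when it regresses.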

Team - Level 8 (Gold)

  • All of the above, plus
  • Deployed to Production

Congratulations! You are frequently deploying to production. This is called “Continuous Delivery” and allows you to gather quick feedback from your end users.

You might have everything deployed to production, but it might not yet be visible to the end user. This can be achieved by having “Feature toggles” in place. The actual release of the functionality is then a decision for the Product Owner and the business to make.
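A feature toggle can be as small as a flag checked at the decision point. This sketch uses an in-memory dict as the flag store; a real system would read flags from configuration or a feature-management service:

```python
# Minimal feature-toggle sketch: the code ships to production, but the
# feature stays invisible until the flag is flipped. The dict flag store
# is an assumption; real systems use config or a feature-management service.
FLAGS = {"new-checkout": False}  # deployed, but not yet released

def is_enabled(flag: str) -> bool:
    return FLAGS.get(flag, False)

def checkout_page() -> str:
    if is_enabled("new-checkout"):
        return "new checkout flow"
    return "classic checkout flow"

# Releasing the feature is now a business decision, not a deployment:
# FLAGS["new-checkout"] = True
```

This is what separates deployment (an engineering event) from release (a Product Owner decision).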

Adam Cogan
Peter Gfader
Paul Neumeyer
Damian Brady
