A sample result

Below is a sample result page. It is derived from the replies your team has given in the assessment.

Your results reflect the levels your team believes it has reached. You can use them to discuss where you are and where you want to be, and to identify the kinds of skills you should be looking to hire for.

The results are organised into five major categories, each scored on a maturity scale adapted from the Capability Maturity Model. We're not claiming that this is the one true way to measure the maturity of your teams, but it is one we've tried many times before and found useful. The statements in the questionnaire and the results are abstract enough to apply to any team, regardless of tech stack or industry.

Your Results

Categories: Team Dynamics, Quality Advocacy, Technical Craftsmanship, Product Ownership, Risk Management
Initial
The team lacks consistency and needs training to get all team members aligned.
Just Started
The team has a basic level of agility but the processes are not fully defined. Members with different roles are not yet fully in sync.
Defined
The team applies well-defined processes and consistently delivers each sprint.
Measured
The team is measuring code quality and other key indicators. The primary focus is on engineering/development maturity.
Optimal
The team reliably delivers on its commitments. Team members self-organise, adapt to change, work at a sustainable pace, and practise continuous improvement supported by continuous integration and deployment.
Below are statements that need to be true for the team to advance to the next level.

Team Dynamics

Defined to Measured
  • The team uses metrics such as the number of conflicts, the time it takes to resolve conflicts, and the satisfaction of the parties involved to measure the effectiveness of their conflict resolution process.

  • The team utilises standups to efficiently plan their tasks, setting daily goals and outlining actions to be accomplished.

  • The team embraces a comprehensive team approach. They assess team performance through metrics such as team velocity and collaboration effectiveness, employing feedback to adapt their processes. Regular retrospectives illuminate team effectiveness.

  • Retrospectives drive innovation in the team's processes, facilitated by defined metrics like sprint velocity, defect rate, and customer satisfaction.
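As a rough illustration of the metrics named above, the sketch below computes sprint velocity and defect rate from a few hypothetical sprint records; the field names and numbers are assumptions, not part of the assessment itself.

```python
from statistics import mean

# Hypothetical sprint records: story points completed and defects found per sprint.
sprints = [
    {"points_completed": 21, "defects": 3},
    {"points_completed": 18, "defects": 5},
    {"points_completed": 24, "defects": 2},
]

def sprint_velocity(sprints):
    """Average story points completed per sprint."""
    return mean(s["points_completed"] for s in sprints)

def defect_rate(sprints):
    """Defects found per story point delivered."""
    total_points = sum(s["points_completed"] for s in sprints)
    total_defects = sum(s["defects"] for s in sprints)
    return total_defects / total_points

print(sprint_velocity(sprints))        # 21
print(round(defect_rate(sprints), 3))  # 0.159
```

A team tracking these numbers over several sprints can discuss trends in retrospectives rather than relying on gut feeling.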

Quality Advocacy

Measured to Optimal
  • The team actively experiments with pair programming, test-driven development, and continuous integration, along with practices such as code reviews and refactoring, all while measuring metrics such as defect rate, cycle time, and feature delivery speed to gauge the effects on enhancing internal quality.

  • The team continuously and proactively drives quality improvements, guided by metrics like defect density, Mean Time to Repair (MTTR), and customer satisfaction.

  • The team actively explores innovative methodologies, such as session-based testing or scenario-driven testing, and incorporates concrete metrics like defect discovery rate, exploratory test coverage, and release stability.
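To make two of these metrics concrete, here is a minimal sketch of how MTTR and defect density might be computed; the incident timestamps and code-size figures are invented for illustration.

```python
from datetime import datetime, timedelta

# Hypothetical incidents: (detected, resolved) timestamp pairs.
incidents = [
    (datetime(2024, 1, 3, 9, 0), datetime(2024, 1, 3, 11, 30)),
    (datetime(2024, 1, 10, 14, 0), datetime(2024, 1, 10, 15, 0)),
]

def mttr(incidents) -> timedelta:
    """Mean Time to Repair: average of (resolved - detected)."""
    total = sum((resolved - detected for detected, resolved in incidents), timedelta())
    return total / len(incidents)

def defect_density(defects: int, kloc: float) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defects / kloc

print(mttr(incidents))           # 1:45:00
print(defect_density(12, 48.0))  # 0.25
```

Which metrics matter most depends on the team; the point at this level is that improvements are measured rather than assumed.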

Technical Craftsmanship

Just Started to Defined
  • Test-driven development is embraced for most newly written production code. Essential concepts like mocks and stubs are understood and applied. Automated tests are gradually being extended to cover legacy code.

  • Automated tests have been created to assess non-functional aspects.

  • The team excels at spiking, adhering to time boxes and clear definitions of done. Post-spike reviews guide subsequent steps, and spike code never reaches production.

  • Releases are executed at regular intervals, incorporating semi-automated and automated workflows.

  • Software is crafted through test-driven development, with routine testing and frequent refactoring. Most code is well-structured. Continuous integration is practiced without a dedicated CI server.

  • Consistent pairing sessions are executed seamlessly, marked by fluid and enjoyable teamwork. Occasional pair changes occur within a story.
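The mocks-and-stubs practice mentioned above can be sketched with Python's standard `unittest.mock`. The `PriceConverter` class and its rate source are hypothetical examples, chosen only to show a collaborator being stubbed in a test written first.

```python
from unittest.mock import Mock

class PriceConverter:
    """Hypothetical production class written test-first; the rate source
    is injected so a test can replace it with a stub."""
    def __init__(self, rate_source):
        self.rate_source = rate_source

    def convert(self, amount: float, currency: str) -> float:
        return amount * self.rate_source.get_rate(currency)

def test_convert_uses_stubbed_rate():
    # Stub the collaborator instead of calling a real exchange-rate service.
    source = Mock()
    source.get_rate.return_value = 0.9
    converter = PriceConverter(source)
    assert converter.convert(100.0, "EUR") == 90.0
    source.get_rate.assert_called_once_with("EUR")

test_convert_uses_stubbed_rate()
```

Injecting the dependency keeps the test fast and deterministic, which is what makes TDD sustainable as the codebase grows.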

Product Ownership

Defined to Measured
  • The team captures metrics such as the number of action items completed on time and the percentage of meeting time spent on productive discussion, and uses them to track follow-through on outstanding meeting actions and decisions.

  • The team is actively monitoring the involvement of their stakeholders and experimenting with how they interact with them. They are measuring the effect of this on their metrics, such as the number of defects found in production, the time to market, and the customer satisfaction score.

  • The team is single-sizing stories and has its own criteria for the appropriateness of a story. These criteria may include the quality of the acceptance criteria, the estimated effort, or the risk associated with the story.

  • The team aggressively gathers user feedback, including usability testing with end users. Feedback feeds into the processes for evaluating and prioritising features and for ideating on new ones. The team quantitatively measures the effect of this feedback on the product, using metrics such as customer satisfaction scores and the number of bugs found in production.
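The story-appropriateness criteria described above could be encoded as a simple checklist. The `Story` fields and thresholds below are assumptions for illustration, not part of the assessment's definition.

```python
from dataclasses import dataclass, field

@dataclass
class Story:
    title: str
    acceptance_criteria: list = field(default_factory=list)
    estimated_points: int = 0
    risk: str = "low"  # "low" | "medium" | "high"

def is_ready(story: Story, max_points: int = 3) -> bool:
    """Hypothetical readiness check mirroring the criteria above:
    clear acceptance criteria, a single-size effort cap, acceptable risk."""
    return (
        len(story.acceptance_criteria) > 0
        and story.estimated_points <= max_points
        and story.risk in ("low", "medium")
    )

story = Story("Export report as CSV", ["User can download a CSV file"], 2, "low")
print(is_ready(story))  # True
```

Making the criteria explicit, in whatever form, is what lets a team measure how often stories enter a sprint underprepared.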

Risk Management

Defined to Measured
  • The team is actively monitoring known risks and measuring how effective their risk monitoring process is.