In this article, I will share a Continuous Delivery Assessment, a tool applicable to any organisation seeking to improve how it creates digital products through excellent software delivery practices.
- Digital transformation depends on Continuous Delivery
- What is Continuous Delivery?
- The importance of assessing Continuous Delivery
- The Continuous Delivery Assessment
- The Continuous Delivery Assessment Dimensions
- How to download and use the shared Continuous Delivery assessment spreadsheet
>> Download the Continuous Delivery Assessment Google Sheets spreadsheet
Digital transformation depends on Continuous Delivery
The high speed of Digital transformation requires digital products to be delivered faster and more often.
The ability to get digital product features and enhancements into the hands of customers very quickly is imperative for organisations that want to stay competitive in the digital world.
What is Continuous Delivery?
Continuous Delivery is a concept that was first described in the 2010 book of the same name by Jez Humble and David Farley. Continuous Delivery provides a standard language for the activities of building, testing, and deploying software on the path to production.
“Continuous Delivery is a software development discipline where you build software in such a way that the software can be released to production at any time.” – Martin Fowler
The importance of assessing Continuous Delivery
Every organisation that takes an interest in Continuous Delivery and puts it into practice inevitably undergoes major changes in the way it builds software and makes it available to its customers.
The perception of adopting Continuous Delivery can be extremely subjective. It is therefore very important to make the progress of adopting and improving the practice visible.
The Continuous Delivery Assessment
The Continuous Delivery Assessment is a tool for measuring the improvements achieved by adopting Continuous Delivery. It is a spreadsheet that visually depicts 8 dimensions of maturity for the delivery process:
- Delivery Process
- Visibility
- Organisational Alignment
- Technical Architecture
- Data Storage
- Environments and Deployments
- Configuration Management
- Quality Control
Typically, there is an assessment period for a given product context, with many interviews guided by the questions written in the spreadsheet. The result then becomes visible in a spider chart, as in the example below:
The entire assessment process should be conducted by an interviewer with solid experience in Continuous Delivery concepts and practices.
Typically, the interviewer spends a few days conducting question-and-answer sessions and lots of conversation with all professionals representing the areas involved in the creation of the software product in question.
Based on these notes and the answers obtained during this period, the interviewer rates each dimension on a scale of 0 to 5.
As a result, a graph will be generated, as shown above, for a visual understanding of the current stage in each of the continuous delivery dimensions.
Continuous Delivery for new and experienced organisations
For organisations new to Continuous Delivery practices, the assessment result is useful to help plan investments and process improvements. The results provide the first steps for this journey.
For organisations that already do continuous delivery, the assessment can help identify the parts that need attention, or that have not yet adapted to the processes.
Track each Continuous Delivery dimension score
Use the shared Continuous Delivery Google Sheets spreadsheet with the questions to interview each of the interviewees. In the spreadsheet, you can score the answers for each of the Continuous Delivery dimensions: Delivery Process, Visibility, Organisational Alignment, Technical Architecture, Data Storage, Environments and Deployments, Configuration Management, and Quality Control. Each dimension score is a vertex in the spider chart.
The shared spreadsheet has 9 tabs: one for the result and eight for each of the Continuous Delivery dimensions. You should fill in the tabs for each dimension.
The result tab has a table and the spider chart graph depicting the Continuous Delivery score for each dimension. It is automatically generated.

[Sample result: table and spider chart]
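The spider chart on the result tab can also be reproduced outside the spreadsheet. Below is a minimal sketch using Python and matplotlib; the dimension names match the spreadsheet, but the scores are made-up examples for illustration only.

```python
# Minimal sketch: draw the assessment spider (radar) chart.
# Scores below are hypothetical examples, not real assessment data.
import math
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

dimensions = ["Delivery Process", "Visibility", "Organisational Alignment",
              "Technical Architecture", "Data Storage",
              "Environments and Deployments", "Configuration Management",
              "Quality Control"]
scores = [3, 2, 4, 3, 1, 2, 3, 4]  # one 0-5 rating per dimension

# One angle per dimension; repeat the first point to close the polygon.
angles = [i * 2 * math.pi / len(dimensions) for i in range(len(dimensions))]
ax = plt.subplot(polar=True)
ax.plot(angles + angles[:1], scores + scores[:1])
ax.fill(angles + angles[:1], scores + scores[:1], alpha=0.25)
ax.set_xticks(angles)
ax.set_xticklabels(dimensions, fontsize=7)
ax.set_ylim(0, 5)
plt.savefig("cd_assessment_radar.png")
```

Each of the eight dimensions becomes a vertex of the polygon, so a quick glance shows which dimensions lag behind the others.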
For each Continuous Delivery aspect, there are a few reference notes for levels 0, 3, and 5, with 5 being the best possible result.
These notes guide the conversation. Based on the conversation, you decide the score, from 0 to 5.
Take note of the Continuous Delivery interviews
During conversations, take notes on everything. This is a time to listen and understand the process. In addition to writing down the scores (0–5) for each dimension, the notes will help explain the result of the assessment. Remember that the result is specific to the organisation, team, or product under assessment; it is not a generic result. Your notes should provide that specificity.
Interview everyone in the product context
Seek representatives from all areas involved in delivering the specific product: for example, product managers, designers, developers, operations, and support people.
Be aware of people schedules
Schedule conversations in advance, and be flexible with people's agendas and daily challenges. People fall behind, and priorities can change (especially in large organisations). Leave some interview slots vacant to accommodate these shifting schedules.
Leave space between conversations
Leave space between the scheduled conversations so you have time to reflect and organize your notes. This break between interview slots allows people to be late, arrive earlier, or to extend the conversation when desirable.
Understand the day-to-day activities
Ask questions aimed at understanding the daily work of each person interviewed, and how they handle their activities related to delivering the product in question. Make it clear that you are not looking for culprits, nor interested in hearing complaints about colleagues and areas of the company, but rather in understanding the process.
Structure and present your notes
Once you have gathered data from all interviews, the next step is to structure your notes succinctly. Bullet points and mind maps are good tools for this.
The spider chart with the dimension scores is important, but the notes and the reasoning behind them are even more important. Most likely, besides the spreadsheet, you will also create another artefact with a summary of your structured notes.
Recognise the Continuous Delivery progress so far
You should bring visibility to the actions and initiatives already taking place in the organisation. While there is always room for improvement, the assessment result is an excellent opportunity to reinforce the message and amplify the visibility of the journey so far.
Repeat the Continuous Delivery assessment periodically
It is highly recommended to repeat the Continuous Delivery assessment periodically. If possible, run the same assessment in the same product context with the same interviewees.
For example, a company with an IT department of a hundred people repeats this assessment every quarter.
Another example is a consultancy engagement that runs the assessment at the beginning and repeats it towards the end of the engagement to measure and show progress.
Comparing the results from time to time will show progress and the next actions to be taken. This is essential to validate what has been done, understand the path the organisation has followed so far, and reassess the next steps.
The Continuous Delivery Assessment Dimensions
Please find below each of the 8 dimensions for the Continuous Delivery assessment, as well as the questions to be used for rating each dimension.
Delivery Process
This is the ability to build and test a product release to ensure that individual changes are compatible with the other changes being made, continuously and in sync. This allows the team to manage the pace of their work and to deliver a high-quality, reproducible product on demand.
Reference – Score 0
- Errors caused by incompatible changes are only discovered in production
- There are no guidelines or practices for verifying compatibility between parts of the system.
Reference – Score 3
- Managed and reproducible practice is applied consistently (both manually and automatically), using defined standards and practices, which everyone understands.
- The team is expected to operate in a consistent and predictable manner, but the metrics focus on failure rather than success.
Reference – Score 5
- Each successful build generates a release candidate
- The focus is on making more frequent commits, with increasing confidence in the product’s quality
- CI (continuous integration) creates environments to allow scalability of tests
- Tests are run in parallel or on multiple machines
- Delivery pipeline extends to production
Quality Control
Quality control is the concept of systematically and quickly discovering problems during the product delivery process with more frequent and shorter feedback loops to guarantee quality. Discovering defects earlier in the development cycle is less costly and easier to fix. Problems are not exposed to the customer because they are identified and resolved before reaching production.
Reference – Score 0
- There are no testers
- The product is delivered directly to production untested
- Customer support deals with defects
Reference – Score 3
- Developing and testing are collaborative functions
- Testing is part of everyone’s responsibility on the delivery team
- There is awareness of building quality in, instead of reacting when something is found broken
- Automation is in place, but not comprehensively or sustainably
- Practices are planned to shorten the feedback cycles and move the tests to the beginning of the cycle
Reference – Score 5
- “Immune” production systems detect implementation failures and automatically correct them
- The team has a high awareness of testing practices and selects those that provide the greatest benefit to the product as well as a rapid response to change
- New methodologies, techniques and approaches are explored and applied to improve product quality
- Approximately 100% test coverage
Organisational Alignment
The ability of team members to share ideas, knowledge, and skills, and to work together to improve processes and the product, delivering working software faster and more securely.
Reference – Score 0
- No effort to facilitate open and transparent communication
- Large teams of individuals performing tasks in isolation
- The technical manager is only a nominal position
- Code sections are entirely owned by individuals
Reference – Score 3
- All functional teams are seen as members of the product team and are represented in regular product meetings aimed at improving delivery
- Operations teams provide an advisory service to product teams
- Sharing knowledge among cross-functional groups is not a common and consistent practice
- There is a plan to maintain continuity of team composition between iterations
Reference – Score 5
- The business’s ability to request changes, rather than the team’s capacity to deliver, is the factor that limits what the product team works on
- Companies have rich data based on usage patterns and the ability to release new products to select end-users in production (canary release, A/B testing, etc.)
- All team members are qualified in all technical areas and there is little specialization reliant on single individuals
Visibility
The ability to plan and respond to the Product Owner’s change requests in a way that allows a consistent and predictable pace of work that is also completely transparent to everyone.
Reference – Score 0
- There is no easy tool or way to check what was done, by whom and why
Reference – Score 3
- All product changes can be tracked through a common tool, shared across teams, which includes tracking approvals and test results
- Failures can easily be linked to individual changes early in the lifecycle
Reference – Score 5
- There is complete transparency of what is part of each release
- The PO is able to determine when a version goes into production and is no longer dependent on the team’s ability to deliver
- Control evidence and decisions made can be generated through automated toolsets used by the product team
Configuration management
Ability to track changes made to artifacts that affect the behavior of a system and manage multiple contributions to a single artifact. This includes source code, libraries, configuration files, tests, environment descriptions, dependent libraries, database structure, supporting documentation, and anything else related to product delivery.
Reference – Score 0
- Changes are made by multiple team members simultaneously, without any effort to maintain versions or track who made the change and when
- Where version control exists, it is typically used by individuals to organise their own activities
- There is no way for a team to revert changes to a previous working version
Reference – Score 3
- The items needed to configure all environments are identified
- A single set of tools for product configuration management has been determined and there is an effort to move all delivery artifacts to the version control system
- Test scripts, libraries, and dependencies are managed
- All team members commit frequently and regularly
Reference – Score 5
- There is a single flow of contribution in the project for everyone involved, which is constantly validated through deliverables
- The team changes practices and adjusts artifacts frequently as the product evolves
- New version control tools are evaluated and implemented to meet the evolving needs of product delivery practices
- Development is all done on the main branch and the team is able to deliver and add new features without resorting to long-lived branches
Environments and Deployments
This represents the availability of suitable environments for development and testing to ensure that the product will function as expected in production. The ability to go into production with minimal work and no disruption to operations and end users.
Reference – Score 0
- There are no separate environments for development and testing
- Development environments are overcrowded
- Firewalls and network configurations are often blockers to development
Reference – Score 3
- Automated provisioning with scripted deployments
- There is still reliance on individual skills to ensure the deployment will work in production
- Test environments are readily available and reproducible with manual work and coordination across operational teams
Reference – Score 5
- There is pipeline capacity for continuous delivery
- Environments are easily replicated on-demand as needed using a self-service model to facilitate the optimal feedback cycle
- Provisioning and configuration of the environment is fully automated, preferably using a cloud-based system
- Environments are regularly reviewed to simplify and optimize the effective use of technology
- Provisioning is scalable to meet fluctuating demands
Data Storage
This is the ability for everyone to follow the changes made to the database structure for every release, and the effects these changes bring. It includes the ability to roll forward or roll back all changes by version. Database changes must be scripted alongside other deployment artifacts, and reusable test data must be created for all environments.
Reference – Score 0
- The development team is unaware of the process and functioning of the database
- Complete control of data and database is carried out by an external team
Reference – Score 3
- Database changes are performed automatically as part of the delivery process
- Datasets are defined for different purposes in the delivery process. For example: development environment, integration, user acceptance testing (UAT), load, and performance
- All datasets are scripted and included in the delivery pipeline
Reference – Score 5
- Automatic feedback loops for database performance and delivery are present in the process and are used to drive improvements
- Testing is performed using isolated datasets that are well sized for the purpose of testing (pre-production environments use smaller datasets)
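Scripted, versioned database changes (score 3 and above) can be illustrated with a minimal migration runner: scripts are applied in order and recorded, so any environment can be rolled forward to the same schema state. The script names and schema below are made up; real tools such as Flyway, Liquibase, or Alembic add checksums, locking, and rollback support.

```python
# Sketch: apply versioned database changes in order and record them,
# so every environment can be rolled forward to the same schema state.
# Script names and tables are illustrative only.
import sqlite3

migrations = [
    ("001_create_orders", "CREATE TABLE orders (id INTEGER PRIMARY KEY)"),
    ("002_add_status", "ALTER TABLE orders ADD COLUMN status TEXT"),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS schema_version (name TEXT PRIMARY KEY)")

for name, sql in migrations:
    already_run = conn.execute(
        "SELECT 1 FROM schema_version WHERE name = ?", (name,)).fetchone()
    if already_run is None:  # roll forward only the pending changes
        conn.execute(sql)
        conn.execute("INSERT INTO schema_version VALUES (?)", (name,))

applied = [row[0] for row in
           conn.execute("SELECT name FROM schema_version ORDER BY name")]
print(applied)  # → ['001_create_orders', '002_add_status']
```

Because the runner skips scripts that have already been applied, the same set of migration files can be run safely against development, testing, and production databases.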
Technical Architecture
Holistic thinking about technical decisions and their effects on the business’s ability to change. Architecture is expensive and difficult to change once the technology is in use. The concept refers to the mechanisms in place for making architectural decisions, as well as decisions around shared resources. Components are modelled around business capabilities to reduce costs and risks and to facilitate change.
Reference – Score 0
- Technical decisions are made on the spot and there is no long-term vision or technical planning
- Use of high-level components and business logic in parts that are difficult to test
- Dependencies are not managed and not fully understood
- Dependencies are not around business capabilities
- Dependency versioning is poorly managed
- Code does not have automated testing and the architecture does not allow for automated unit testing
- Architecture only allows big bang releases
Reference – Score 3
- There are open lines of communication between architects and product teams
- Product teams participate in design decisions
- Communication between the team and the architect is informal and frequent
- Teams use modern practices to isolate features/services to gain ability to deliver independently
- Teams are mature enough to decide when to refactor and architect the system to support new features or dependencies
- There is a versioning strategy for dependencies and how to handle changes that break the API design, but it is not always followed
Reference – Score 5
- Mechanisms exist to enable the product team to make architecture and technology decisions, because the technical vision is clear and transparent
- Architects are fully engaged with the business to enable business innovation
- The architecture allows performance to be measured and tuned
- The architecture can scale in a variety of ways, depending on the business needs
- The team is able to make changes in the build process, pipeline, and delivery stages during the architecture lifecycle, to optimize both the whole and local areas
- API design versioning strategies are well understood and managed by all teams
How to download and use the shared Continuous Delivery assessment spreadsheet
1. Click on the shared spreadsheet link (this should open a view-only shared Google Sheets document)
2. Make a copy (or download it as an Excel file)
3. Find the questions on each Continuous Delivery (CD) dimension tab
4. Write the result for each dimension (if this is the average of several interviews, enter the averaged result)
5. View the result table and the spider chart on the result tab
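When several people are interviewed for the same product context, one simple way to combine their answers is to average the scores per dimension before entering them in the result tab. A small sketch of that step, with made-up scores and only three of the eight dimensions shown:

```python
# Sketch: average several interviewees' 0-5 scores per dimension.
# The scores below are made up for illustration.
from statistics import mean

interviews = {
    "Delivery Process": [3, 2, 3],
    "Visibility": [4, 4, 3],
    "Quality Control": [2, 1, 2],
}

result = {dim: round(mean(scores), 1) for dim, scores in interviews.items()}
print(result)
# → {'Delivery Process': 2.7, 'Visibility': 3.7, 'Quality Control': 1.7}
```

The rounded averages are the values to enter per dimension; keep the individual scores in your notes, since a wide spread between interviewees is itself a useful finding.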
Article history: this article is based on a few articles and blog posts originally written in Portuguese. Over the years (since 2011), I have been involved in many assessments for Continuous Delivery and DevOps. The provided spreadsheet is a simplification of many other assessment styles I have seen or used before.
Lately, I have been using assessments as part of inceptions for transformational projects, hence the need to explain the tool in more detail.