Judging Project Progress

One of the biggest issues I see with projects is an overly optimistic assessment of progress.

Typically progress is judged by counting deliverables as part of an earned value calculation.  Unfortunately deliverables are only one indication of progress, and not a very reliable one.  All too often, as a project nears completion, a green project judged 90% complete turns amber or red as integration testing or early adopter deployments start to confront reality, and the project slides back to 70% complete.

I’m going to drill down into two other techniques that I favour as a complement to earned value reporting for IT projects.

The first technique is rarely used, but in my view it is universally relevant and very powerful: assess progress by asking key stakeholders to write down statements that they expect to be true at the milestone being assessed.  The stakeholders then determine their confidence that each statement is true by interviewing the development team, reviewing documents and testing software.  An example should make this clear.

Consider a team developing a new multi-tenant virtual machine infrastructure with automated provisioning, monitoring, usage tracking and billing.  Different stakeholders will have different statements.  After the milestone review the sales team might say:

  1. We understand the target market for the service with 100% confidence
  2. We understand how we will differentiate the service with 50% confidence
  3. We understand how the sales process will work with 100% confidence
  4. We understand the cost and pricing model with 30% confidence

The operations team might say:

  1. We understand how we will deploy the service with 100% confidence
  2. We understand how we will test the service with 79% confidence
  3. We understand how we will operate the service with 60% confidence
  4. We understand how we will on-board new customers onto the service with 50% confidence

A refinement of the above would be to make the statements as follows:

We understand how we will test the service with 79% confidence; we expected to be 90% confident at this point in the project.

I’ve shown the above in text, but in reality it would be shown as a horizontal stacked bar chart.  Note that each stakeholder should have supporting evidence for their views, which they would discuss with the development team.  The development team then knows where it needs to focus to build the confidence of its stakeholders.
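For anyone who wants to prototype that chart, here is a minimal sketch in Python with matplotlib.  The statements and confidence figures are the operations team’s numbers from the example above; the “expected” targets are hypothetical, added to illustrate the refinement where reported confidence is compared with expectation.

```python
# A minimal sketch of the confidence chart described above.
import matplotlib.pyplot as plt

statements = [
    "Deploy the service",
    "Test the service",
    "Operate the service",
    "On-board new customers",
]
reported = [100, 79, 60, 50]   # confidence reported at the milestone (%)
expected = [100, 90, 80, 70]   # hypothetical expected confidence (%)

fig, ax = plt.subplots(figsize=(8, 3))
y = range(len(statements))
# Draw the expected level as a pale background bar and the reported
# level as a solid bar on top, so shortfalls stand out at a glance.
ax.barh(y, expected, color="lightgrey", label="Expected at this milestone")
ax.barh(y, reported, height=0.5, color="steelblue", label="Reported confidence")
ax.set_yticks(list(y))
ax.set_yticklabels(statements)
ax.invert_yaxis()   # first statement at the top
ax.set_xlim(0, 100)
ax.set_xlabel("Confidence (%)")
ax.legend(loc="lower right")
plt.tight_layout()
plt.show()
```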

Both of these techniques complement traditional progress tracking; in my view they each provide a way to make it much more robust.  There are some particular benefits to the stakeholder assessments:

  1. They allow progress to be visualised in business terms
  2. They allow stakeholders to clearly communicate upwards and to the development team
  3. At each milestone review progress can be tracked visually; sometimes confidence will reduce as development progresses and the stakeholders learn more
  4. Assessing confidence levels requires conversations between the reviewers and the development teams; these conversations are much more valuable than passing hundreds of review comments and responses back and forth
  5. Different stakeholders get an appreciation of what’s important to each other and of their respective view of progress
  6. Discussion between stakeholders is very likely.  For example, one group might believe they have a good understanding of the cost of the service, but another might disagree; exploring why might be very revealing

The second technique, which is gaining considerable support, is to structure a project into units of work that can be completed and tested independently of each other.  Ideally these deliverables provide useful value in their own right; if they can be used by the project team or customers, all the better.  An example will help.  Consider the same hosting service described above (a small sketch of the resulting stage ordering follows the list):

  1. Start by building just the virtual machine hosting layer
  2. The team would then use the hosting layer to develop and test the provisioning service
  3. Once the provisioning service was running they would use that to provision the test VMs for the monitoring service, which would monitor the provisioning service and the hosting service
  4. Once the monitoring service was running they would use that to test the reporting service
  5. Once the reporting service was running they would use that to test the billing service
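To make the ordering explicit, here is a minimal sketch of the staged plan as a dependency graph in Python.  The stage names follow the example above; the graph itself is my reading of the dependencies, not a tool the team would necessarily use.

```python
# A sketch of the staged plan as a dependency graph.  Each stage lists the
# earlier deliverables it is built on (and therefore exercises and tests).
from graphlib import TopologicalSorter  # Python 3.9+

stages = {
    "hosting": set(),                           # built first, stands alone
    "provisioning": {"hosting"},                # developed and tested on the hosting layer
    "monitoring": {"provisioning", "hosting"},  # monitors both earlier services
    "reporting": {"monitoring"},
    "billing": {"reporting"},
}

# static_order() yields a valid build/test order and fails fast if anyone
# accidentally introduces a circular dependency between stages.
print(list(TopologicalSorter(stages).static_order()))
# e.g. ['hosting', 'provisioning', 'monitoring', 'reporting', 'billing']
```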

The deliverables from each stage are used in creating the next stage; because they are being used, they are being tested.  The hosting layer doesn’t need to be complete before it’s used to develop the provisioning service, it just needs enough capability to be used by risk-tolerant developers.  As new capabilities are added daily or weekly they get tested through use.

Using the process described above, integration of new capabilities happens continuously, and the service is tested both formally (by independent testers) and informally through usage.  Because the service being developed is in continuous use, the developers have SOME real-world experience on which to base their assessment of progress.  I stress SOME because this is no substitute for formal testing, which can address issues like stress testing better than day-to-day usage by a small development team can.
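As a concrete illustration of testing through use, a daily smoke test might provision a throwaway VM and confirm that monitoring picks it up.  This is only a sketch: the client objects and their methods (create_vm, metrics_for and so on) are hypothetical stand-ins for whatever APIs the real services would expose.

```python
# A hypothetical smoke test exercising the provisioning and monitoring
# services the same way the development team uses them day to day.
import uuid

def smoke_test_provision_and_monitor(provisioning, monitoring):
    """Provision a throwaway VM, confirm monitoring sees it, then clean up."""
    vm_name = f"smoke-{uuid.uuid4().hex[:8]}"
    vm = provisioning.create_vm(name=vm_name, size="small")  # hypothetical API
    try:
        assert vm.status == "running", "provisioning should bring the VM up"
        # The monitoring service should discover the new VM automatically.
        metrics = monitoring.metrics_for(vm.id)               # hypothetical API
        assert metrics is not None, "monitoring should see the provisioned VM"
    finally:
        provisioning.delete_vm(vm.id)  # leave the shared environment clean
```

Run routinely, a check like this keeps the informal testing-through-use honest, while the formal test programme still covers what day-to-day usage cannot, such as stress testing.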

