How Statistical Forecasting Can Help You Trust Your Data and Drive Business Agility

Piotr Leszczynski gave a talk at AgileByExample last year on the benefits of using statistical forecasting methods for software teams. Leszczynski noted that the most obvious benefit of statistical forecasting is better estimates, but he focused on the less obvious and probably more significant benefits for business agility. These include:

  • Improved collaboration and understanding between technology teams and product / business stakeholders
  • Improved data collection and business awareness of the value of data
  • Improved planning and prioritisation processes
  • Improved trust of data

Leszczynski says one of the most important gains of statistical forecasting was a

"mindset shift across the product group from people talking about linear regression and using averages to guesstimate, to people talking about flow efficiency, percentiles, distributions and understanding what they are discussing not just repeating an agile coaches' words. This then spread."

A lot has been said about why statistical forecasting trumps estimating. The #noestimates topic has brought significant debate and awareness to the subject. For much more detail on #noestimates, you can read Vasco Duarte's book. Duarte and other thought leaders such as Woody Zuill have begun to question the efficacy, and even the purpose, of using estimates to predict a project's cost and timeline.

Leszczynski and others such as Troy Magennis emphasise the benefit of using statistical forecasting to get better estimates instead of doing no estimates.

Leszczynski demonstrates that probabilistic forecasting using the Monte Carlo method is a straightforward approach that can be applied in many circumstances (a minimal sketch of such a forecast follows the list below). He provides a series of common issues to look out for when preparing forecasts, including:

  • KPI bias
  • People forgetting to enter data
  • Split rates of items
  • The size of the items
  • Fungibility of teams
  • Types of items in your delivery
  • Handling scope creep
  • Ratio of technical improvements
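
The core of the Monte Carlo approach Leszczynski and Magennis describe is small enough to sketch. The Python below is a minimal, illustrative sketch rather than the model from the talk: it assumes you have a history of weekly throughput (completed items per week) from your tracker, repeatedly samples that history until a backlog is exhausted, and reports percentiles rather than a single date. The throughput figures, backlog size and percentile levels are invented for illustration.

import random

def forecast_weeks(weekly_throughput, backlog_size, runs=10_000):
    """Simulate how many weeks it takes to finish backlog_size items by
    sampling past weekly throughput with replacement (assumes at least
    one sample is greater than zero)."""
    results = []
    for _ in range(runs):
        remaining, weeks = backlog_size, 0
        while remaining > 0:
            remaining -= random.choice(weekly_throughput)  # replay a random past week
            weeks += 1
        results.append(weeks)
    return sorted(results)

# Illustrative inputs: the last 12 weeks of completed items and 80 items
# left on the roadmap slice being forecast.
history = [3, 5, 2, 6, 4, 4, 7, 3, 5, 4, 6, 2]
outcomes = forecast_weeks(history, backlog_size=80)

# Report a range of confidence levels instead of a single-point estimate.
for pct in (50, 85, 95):
    weeks = outcomes[int(len(outcomes) * pct / 100) - 1]
    print(f"{pct}% of simulated runs finished within {weeks} weeks")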

He says that

"We applied the Monte Carlo method to our product roadmap and got some answers for a three-year prediction timeframe. We needed to make assumptions and decide how we are going to validate them; we needed to adjust our data so it fitted the models we wanted to use. We weren't happy with what the data told us, but we decided to trust it more than we wanted to trust our optimistic gut. We came up with a series of experiments that can bring the reality closer to the business expectations of our delivery. We are almost a year into that and a couple of reforecasts happened. We learnt a lot."

If you are doing some form of agile development already, Leszczynski demonstrates how this forecasting would be quite simple to adopt. If you want more on this topic, there are extensive resources on Troy Magennis's website including detailed run-throughs of many forms of probabilistic forecasting and the tools to help you do this for yourself quickly. Magennis says, "Any proposed forecasting method just has to be better than what you do now, or at least less expensive with a similar result".
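
Even the small sketch above can absorb some of the issues in Leszczynski's list. One common adjustment, shown here purely as an illustrative assumption rather than anything prescribed in the talk, is to inflate the backlog by an expected split rate and a scope-creep allowance before simulating:

# Continuing the sketch above (reuses forecast_weeks and history).
# The rates are illustrative assumptions, not figures from the talk.
split_rate = 1.3   # each backlog item tends to split into ~1.3 delivered items
scope_creep = 1.1  # roughly 10% of additional work arrives during delivery

adjusted_backlog = round(80 * split_rate * scope_creep)
adjusted = forecast_weeks(history, backlog_size=adjusted_backlog)
p85 = adjusted[int(len(adjusted) * 85 / 100) - 1]
print(f"With splits and scope creep, 85% of simulated runs finish within {p85} weeks")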

Key Improvements to Business Agility

Expanding on the list of benefits above, Leszczynski explains how each is achieved:

  • Improved collaboration and understanding between technology teams and stakeholders
    Statistical forecasting provides a traceable model that business stakeholders can follow, rather than an obscure guess they cannot understand. This builds their trust in the model; they can see which levers will improve the predictions, and they become more involved with the technical teams in improving the process.

  • Improved data collection and business awareness of the value of data
    Once everyone can see that the quality of the data has a direct effect on the quality of the predictions for delivery and sales timelines, it creates motivation to provide better data. Validating assumptions, such as the number of technical improvements and the predicted rate of customer requests, becomes important.

  • Improved planning and prioritisation processes
    The use of models created clarity about the cost of prioritisation and how much it affected the forecasts produced. This drove the business to pay far more attention to the prioritisation of features, especially between different business units.

  • Improved trust of data
    Leszczynski's team made a point of involving stakeholders in negotiating the assumptions that would go into the forecasting model. This allowed stakeholders to see how the model generates answers to their questions "Why does it take so long?" and "How can I shorten the time to get my item?". The team gave business people guidance on using the simulation model themselves, and having extremely cheap re-forecasting removed the fear and recriminations around estimation. Now people were involved in continuous improvement at a business level.
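
The "extremely cheap re-forecasting" Leszczynski mentions is visible in the sketch earlier in the article: a re-forecast is simply the same simulation run again on refreshed inputs, so it costs seconds rather than another round of estimation meetings. Continuing the illustrative example, with invented numbers:

# Re-forecast with the latest tracker export (reuses forecast_weeks from above);
# the numbers are illustrative assumptions.
latest_history = [4, 5, 3, 6, 5, 4, 7, 4, 5, 5, 6, 3]  # refreshed weekly throughput
remaining_items = 52                                    # items still outstanding

reforecast = forecast_weeks(latest_history, backlog_size=remaining_items)
p85 = reforecast[int(len(reforecast) * 85 / 100) - 1]
print(f"85% of simulated runs now finish within {p85} weeks")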
