The right time for decision making
If you have ever questioned your decision-making process, you might want to see the recent post by Jim Bird, CTO at BIDS Trading Technologies Ltd. On his blog, “Building Real Software”, Bird writes about the tension between what he calls Agile and Lean decision making. He points out the differences between these two approaches and notes the controversy surrounding this topic in the community.
Citing number four on the “7 Key Principles of Lean Software Development” list by Kelly Waters, Bird walks through the idea and main benefits of just-in-time, informed decision making. He argues that leaving detailed design decisions and the resolution of dependencies to later stages of the project allows teams to gather more relevant and up-to-date information, and ultimately “(...) this means that you should be able to make better quality decisions”. Bird identifies two kinds of situations in which deferred decision making is especially relevant: when the team does not yet know enough about the problem it is trying to solve, and when the decision concerns a well-contained part of the system that can be defined well enough for the team to know that it needs to be done and that it can be done. In both cases, focusing on other problems first prevents waste and leads to less work for the developers.
In the second part of his post, Bird explains that there are “(...) decisions that you need to make early on, while there is still time to learn and while there is still time to change your mind and start again if you have to.” Using examples from Mike Cohn’s book, “Agile Estimating and Planning”, he focuses mainly on cross-cutting concerns, such as internationalization, data handling or monitoring. According to Bird, the risk of getting these aspects wrong from the start of the project is high enough to justify the waste generated by an early decision-making process.
Agree with pretty much everything...
When other influential factors in the system (or indeed the business or team) are better determined, that alone may reduce the cone of uncertainty that often surrounds planning work, and, as has been indicated, this can indeed be a result of cross-cutting concerns in a system.
For example, if a team is new, they have no idea how to work together, i.e., the forming and storming stages. Moving through to the norming and performing stages puts the team in sync and hence removes a variable from the uncertainty as they become more predictable (thereby reducing the curve). In the meantime, lower-risk productivity work, or cross-cutting concerns, can be created and delivered; these are often associated with the common systems architectures alluded to in methods such as TOGAF, or with cross-cutting concerns at a system level.
However, if decisions cannot be made on the risky elements of the system and the uncertainty is still 80%, while you have only a low (<10%) chance of hitting the deadline, then the sensitivity is high (to pluck numbers out of thin air). Note, the cone of uncertainty is the risk here and the time left is the sensitivity.
As a very abstract conjecture, you could say that if:
risk * sensitivity = 'risk utility'
...and risk reduces as some function of time (or preferably, of 'decisions', which are a function of productive time, whether that is coding, analysing, pre/prototyping, role playing or interacting), whilst sensitivity increases linearly (as we were all led to believe in university/college, in terms of the cost of fixing software as a linear function of time), then there comes a point where this 'risk utility' (aka cost) is minimised.