Dave Snowden: Agile - sound practice, poor theory

by Shane Hastie on Sep 02, 2012 |

Last night the Auckland, New Zealand, Agile Professionals Network chapter hosted a talk by Dave Snowden titled "Agile: good practice, poor theory".

Snowden's theme for the talk was that we "have to find a better way to communicate between Agile in IT and corporate strategy". Asking executive management to become Agile won't work, because "executives don't want to hear about a manifesto produced in a ski resort".

He maintains that our current approach to requirements gathering (agile or not) is fundamentally broken, because it is tainted by cognitive bias: we fixate on the first one or two ideas we come across in the requirements identification process. He presented an alternative approach based on Distributed Cognition, achieved by monitoring how people use existing systems and providing a mechanism for them to give feedback on what they like, what they don't like and ideas they may have. He emphasized that this is not crowd-sourcing; it is a technique that uses computers to augment human intelligence rather than attempting to replace it.

He ran a brief experiment with the audience to determine how observant they are; only 8 people out of the 80 or so in the audience noticed the unusual aspect of the experiment. Following the experiment he explained that most western-culture people actually observe only 3-5% of what they see; the remaining 95+% is filled in by the brain based on pattern matching. (Interestingly, people in Asian cultures observe approximately 10%, most likely because of their pictographic written languages.) One problem with this pattern-matching activity is that we tend to use "first-fit" rather than "best-fit" matching, which makes us very prone to quick decisions that may be flawed.

He went on to talk about how human systems are not mechanistic and predictive but rather an ecology: complex adaptive systems with many intersecting influences. Management and control techniques designed for predictive, causative domains are actually dangerous when applied in complex adaptive ecosystems.

When working in complex, adaptive systems we need to operate in three separate modes which he calls See, Attend, Act. For each mode we need different processes, and ways of acting. Trying to use the same process across modes will result in suboptimal outcomes.

To illustrate these concepts he discussed behavior in the technology purchasing marketplace over the last few years. Given the choice, most people will select a variety of tools/apps that do different things rather than a single monolithic tool that tries to do everything. He contrasted Microsoft Word with the app space, where people have a number of apps on their devices to do separate tasks easily with simple interfaces. He is a keen hiker and talked about how he has 15 apps on his smartphone that can be used when hiking (from GPS tracking to a pub finder). His premise is that picking from a variety of apps is easier because we associate each app with a single function or small group of functions. The choice of which app to start is based on what we want to achieve at that point in time; having chosen the app, it is cognitively easier to select the task we want it to do than to navigate a complex menu structure in a monolithic tool.

He continued to talk about innovation and EXAPTATION - taking an existing tool developed for one function and repurposing it for another use. He maintains that exaptation produces radical change and is the source of most innovation in society today. He calls the process of exaptation "managed serendipity" and cites Apple Computer as an example of an organisation that does exaptation well: most of Apple's innovations are not new ideas but new uses for existing ideas.

To round off his talk he introduced the Cynefin Framework, which he developed. It holds that activities in an organisation occur in one of five domains:

  • Simple, in which the relationship between cause and effect is obvious to all, the approach is to Sense - Categorise - Respond and we can apply best practice.
  • Complicated, in which the relationship between cause and effect requires analysis or some other form of investigation and/or the application of expert knowledge, the approach is to Sense - Analyze - Respond and we can apply good practice.
  • Complex, in which the relationship between cause and effect can only be perceived in retrospect, but not in advance, the approach is to Probe - Sense - Respond and we can sense emergent practice.
  • Chaotic, in which there is no relationship between cause and effect at systems level, the approach is to Act - Sense - Respond and we can discover novel practice.
  • The fifth domain is Disorder, which is the state of not knowing what type of causality exists, in which state people will revert to their own comfort zone in making a decision.
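The domain-to-approach mapping above can be sketched as a small lookup table. This is purely my own illustration (the names and function are hypothetical, not part of Snowden's material):

```python
# Cynefin domains mapped to (decision approach, type of practice),
# as listed above. Illustrative sketch only.
CYNEFIN = {
    "Simple":      ("Sense - Categorise - Respond", "best practice"),
    "Complicated": ("Sense - Analyze - Respond",    "good practice"),
    "Complex":     ("Probe - Sense - Respond",      "emergent practice"),
    "Chaotic":     ("Act - Sense - Respond",        "novel practice"),
    "Disorder":    (None, None),  # causality unknown; no prescribed approach
}

def approach(domain):
    """Return the (decision approach, practice type) for a Cynefin domain."""
    return CYNEFIN[domain]
```

The point of the table is that the approach is a function of the domain: applying, say, the Complicated row's "good practice" in the Complex domain is exactly the mistake Snowden warns about below.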

He said that many practitioners confuse the Complicated and Complex domains. Complicated environments can be addressed using good practices, whereas Complex ones require experimentation: the patterns which work in complicated spaces are unlikely to work in the complex space, and may well be harmful.

He emphasized the importance of this experimentation approach, and said that the conditions for experimenting are very important to successfully finding the best approach in Complex environments:

  • run a number of fast, inexpensive experiments that are truly safe to fail
  • some of those experiments MUST fail, otherwise there is no learning
  • some of the experiments must be Oblique - not trying to solve the problem directly but looking for opportunities for exaptation

Unless these conditions are met the experiments are worthless in terms of actually finding the innovative solutions to organisational problems/taking advantage of opportunities.

Software development is fundamentally a Complex environment: needs cannot be fully understood in advance, and experimentation is a must. Agile practices are very appropriate in complex environments, as the emergent nature of business needs and evolutionary development fit the experimentation approach.

He feels that Agile methods are on the right path, but many practitioners don't understand why they work, which creates a tendency to slip into dogma - defining the rules prescriptively rather than allowing the experimentation needed in complex environments. When an Agile method becomes a set of rigid rules to be followed rather than a set of guidelines to be adapted, a sub-optimal solution is the likely result.

Without a sound understanding of complexity theory, Agile practitioners (he mentioned Agile Coaches in particular) are likely to impose dogma and rules which are appropriate for complicated environments on areas that are actually complex and need experimentation and adaptation.

Organisational systems move in a continual cycle between complicated and complex, and it is important to know when you are working in a complicated domain (use proven good practices) and when in a complex one (experiment and exapt, which might result in new ways of working).

His approach is deliberately controversial and he makes no apology for attacking the "sacred cows" of the Agile movement. He was particularly scathing about the Scrum devotion to a single process and the role of the ScrumMaster.

His advice to agile practitioners and teams is to study complexity theory so you can recognize when to switch modes and stop applying the rules of complicated systems to complex environments.

[Note: This piece was edited in response to feedback on 8 September]


Great post but a couple of points by Dave Snowden

It's a good summary. Two quick points: (i) Cynefin has five domains - disorder is important - not four quadrants, and (ii) I was not attacking SCRUM per se, rather the idea that "master" was legitimately applied after two days of training and completing a multi-choice questionnaire. SCRUM is an established AGILE method and it needs to value itself more before assigning those sorts of terms.

Re: Great post but a couple of points by Shane Hastie

Thank you for the clarification Dave.

Sky resort? by Renee Troughton

Shane was that a misquote? "sky resort" not "ski resort"?

Dave - are you able to post here some links for more detail about recognising and application so that readers can find out more?

Wish I had heard the talk in person by Dave Nicolette

It sounds like a great talk with considerable deep content. Sorry I missed it!

One small comment, and I understand it's nothing more than a personal peeve of mine, but I really don't like this usage of the word "fail." When we run experiments with the purpose of learning something, all outcomes that result in learning are successful outcomes. In this context, "fail" only means "outcome was not as we predicted." Who do we mere humans think we are, that we define success as "consistent with our predictions?"

When people say they (a) succeed 85% of the time, and (b) learn from their failures, I interpret it to mean that they learn only 15% of the time. Being mostly an empty vessel myself, I prefer to learn more frequently than that. As always, of course, YMMV.

Re: Sky resort? by Shane Hastie

Hi Rene - yes it was a typo. I can't even blame autocorrect for that one :-(
I have updated the article.

Re: Wish I had heard the talk in person by Ethar Alali

I am not all that enamoured with the idea that you only learn when you are failing. It is perfectly possible to learn whilst 'succeeding', but I would argue that it depends on the environment (in line with the environments defined here).

Re: Great post but a couple of points by Ethar Alali

Personally, FWIW, I think this is an excellent article and brings to mind a few comparisons with other fields where causal links cannot be easily determined.

Consider the use of statistical experimentation in medical disciplines or drug research to determine the 'success' or 'failure' of a method. In particular, experiments built around a null hypothesis yield a small p-value when the observed results would be unlikely if the null hypothesis were correct. Those sorts of experiments don't aim to find much more than a statistical link between the converse of the null hypothesis and the effects observed during experimentation, which is often carried out with randomization and double-blinding of both the researcher and the participant.
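The style of null-hypothesis experiment described here can be sketched with a simple two-sample permutation test (my own illustration in Python; the function name and data are hypothetical): it estimates the probability of seeing a difference at least as large as the observed one if there were no real effect.

```python
import random

def permutation_test(treatment, control, n_iter=10_000, seed=0):
    """Estimate the p-value for the null hypothesis that the treatment
    and control samples come from the same distribution, by shuffling
    the pooled data and counting how often a difference in means at
    least as extreme as the observed one arises by chance."""
    rng = random.Random(seed)
    observed = sum(treatment) / len(treatment) - sum(control) / len(control)
    pooled = list(treatment) + list(control)
    n = len(treatment)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = sum(pooled[:n]) / n - sum(pooled[n:]) / (len(pooled) - n)
        if abs(diff) >= abs(observed):
            hits += 1
    return hits / n_iter
```

A small p-value here says only that the observed effect is unlikely under the null hypothesis - exactly the weaker, statistical form of causal evidence contrasted with the 'harder science' experiments below.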

Contrast that with 'harder science' experiments used in subjects such as physics and engineering. These start with a theory which is determined by equations developed from first principles or from other equations and this may be carried out on models of the end result, fragments of material, a prototype end product or some other manifestation. These results are then compared with the predicted results, maybe with an expectation of statistical error if the tests are repeated more than once.

These are inherently two different ways of experimentation, taking different positions in the aforementioned domains.

It will be really interesting to see what would happen if developers/analysts truly understood complexity theory. It could well be a game changer. However, I would settle for analysts/devs having a repertoire of techniques to choose from for different situations. Also, in software, we have used some of the techniques before (for example evaluating systems from other vendors to determine what people liked and disliked) as it gives a more concrete example to the customer to play with (which I would argue is something that role-playing, CRC and even BDD can help tease out).

It is not lost on me that to suggest these be used is 'exaptation' too :-) But joking aside, this sort of success is always the way it is going to be. We humans think and work in patterns (which in this case is where the brain fills in the gaps) and we can truly see value in things we can map across to the patterns we know already. The learning curve is shorter and we gain more consistency and predictability out of something (which are key pillars of HCI and indeed UX work). Hence I would assume the better we can exapt, the more the product is likely to succeed.

It's ALL Cognition by Dan Mezick

I actively follow the work of David Snowden and find his work useful and valuable. The Agile community is served well by Snowden calling attention to the mechanics of cognition and the essential topic of cognitive biases. There is not nearly enough attention paid to the role of cognition in Agile, and Snowden is a man to pay attention to. We are exploring how best to utilize Snowden's techniques with our clients in the Boston area.

Dan Mezick

Re: Sky resort? by Dave Snowden

Try the four points method for Cynefin creation on our web site

Re: Wish I had heard the talk in person by Dave Snowden

The point I made is that we learn more from failure than from success. It does not preclude the latter. However, there is an evolutionary reason for the fact that most water-cooler gossip is negative: avoidance of failure is a more successful evolutionary strategy than imitation of success.
