
Road Mapping Your Way to Agile Fluency

Kelsey van Haaster will give a talk at 1st Conference about how to develop a road map to agile fluency for teams and organisations.

1st Conference is a one-day conference aimed at people starting out with agile, to be held on Monday, February 15 in Melbourne, Australia. InfoQ will be covering the conference.

InfoQ interviewed Kelsey van Haaster about possible ways to do an agile fluency assessment, examples of findings and improvement opportunities that came out of the assessments, lessons she learned, and advice for readers who want to use the agile fluency model in their organisation.

InfoQ: Can you briefly explain the agile fluency model?

Kelsey van Haaster: The Agile Fluency model was first proposed in 2012 by Diana Larsen and James Shore, both of whom have published a number of great articles and talks that explain how the model works, including Agile Fluency: Finding Agile That’s Fit-for-Purpose. In brief, the Agile Fluency model categorises the observable characteristics and behaviours of Agile teams against a star model, ranging from one star, which describes a team that is still learning to work together and think as a team rather than as individuals, through to four stars, which describes a team (and often an organisation) whose whole environment is focused on the delivery of technology and services.

InfoQ: What are the possible ways to do an agile fluency assessment?

Kelsey van Haaster: The model is not designed as a tool through which a team can measure its performance against an arbitrary standard, or compare itself to other teams. The model itself does not specify or mandate any particular use case; it is a framework for understanding and supports a wide range of different approaches. Whilst the developers of the Agile Fluency model acknowledge that teams do generally progress through the levels of fluency, they make the point that achieving three or four stars should not necessarily be the goal. Rather, the goal is for teams to understand and identify the level of fluency that makes sense in their particular context. Once a team has identified their desired level, the model can assist them to articulate and plan the investments (time, money and focus) they need to make to achieve it.

At ThoughtWorks we have combined the visual concept underlying the ThoughtWorks Tech Radar with the Agile Fluency model to support the production of a customisable, context-specific development road map for a particular team. Teams identify the specific practices, processes and concepts that are indicators of a particular level of agile fluency and assess themselves against these using a traffic light system: red indicates that the team is not doing the practice or faces blockers, yellow indicates a practice the team wants to focus on, and green indicates that the team is comfortable with the practice. This provides the team with a visualisation of where they are at a particular point in time and indicates where the team wants to develop. Revisiting the radar allows teams to track their progress over time.
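To make the shape of such a self-assessment concrete, the following is a minimal illustrative sketch in Python, not ThoughtWorks' actual tooling. The practice names, team name, star levels and statuses are hypothetical examples; the point is simply that each practice maps to a fluency level and a traffic-light status, and that repeating the assessment over time yields comparable snapshots.

from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class Status(Enum):
    """Traffic-light self-assessment for a single practice."""
    RED = "not doing / blocked"
    YELLOW = "current focus"
    GREEN = "comfortable"


@dataclass
class PracticeAssessment:
    practice: str      # e.g. "continuous integration" (hypothetical example)
    star_level: int    # fluency level (1-4) this practice is an indicator of
    status: Status


@dataclass
class TeamRadar:
    """One team's snapshot; revisiting the radar produces a new snapshot."""
    team: str
    assessed_on: date
    assessments: list[PracticeAssessment] = field(default_factory=list)

    def focus_areas(self) -> list[str]:
        """Practices the team has chosen to work on next (yellow)."""
        return [a.practice for a in self.assessments if a.status is Status.YELLOW]

    def blocked(self) -> list[str]:
        """Practices not happening or facing organisational blockers (red)."""
        return [a.practice for a in self.assessments if a.status is Status.RED]


# Hypothetical usage: a team assessing a few one- and two-star practices.
radar = TeamRadar(
    team="Payments",
    assessed_on=date(2016, 2, 1),
    assessments=[
        PracticeAssessment("iteration planning", 1, Status.GREEN),
        PracticeAssessment("continuous integration", 2, Status.YELLOW),
        PracticeAssessment("whole-team ownership of stories", 2, Status.RED),
    ],
)
print("Focus next:", radar.focus_areas())
print("Blocked:", radar.blocked())

Comparing the lists of red and yellow items across snapshots, or across teams, is what makes the progress tracking and cross-team sharing described above possible.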

InfoQ: Can you give some examples of findings and improvement opportunities that came out of the assessments that you have done?

Kelsey van Haaster: Working with multiple teams within a department or organisation is particularly valuable; each team is unique and has its own strengths and weaknesses. Key findings were that sharing the team visualisations allows a team to identify where another team has strengths in a particular practice or process and can potentially provide advice and support. It is also useful to see which processes and practices are identified as learning opportunities or challenges by many teams. This can inform decisions about group training that would be of high value, as well as calling out organisational blockers that can then be addressed by management teams.

Having a team sit together and discuss their practices was valuable; in some instances this was the first time teams had done this. The process helped the teams to generate a shared understanding of what a particular process or practice looked like.

InfoQ: What did you learn in the assessments? Do you have suggestions on how the agile fluency model can be improved based on this?

Kelsey van Haaster: We learned a great deal through this exercise. In particular, you need to be very clear in explaining that the model is not a measure of maturity against an external standard; otherwise teams will want to make things green in order to achieve a particular star rating. It’s also very important to stress the idea of context and explain how it influences the right level of fluency for a particular team at a particular time.

The fluency model itself is very flexible, and whilst it does provide core metrics through which to identify whether a particular level of fluency has been achieved, the addition of some finer-grained metrics would be helpful. For teams new to agile software development, a coach or some level of expertise is needed to help teams identify and understand which practices and processes are relevant for their context.

InfoQ: Do you have any advice for InfoQ readers who want to use the agile fluency model in their organisation?

Kelsey van Haaster: One of the ways organisations can use the model is to indicate the amount of investment needed to support team development to a particular level. This investment can take the form of time, money, coaching and willingness to accept changes in productivity. Organisations using the model need to understand this and take it into consideration when deciding which level of fluency is right for them. Sometimes the right level of fluency may be a function of how much investment you can afford and what you can achieve based on that investment.
