Interview and Book Review: Specification by Example

Gojko Adzic has written a book titled Specification by Example in which he provides advice and guidelines on adopting specification by example as a way to create living documentation on a software development project. Specification by Example is a set of techniques for describing the functional and behavioural aspects of a computer system in a way that is useful to the development team (ideally expressed as executable tests), understandable by non-technical stakeholders, and maintainable enough to remain relevant despite changing customer demands.

The book is rich with examples and advice, drawing on over 50 case studies that examine how different teams and organisations have applied the approach, with greater and lesser degrees of success. The author doesn’t gloss over the challenges involved in bringing in these approaches, and a number of the case studies look at failure modes and patterns, with advice on how to avoid them.

Along with the case studies and examples, he identifies common challenges and issues that can face teams bringing in Specification by Example, and offers concrete advice on how to tackle them.

While many of the examples he cites and the case studies he describes are of organisations and teams adopting Agile techniques, he makes the point that these techniques are neither restricted to, nor dependent on, using an Agile approach.

The book talks about the value of test automation and gives advice on implementing automation, but does not delve deeply into the available tools and how they can be configured. The accompanying website does provide resources, including books, links to open source and proprietary tools, links to articles, video tutorials and presentations.

On his website Adzic describes the key ideas of the book as follows:

Specification by Example is a set of process patterns that facilitate change in software products to ensure that the right product is delivered efficiently. When I say the right product, I mean software that delivers the required business effect or fulfills a business goal set by the customers or business users and it is flexible enough to be able to receive future improvements with a relatively flat cost of change.

He describes a multi-step process for deriving the specification and turning it into a “living document”:

  • Deriving Scope From Goals
  • Illustrating using examples
  • Refining the specification
  • Automating validation without changing specifications
  • Validating frequently
  • Evolving a documentation system

He says:

An automated specification with examples, still in a human-readable form and easily accessible to all team members, becomes an executable specification.

Living documentation is the end-product of Specification by Example. To create a living documentation system, many teams have ended up designing a domain specific language for specifications and tests.
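
To make the idea of an executable specification concrete, here is a minimal sketch in plain Python. The discount rule, its thresholds and the example values are all hypothetical, invented purely for illustration; real teams would typically use a specification tool rather than a hand-rolled validator, but the shape is the same: a human-readable table of key examples that is run automatically against the system.

```python
def discount(order_total, loyalty_years):
    """System under specification (a hypothetical discount rule)."""
    if loyalty_years >= 5:
        return round(order_total * 0.10, 2)
    if order_total >= 100:
        return round(order_total * 0.05, 2)
    return 0.0

# The specification: concrete examples a business user can read and review.
EXAMPLES = [
    # (order_total, loyalty_years, expected_discount)
    (100.00, 0, 5.00),   # large orders earn 5%
    (100.00, 5, 10.00),  # loyal customers earn 10% instead
    (50.00,  0, 0.00),   # small orders from new customers: no discount
]

def validate():
    """Run every example against the system; fail loudly on any mismatch."""
    for total, years, expected in EXAMPLES:
        actual = discount(total, years)
        assert actual == expected, f"({total}, {years}): got {actual}, want {expected}"
    return len(EXAMPLES)

if __name__ == "__main__":
    print(f"{validate()} examples validated")
```

Because the examples table doubles as documentation, a change to the business rule that breaks an example is caught on the next validation run, which is what keeps the document "living".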

The publishers have made a sample chapter available to InfoQ readers – it can be found here.

He is also running an "A/B test" and offering a poster to assist with the test.

Recently the author spoke to Shane Hastie from InfoQ:

InfoQ: Please tell us a bit about yourself, and what inspired you to write Specification by Example?

Gojko: I'm a consultant, mostly working with ambitious teams who want to tighten up the quality of their software products and processes in iterative delivery models. Specification by Example is kind of a natural continuation of my focus on agile acceptance testing and behaviour driven development over the last five or six years. There were two sparks that caused me to write a new book. One was Tom Gilb's presentation at Agile Testing Days 2009, where he claimed that agile doesn't work and that there is no data to prove that it does, with people mostly writing and presenting about their gut feel. The other was an InfoQ article published in 2009 that asked whether anyone really does automated acceptance testing in agile processes, or whether that is a purely theoretical proposition. At that time I had already seen the process succeed fantastically in several environments and I couldn't really understand such doubt and negativity about it, so I decided to collect real stories, not gut feel, and show the world that there are people out there, quite a few in fact, who have made it work and got big benefits out of it.

InfoQ: What is the problem you are trying to solve with this book?

Gojko: There are a lot of misconceptions about specification by example, acceptance test driven development and behaviour driven development in the community. Five years ago I was mostly meeting people at conferences who had never heard of those ideas. Now most people I meet have heard about them, but so few have succeeded in actually implementing them in their process. Most of the failures are caused by common misunderstandings. In the book I present seven key patterns that successful teams have in their processes, and lots of tips and things to avoid from the experience of about 50 projects in different contexts. I hope that this knowledge helps teams spot the typical hurdles, get over them and improve their delivery processes.

InfoQ: The concepts and ideas in the book are frequently referred to as Acceptance Test Driven Development (ATDD) or Behavior Driven Development (BDD) - why did you coin the term Specification by Example rather than using those?

Gojko: It's the one with the least amount of negative baggage. I had a big problem while writing the book of actually presenting the knowledge from lots of different teams as a consistent text, and I've realised that as a community we've adopted a completely wrong language for the whole thing. We call things by technical names that confuse business users, so any attempts to get them involved fail. We also have very little consistency in how artifacts of the process are named, which creates further confusion. I then decided that my goal with the book would be to promote a set of names that create the right mental images in people's heads and that allow teams to avoid common pitfalls and misunderstandings. One key issue is focusing too much on the testing aspect of the process, and considering the whole thing just a testing activity. This is why I did not want to use ATDD. BDD is a whole methodology that still isn't that precisely defined. Depending on who you talk to, it might or might not include pull-based work models, outside-in design and things like that. Working with examples as specifications of features and automating tests based on those examples is a key pillar of BDD, but it might not be all of it. We'll probably have to wait for Dan North to write a book on BDD to actually nail it down. I wanted this book to be about a very concrete set of practices that can be used in lots of different methodologies, as teams I interviewed claimed to use XP, Scrum, Kanban and some other things.

InfoQ: Who is the book aimed at?

Gojko: Anyone serious about delivering high-quality software. I tried very hard to make it non-technical and understandable to many different roles, as high quality in processes and products comes from a holistic view and collaboration - no single group can deliver that on its own.

Product people, business analysts, testers, developers and project managers should all be able to benefit equally from it.

InfoQ: You make the point that you are not getting into the details of any particular tool, but surely implementing your ideas requires a thorough understanding of tools to be effective. What advice can you give readers about the tools?

Gojko: Tools automate processes - they make a process run faster. If you have a process that hurts, automating it only makes it hurt more frequently. I made the mistake of thinking that a tool would save us early on, and we failed miserably. This actually got me started on writing my first book, when I realised that collaboration was the missing link in what we did. Many teams I interviewed for Specification by Example made the same mistake, and it seems that this is a common problem from the discussions I have with people at conferences. My advice to readers is to fix the process first, then automate it to run smoother. Once the right process is in place, a good tool will make it fly.

InfoQ: You use the term "living documentation" to describe the outcome of your approach - how is this different from either traditional documentation or lightweight Agile documentation using story cards?

Gojko: Story cards aren't really intended to be kept for a long period of time. They are useful for short-term prioritisation and planning, but when you need to understand what your system does six months after implementing a story, the card isn't going to help much. Traditional documentation gets out of date very quickly. Having programming language code as the only reliable source of truth on functionality creates information bottlenecks and black holes. This is where the long-term effects of well-written specifications with examples really come in. As the validation of these specifications is automated through acceptance tests and they run frequently, we can trust that the system does what these tests specify or, from the other side, that these documents still say what the system does. Well-written specifications with examples will be easy to read, access and understand, so they help us remove information bottlenecks.

For many large organisations I work with, not having reliable documentation is not just a problem for the software delivery team; it genuinely hurts the business. Living documentation is a way for the IT team to provide additional business value by creating and maintaining business process documentation that is reliable and easy to maintain.

InfoQ: What advice do you have for teams looking to implement this approach - what do they need to do to make sure they are ready to make the change?

Gojko: Everything is contextual and each team needs to understand what their problems are, and then use the ideas from the book as inspiration to address those issues. A good strategy is to get the whole team to agree on their #1 problem with delivering high quality, sort that out, then move on to the next problem. This is a really successful improvement strategy as it creates a shared goal, reduces resistance, and gives management a compelling argument to support the change (hey, the team is solving their top issue).

InfoQ: What are the biggest changes that teams and organizations need to make to adopt this approach?

Gojko: Again, this is contextual. Often the biggest change is cultural: moving from imaginary walls, over which tasks are thrown and responsibilities handed over, to a more collaborative, holistic approach to delivering software. Specification by Example requires tight collaboration between different roles and also supports teams in making the transition to a more collaborative environment.

InfoQ: What are some common mistakes that you have seen, and how do teams avoid them?

Gojko: Focusing on a tool is certainly a big mistake that people commonly make; it doesn't improve collaboration and just creates more problems. Another common pitfall is approaching the process from a purely testing perspective, where the artifacts become oversaturated with data and a combinatorial explosion of cases, making them useless as communication tools. The third common issue is getting the design of specifications wrong, using technical language and scripts to describe how something is tested instead of what the system is supposed to do.

This leads to maintenance headaches later.

To avoid these mistakes teams have to keep their eye on the prize and focus on collaboration and improving communication, evolving a common language and using it consistently in all the artifacts.

InfoQ: What advice can you give to help teams sustain the change, keeping the "living" aspect of the documentation alive?

Gojko: Teams have to recognise that the project language evolves over time, as their understanding of the domain evolves and as the business opportunities change. This has an impact on how things are structured and explained in specifications with examples, and in the living documentation system. To get the long-term benefits teams have to invest in keeping the living documentation system consistent. This is a much wider topic, involving domain driven design and ubiquitous language, and how using it consistently supports symmetric change - a small change in business functionality will be represented by a small change in software and documentation. If we allow these models to drift apart, then software quickly becomes legacy and at some point people just give up and decide to rewrite it from scratch. But when the models are kept in line and consistent, we can avoid this legacy trap. So the living documentation system is really alerting us to such problems earlier, and I think teams need to understand that keeping the documentation alive makes them keep the underlying software alive as well.

InfoQ: How have some organizations and teams benefited from adopting Specification by Example?

Gojko: Generally the benefits fall into four categories: higher product quality, better alignment of analysis/development/testing activities in iterations, implementing changes more effectively and having a lot less rework to push things out. This means faster time to market and better quality. Examples of that are a team that had no serious bugs in production for years although they release every two weeks, a team that cleaned up a horrible legacy system enough to stop using bug tracking tools, and a team that cut their time to market from six months to four days on average. And the key thing here is that this is not rocket science or some kind of dark art - it is achievable and reproducible if people put in the effort and approach it from the right perspective.

Blogger Craig Smith was one of the early reviewers of the book, and he wrote about it on his blog. He states:

Overall, this book is a definite must read for any teams (particularly agile teams) who are trying to balance or find a decent approach to specifications and testing. It is a good balance of patterns and real case studies on how testing and specifications should be approached in an agile world. It would make my list of Top 5 must read testing books and Top 10 must read agile books. And now I know what the proper name is for the cat's eyes that are embedded in the freeway!

He presents the following list of the key points he took from the book:

  • a people problem, not a technical one
  • building the product right and building the right product are two very different things, we need both to be successful
  • living documents – fundamental – a source of information about system functionality that is as reliable as the programming language code but much easier to access and understand
  • allows easier management of product backlogs
  • proceed with specifications only when the team is ready to start implementing an item, e.g. at the start of an iteration
  • derive scope from goals – business communicate the intent and team suggest a solution
  • verbose descriptions over-constrain the system – how something should be done rather than just what is to be done
  • traditional validation – we risk introducing problems if things get lost in translation between the business specification and technical automation
  • an automated specification with examples, still in a human readable form and easily accessible to all team members, becomes an executable specification
  • tests are specifications, specifications are tests
  • consider living documentation as a separate product with different customers and stakeholders
  • may find that Specification By Example means that UAT is no longer needed
  • changing the process – push Specification By Example as part of a culture change, focus on improving quality, start with functional test automation, introduce a new tool, use TDD as a stepping stone
  • changing the culture – avoid agile terminology, management support, Specification By Example a better way to do UAT, don’t make automation the end goal, don’t focus on a tool, leave one person behind to migrate legacy scripts (batman), track who is/isn’t running automated tests, hire someone who has done it before, bring in a consultant, introduce training
  • dealing with signoff and traceability – keep specifications in a version control system, get signoff of living documentation, get signoff on scope not specifications, get signoff on slimmed down use cases, introduce use case realisations
  • warning signs – watch out for tests that change frequently, boomerangs, test slippage, just in case code and shotgun surgery
  • F16 – asked to be built for speed but real problem was to escape enemy combat – still very successful 30+ years later
  • scope implies solutions – work out the goals and collaboratively work out the scope to meet goals
  • people tell you what they think they need, and by asking them ‘why’ you can identify new implicit goals they have
  • understanding why something is needed, and who needs it, is crucial to evaluating a suggested solution.
  • discuss, prioritise and estimate at goals level for better understanding and reduced effort
  • outside-in design – start with the outputs of the system and investigate why they are needed and how the software can provide them (comes from BDD)
  • one approach is to get developers to write the “I want” part of the storycard
  • when you don’t have control of scope – ask how something is useful, ask for an alternative solution, don’t only look at lowest level, deliver complete features
  • collaboration is valuable – big all team workshops, smaller workshops (three amigos), developers and analysts pairing on tests, developers review tests, informal conversations
  • business analysts are part of the delivery team, not customer representatives
  • right level of detail is picking up a card and saying ‘I’m not quite sure’, it pushes you to have a conversation
  • collaboration – hold introductory meetings, involve stakeholders, work ahead to prepare, developers and testers review stories, prepare only basic examples, overprescribing hinders discussion
  • one of the best ways to check if the requirements are complete is to try to design black-box test cases against them. If we don’t have enough information to design good test cases, we definitely don’t have enough information to build the system.
  • feature examples should be precise (no yes/no answers, use concrete examples), realistic (use real data, get realistic examples from customers), complete (experiment with data combinations, check for alternate ways to test) and easy to understand (don’t explore every combination, look for implied concepts)
  • whenever you see too many examples or very complicated examples in a specification, try to raise the level of abstraction for those descriptions
  • illustrate non-functional requirements – get precise performance requirements, use low-fi prototypes for UI, use the QUPER model, use a checklist for discussions, build a reference example for things that are hard to quantify (such as fun) to compare against
  • good specifications – should be precise and testable, not written as a script, not written as a flow
  • watch out for descriptions of how the system should work, think about what the system should do
  • specifications should not be about software design – not tightly coupled with code, work around technical difficulties, trapped in user interface details
  • specifications should be self explanatory – descriptive title and short paragraph of the goal, understood by others, not over-specified, start basic and then expanded
  • specifications should be focussed – use given-when-then, don’t explicitly detail all the dependencies, put defaults at the technical layer but don’t rely on them
  • define and use a ubiquitous language
  • starting with automation – try a small sample project, plan upfront, don’t postpone or delegate, avoid automating existing manual scripts, gain trust with UI tests
  • managing test automation – don’t treat as second-grade code, describe validation, don’t replicate business logic, automate along system boundaries, don’t check business logic through the UI
  • automating user interfaces – specify interaction at a higher level (logging rather than filling out the login page), check UI functionality with UI specifications, avoid record and playback, setup context in a database
  • test data management – avoid using pre-populated data, use pre-populated reference data, pull prototypes from the database
  • Botts’ dots are the lane markers on the roads that alert you when you move out of your lane; continuous integration has that function in software, run it with Specification By Example and you have continuous validation
  • reducing unreliability – find most annoying thing and fix it, identify unstable tests, setup dedicated validation environment, automated deployment, test doubles for external systems, multi-stage validation, execute tests in transactions, run quick checks for reference data, wait for events not elapsed time, make asynchronous processing optional, don’t use specification as an end to end validation
  • faster feedback – introduce business time, break long tests into smaller modules, avoid in-memory databases for testing, separate quick and slow tests, keep overnight tests stable, create a current iteration pack, parallelise test runs
  • managing failing tests – sometimes you can’t fix tests – create a known regression failures pack, automatically check disabled tests
  • easy to understand documentation – avoid long specifications, avoid lots of small specifications for a single feature, look for higher level concepts, avoid technical automation concepts
  • consistent documentation – evolve a ubiquitous language, use personas, collaborate on defining language, document building blocks
  • organize for easy access – by stories, functional areas, UI navigation routes, business processes, use tags instead of URLs
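
Several of the points above (given-when-then, precise and concrete examples, describing what the system should do rather than how it is tested) can be illustrated with a small sketch. The shopping-cart rule and its threshold below are hypothetical, and plain Python stands in for whatever specification tool a team might actually use:

```python
class Cart:
    """Hypothetical system under specification."""

    def __init__(self):
        self.items = []

    def add(self, price):
        self.items.append(price)

    @property
    def shipping(self):
        # Invented business rule: orders of 50.00 or more ship free.
        return 0.00 if sum(self.items) >= 50.00 else 4.99

def test_free_shipping_over_threshold():
    # Given a cart containing items worth 60.00
    cart = Cart()
    cart.add(60.00)
    # When shipping is calculated
    cost = cart.shipping
    # Then shipping is free
    assert cost == 0.00

def test_standard_shipping_under_threshold():
    # Given a cart containing items worth 20.00
    cart = Cart()
    cart.add(20.00)
    # When shipping is calculated, then the standard rate applies
    assert cart.shipping == 4.99
```

Note that the examples state concrete values and outcomes ("60.00 ships free") rather than scripting UI steps or restating the implementation, which is what keeps them readable as a specification and stable as the system evolves.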

About the book author

Gojko Adzic is a strategic software delivery consultant who works with ambitious teams to improve the quality of their software products and processes. He specialises in agile and lean quality improvement, in particular agile testing, specification by example and behaviour driven development. Gojko is a frequent speaker at leading software development and testing conferences and runs the UK agile testing user group. Over the last eleven years, he has worked as a developer, architect, technical director and consultant on projects delivering financial and energy trading platforms, mobile positioning and e-commerce applications, online gaming and complex configuration management systems.

He is the author of Specification by Example, Bridging the Communication Gap, Test Driven .NET Development with FitNesse and The Secret Ninja Cucumber Scrolls.
