
Remote Ensemble Testing - How an Experiment Shaped the Way We Work


Key Takeaways

  • Remote ensemble testing can act as an enabler for a whole-team approach, where quality is a shared responsibility rather than the task of a single role.
  • Applying ensemble testing in a remote context requires a different setup than in an on-site setting, as the team can only connect and collaborate through the provided software and equipment. Experimenting with existing tools like conferencing software, remote access, and test session recorders helped us find the best setup for our remote ensemble testing sessions.
  • The ensemble approach is a framework, so we adapted it to our needs as a remote testing ensemble. For example, the number of participants in our sessions changed depending on the testing context and the members’ availability: we have run sessions with three people, ten, or anything in between. In collaboration, there is no one-size-fits-all solution. Openness and an experimental mindset helped us find the team's best fit for various contexts.
  • “The team is the right one” - this statement holds for our ensemble. Everyone is welcome to join us regardless of their profession, level of technology confidence, or previous exposure to testing, because everyone has skills, knowledge, and experience to share. This is how professional diversity becomes reality.
  • Running remote ensemble testing sessions regularly influences our culture, how we interact with each other, our skills, and the software we create. For us, the benefits of this approach include better relationships, connection to colleagues in times of physical distancing, and a decrease in post-production issues raised by customers.

In this article I will share how an experiment evolved into a common practice at the workplace. Through an experimental approach with remote ensemble testing, I tried to get my teammates on our cross-functional team more involved in the testing activities of our jointly created product. It all started during a global pandemic, when the entire team was working from home. I will highlight some variations of the approach, share what we achieved by applying remote ensemble testing, and show how we shaped the framework so that it works for us as a remote ensemble.

When you’re the only tester in the team

As a tester in a cross-functional team, surrounded mainly by developers, I felt misunderstood and challenged by common misconceptions about testing: about what testing is and what it is not, or at least what non-professional testers think it is and is not.

For some of my developer colleagues, I was the first full-time testing professional they had ever worked with. Comments like “Testing is on your paycheck. Not mine…” weren’t unusual.

So I tried explaining the idea of testing, invited colleagues to pair up, and advocated for the whole-team approach. I tried different ways of explaining and selling the concept of testing. For example, I invited everyone on the team to accompany me for a few hours of testing, in order to create more visibility around the testing I was doing.

Running shadowing sessions with my colleagues was nice; some shared that they now better understood my worries about missing requirements. However, just a couple of weeks later, any empathy was displaced by the distraction of the daily work routine and keeping up with the release cycle.

All the experiments had one thing in common: the short-term feedback was positive, but they didn’t make a long-lasting impact. The experiments simply weren’t sustainable. Considering those failed experiments and the experience I gained from them, I wanted to try another approach that might serve the team and its culture better.

Let’s try out ensemble testing

So I started all over again and did some research, which led me to the idea of mob programming.

“All the brilliant people working on the same thing, at the same time, in the same space, and on the same computer.” (Woody Zuill)

The original definition was meant to cover the idea of mob programming; however, it can also serve as a definition for mob testing or, more generally, as a way of describing a collaborative way of working and problem solving.

Today mob testing is also referred to as ensemble testing, and that is the wording I am using here.

Reading about how a team works collaboratively in one room, on a single task, on one computer, at the same time attracted me; it basically put the whole-team approach into practice, a matter I had advocated for some time without any success.

The idea of collaborating, communicating, and solving a task as a team sounded like fun to me. I worked mostly on my own at that time, and it could feel lonely once in a while, especially since working from home was the new normal.

I became curious and wanted to try it out. Although I wasn’t sure if ensemble testing would be a useful approach for us, I wanted to take the chance. So I braced myself for another experiment.

Our experiments

In fact, it was not just one experiment. It started as a one-time session, but with momentum it became the starting point of a series of experiments that helped shape the remote ensemble testing framework we have today.

We experimented in various ways, and there is something I find very interesting about ensemble testing: the composition of the ensemble can change significantly. Depending on the scope of the testing session, we had ensembles of three up to around ten people, from pure tester/developer ensembles to truly cross-functional ensembles, even including salespeople.

We played around with the tech setup, from screen sharing through remote control to a dedicated remote-access computer, using a wide range of tools to find the best fit.

And over time we almost naturally developed our own kind of testing flow. At the beginning I stuck to a list that named everyone in the session, ticking off names and ensuring everyone got their share of screen time either driving or navigating. But as we became more practiced, we developed our own unique flow of passing the torch.

During all those experiments, I tried to listen carefully to my peers when they shared their view of the sessions, and I actively requested feedback and ideas. That feedback and those experiences helped shape the remote ensemble testing framework we use today and made it our own.

For example, based on the feedback, the test charters became more precise testing missions that create a story around the software artifact for the session. Another change is that planned sessions are announced in our location-wide Slack channel, giving everyone interested the chance to sign up.

How remote ensemble testing looks

We started our ensemble testing experiment in the middle of the global Covid pandemic. That meant we worked from home, encapsulated in our personal spaces. There was no thought of meeting in a room for a testing experiment. So we started remotely, and to this day we haven’t done it together in one room with one shared computer, keyboard, and mouse.

While it may have seemed a challenge for the team, I like to believe it was a chance for us to learn how to collaborate in a remote environment.

Certainly, there are differences compared to an on-site ensemble testing approach, where just a single working set up is needed.

When working remotely, the team is much more dependent on working technology and equipment. Therefore, I will outline our technology setup in a bit more depth.

Firstly, a video conferencing tool is needed to mimic the one-room scenario. Secondly, in order to have a rotating driver, the ensemble either uses some form of remote access to a single working setup, or everyone creates a local setup and shares their screen. The latter, in our case, would mean having the on-premise installation ready on every participant's computer and handing over files and settings. Since that approach did not appear manageable for everyone in the ensemble, we played around with various other options, like a single driver, or remote access either to an ensemble member's computer or to a dedicated computer for the testing session. Sharing the access and rotating the driver role felt best in our case.

Further, a recording tool was added to the minimum set of required tools as the means of documenting the testing sessions. In general, the embedded recorder of a conferencing system works well. In the beginning we used a screen recorder to log the steps, in order to support reproducing the interaction with the software under test. But the capturing did not work very well: screens weren’t recorded, and the flows were not understandable. Now the sessions are recorded on video, and those recordings can be used for debriefing sessions or as a source for filing bug tickets.

Having a good-quality headset or speaker/microphone is also a nice quality booster for remote ensemble testing sessions. Being clearly understood is one core pillar of a successful session.

Despite the tech challenges, I realized very soon that practicing ensemble testing requires a certain amount of discipline: as a driver, not to get sidetracked and wander off on your own path, and as a navigator, to be clear and precise in verbalizing thoughts, ideas, and testing attempts. It was a challenge for me and my colleagues, but I feel it was also very good live training in communication and collaboration.

Our usual setup is based on a test charter; our team’s mission, so to speak.

The navigator picks the test charter, then creates and shares a testing idea. The driver shares the screen and acts on the navigator's requests. The remaining ensemble members - I call them the spectators - watch, observe, learn, and assist when the navigator gets stuck. Whenever the navigator's task is fulfilled, the next person in the ensemble starts sharing their testing idea.

When multiple people are coming up with ideas at the same time, we put them in a queue and explore them one after another, so that everyone has a chance to express their testing idea and navigate the driver through the interaction.

Our rotation system deviates from a timer-based rotation. The navigator role changes when an idea has been explored, while the driver might stay in their role for around 15 to 20 minutes. Usually the driver will ask for a swap, or I, in the role of facilitator, will offer a rotation once in a while. We do it that way because that is what best supports our flow.
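To make the flow concrete, here is a toy simulation of that rotation scheme. The names, the list of testing ideas, and the `ideas_per_driver` stand-in for the 15-to-20-minute driver turn are all hypothetical; our real sessions are run by people and conversation, not a script.

```python
from collections import deque

def run_session(members, ideas, ideas_per_driver=3):
    """Toy simulation of our rotation: the navigator changes whenever an
    idea has been explored, while the driver swaps only after several ideas
    (standing in for the 15-20 minutes a driver keeps the keyboard)."""
    idea_queue = deque(ideas)    # ideas raised at the same time wait in a queue
    rotation = deque(members)
    driver = rotation.popleft()  # first member starts at the keyboard
    log, explored = [], 0
    while idea_queue:
        navigator = rotation[0]
        idea = idea_queue.popleft()
        log.append(f"{navigator} navigates '{idea}' while {driver} drives")
        rotation.rotate(-1)      # next person's idea is up once this one is explored
        explored += 1
        if explored % ideas_per_driver == 0:  # time for a driver swap
            rotation.append(driver)
            driver = rotation.popleft()
    return log

session = run_session(
    ["Ana", "Ben", "Cleo"],
    ["boundary values", "empty input", "unicode names", "slow network"],
)
# first entry: "Ben navigates 'boundary values' while Ana drives"
```

The point of the sketch is the two different rotation speeds: the navigator queue turns over with every idea, while the driver only changes on the slower swap cadence.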

I also noticed that some colleagues, especially first-time joiners, like to stay a bit longer in the spectator role, observing and learning what is going on, before they feel confident enough to take a turn. To me it is very important to give them the time they need to adjust to the situation and the team, as that allows everyone in the ensemble to feel comfortable sharing their ideas; the majority of them are not professional testers, and they act very much outside their comfort zone.

The benefits of ensemble testing

Overall, I feel ensemble testing has helped the team to understand each other a bit better. We got to know each other from a different angle. Personally, it changed some of the perceptions I had about my peers.

It challenged us on our ability to collaborate and communicate by giving us constant feedback throughout the session. If the driver did not react or did something unexpected, the navigator learned that the instructions were not clear enough. On the other hand, the driver was called to order if they went their own way without communicating it. Sometimes it was tough, but I’d say it was worth it. 

And in general it elevated the conversation about testing. It is hard to describe, but I felt something of a shift, not a sudden one, but subtle and evolving. In the last year, since we started doing ensemble testing sessions, I’ve had more conversations about testing with non-testers than in all the other years put together.

Looking at the business side, the number of issues or bugs raised by end-users after shipping decreased for those modules and features where we used the (remote) ensemble testing approach, compared to artifacts where it was not used. I remember one particular feature where the ensemble and pairing approaches were used throughout the entire process: from collaborative user story writing, UI and UX design, to pair programming, ensemble testing, and creating the test design for the regression tests together as a team. The customer feedback was great! That was by far the most interesting, challenging, and fun ensemble team I have had the fortune to be part of.

My learnings

By watching my developer colleagues test the software, I learned some interesting approaches I could add to my tester’s toolbox. For example, I might approach something through the user interface while a developer would use a command-line tool. They shared some of their shortcuts. I, in return, could share my knowledge about a particular feature and how to use it, while the support colleague in the ensemble could contribute first-hand user feedback.

I had aha moments when I saw someone from another department using the software for the first time in their life. How the software I had tested for years appears to someone who has never used it before is quite an interesting experience, and it recalibrated my own perception of its intended use.

And I learned that even though I might be the only tester in the session, that does not mean I have to test alone or have to do all the testing. 

Your turn - how to get started with (remote) ensemble testing

Just do it! OK, to be a bit more serious, I would suggest framing it as an experiment: a method the team can try and then decide whether they like it or not.

When I ran the first session, I created a checklist for the running order, from preparation, to hosting the session, to wrapping it up. I had it placed on my desk to remind me of the next steps. Please feel free to use the Remote Ensemble Testing Cheat Sheet that contains this checklist.

Further, I’d recommend having a dry run of the technical setup with a person you trust, to ensure that screen sharing, remote control, access to required tools, and everything else works. Personally, running those kinds of checks in advance calms my nerves, and good preparation helps if picky people are around.

Running the first session with a smaller ensemble could also be a good way of experimenting and learning. Giving everyone the opportunity to fail and learn from it is important to me. Most of the changes and improvements in our ensemble sessions happened because of feedback and retrospection.

An alternative way to experience ensemble testing, outside of the corporate context, is to get in touch with the testing community and have a first run with someone experienced in the field. With this in mind, I would like to wish everyone interesting experiments.

About the Author

Andrea Jensen is currently a software quality engineer in the maritime industry by day, and a reader, crafter, curious learner and RiskStorming Online advocate by night. You can find her on LinkedIn or Twitter.
