

Lanette Creamer on Exploratory Testing and Technical Testers


In this podcast recorded at Agile 2019, Shane Hastie, Lead Editor for Culture & Methods, spoke to Lanette Creamer about the need for technical skills by testers and the importance of exploratory testing.

Key Takeaways

  • Despite the importance and value it provides, software testing is not a particularly respected profession
  • Testers with development skills and developers with testing skills can communicate effectively with each other, and pairing results in faster bug identification and removal
  • Unit tests are an asset of confidence
  • Testers have an ethical responsibility to think beyond the intended use of the code, considering what could happen and how the product could be misused
  • Exploratory testing is an approach where instead of trying to prove that the software works, the goal is discovery

Transcript

  • 00:21 Good day folks. This is Shane Hastie for the InfoQ Engineering Culture podcast. I'm at Agile 2019, and I'm sitting down with Lanette Creamer.
  • 00:29 Hi, Lanette. Welcome. Thanks for taking the time to talk to us. We've followed each other on social media for a while, but this is the first time we've had a chance to meet in person.
  • 00:39 Would you mind giving us a little bit of introduction and background?

  • 00:42 Lanette: Sure. I have been on computers since the BBS days. I started on DOS before we even had Windows 95, and I moved to the Mac Quadras, and I just really liked being on the computer.
  • 00:56 I went to school for graphic design, and I ended up, just by accident, getting a job in tech support for an outsourcer for Adobe. I was a contractor there, so many levels of separation. I ended up loving it, and I got a contract testing job at Adobe. I really liked that, and to get my first job at Adobe I interviewed against about 50 people.
  • 01:20 I managed to get that job, and I stayed there for 10 years; it was a great experience. I worked on some of the first Creative Suites we ever put together, and the early versions of InDesign, back when Quark was the big thing.
  • 01:31 Then I did some consulting for a big coffee company in Seattle. I worked on some hospital data; it was a really interesting space. I was the first tester ever at a company that builds products on InDesign Server, and I started a testing discipline there. The past few years I've been working for The Omni Group, where we do iOS and Mac software for productivity. That's been really great; I'd never had the chance to learn the iOS tools before, and that's been fun.
  • 01:59 That brings me up to here, where I'm now focusing on becoming a more technical tester, more on the development side, working on my developer skills and trying to build that up, because I've been testing for so long that I feel like my growth opportunity is really more on the programming side.
  • 02:17 Shane: One of the trends that I've certainly seen in the testing space over the last few years is the shift towards more technical testers, testers needing more technical skills. What do you feel about that? Is that the case?

  • 02:30 Lanette: I feel very mixed about it, because I love exploratory testing. It's a true passion of mine, and I'm very good at it. But I also know the ceiling is too low. Testing is not a respected profession. You can say as much as you want that it's equal, but programming pays more, and I'm tired of trying to prove my value as a tester. It's impossible; there are some people who will never be convinced. The very best tester you could ever have is still going to make less than a mediocre programmer.
  • 03:00 There comes a point when you're just swimming upstream. You cannot continue to bash your head against a brick wall.
  • 03:08 I think there's great value in exploratory testing and I don't think everyone should have to be a programmer, but I also feel like I can be a good programmer and a good tester. I can do both. It doesn't take anything away from my testing that I want to write code too, and I'll still do both. I'll still be a tester, but I'll also be a developer.
  • 03:28 Shane: So that combination of skills, what does it bring extra?
  • 03:31 Lanette: It brings some debugging abilities, in my opinion, and it brings some different levels of testing. I can go in and write some API tests in the same language as the code, and that makes it easier to involve my developers. I mostly write JavaScript at this point, but I've written Python in the past, and I'd like to learn Java too.
  • 03:51 If you can go in and write some code in the same language as your developers and check it in where they can easily review it, they're more likely to be able to participate with you and make suggestions. They may not know the testing space, but they know the programming space, and they know what the API should do. So it's an area for collaboration that didn't exist before, meeting them in the middle, where they are.
  • 04:14 Shane: And for the programmer - why should they learn about testing?
  • 04:19 Lanette: Well, they have a responsibility to test their own code and I would hope that we all want a quality product. 
  • 04:26 For one thing, companies are just saying developers can do all the testing, and then developers don't have time to do it. So you'd better learn to do it efficiently if you want time to code your features.
  • 04:35 So I would think even if you have testers, you want to unit test as well as you can and make sure you aren't changing anything that breaks things behind you. Just for your own sanity, for your own ability to change your code in the future, having those tests is an asset of confidence.
  • 04:55 It just adds a layer of confidence. If you're good at testing your own code, you aren't just throwing something over the wall that might be embarrassing.
  • 05:03 Shane: So what are the key skills that a programmer would have to pick up to do good testing?
  • 05:11 Lanette: I think the top thing is curiosity. Not just what it's supposed to do; you need to think about what it's not supposed to do, or how it may interact with things outside the scope of your code.
  • 05:24 It's those areas that are getting missed. Things like performance, race conditions, error handling, security, accessibility. We need to think far outside the requirements and outside the code. And that means bringing in some customer perspective, is it even suitable to the purpose and what unintended consequences might it have?
  • 05:44 You know, this past year we've had all these security leaks. We have seen the unintended consequences of social media really bite in several countries. No one's thinking about these things. So we have to, as agilists, start thinking beyond our intended use of the code about what could happen. And it's really an ethical concern that we do that.
  • 06:06 Shane: So I'm just a programmer. They give me a spec and they tell me what to do. I do it, don't I?

  • 06:10 Lanette: Well, I would hope that you would have more professionalism than that. Maybe when you first start, your approach might just be to get it to work. But at some point, I think we all have pride in what we've built, and we want to see it really solve the problem for a person. For most of us, the satisfaction isn't just, oh, here's this piece of code. It's having an impact on someone who's using it, so that they can get their task done.
  • 06:34 So it's going one level further beyond does it work to does it satisfy this user's need and is it as good as it can be?
  • 06:42 And I think that's where exploratory testing comes in. It goes beyond the functional aspects of does it function, to is it suitable, and does it avoid doing anything it shouldn't?
  • 06:53 Shane: We know about TDD, for instance, or we should know about TDD. If I don't know about TDD, where do I find out?

  • 07:02 Lanette: Well, that's interesting, because we didn't learn about TDD in my class. I took a certification class for JavaScript developers, and we literally did one test in the entire nine-month course. It's like testing didn't exist.
  • 07:15 Since then, I've been trying to learn TDD, and part of what's been excellent there is that people are sometimes willing to pair with you online. And there are these websites now, like freeCodeCamp and Exercism, where you can go in, in any language, and there are already tests set up, so you can start learning TDD right there.
  • 07:35 That's been really great for me, because I started from nothing. I knew nothing about TDD. I'd literally never paired until I went to an interview and they paired with me, and I'd never done TDD or pairing before. To say I was slow and not good at it would be an understatement; it wasn't my best performance, because I was just so new to it.
  • 07:53 The fact that I'm supposed to be listening, reading the code, processing it, then adding a test and talking at the same time: that's a lot to do. So just learning how to take a pause when I need to, learning how to do that handoff when you're pairing and not try to both type and drive, it takes practice.
  • 08:12 Shane: Pairing is a skill.

  • 08:14 Lanette: I think it is.
  • 08:16 Shane: And it takes practice.
  • 08:18 Lanette: Getting the pace is not something that comes naturally to everyone. It was pretty awkward for me. I felt like I was in overload, trying to do all of this at one time.
  • 08:27 Shane: Let's step back into the testing space. You're giving a workshop here at the conference on coaching for exploratory testing.
  • 08:35 You mentioned that exploratory testing is a really important skill that goes beyond just looking at, does it meet the requirements? So what is exploratory testing and what's needed for it?

  • 08:49 Lanette: Exploratory testing is an approach where instead of trying to prove that the software works,  the goal is discovery.
  • 08:56 We're going to discover new things. And so you need creativity, but in order to coach it, you want to set people up for success, especially if you have people that aren't testers. And we have cloud setups now, and we have containers, which is a huge boon if you want to coach exploratory testing, because if you have a complex setup, you can get everything set up in the environment in advance, and that maximizes your testing time.
  • 09:24 So when I'm coaching other people and trying to lead exploratory testing sessions, I get everything set up first and I preflight it, I make sure we aren't dead on arrival, that we have builds that work that we have the access needed.
  • 09:37 Then you also generally want to start with a charter and the charter is a mission statement, and it needs to be broken down so that you can complete it in two hours, but you need several charters because you may not end up finding anything in the first one. And if you're not finding anything interesting and the discovery is not happening, you move on to the next one. The participant will be following the charter as an overall mission statement.
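The charter bookkeeping Lanette outlines (a short mission statement, time-boxed so it can be completed in a session, with several held in reserve) can be sketched as a simple structure. The charter texts below are hypothetical examples of my own, not ones from the interview:

```javascript
// Hypothetical sketch of exploratory testing charters: each is a short
// mission, sized to fit a two-hour session, and several are kept ready
// in case the first session yields no interesting discoveries.
const charters = [
  { mission: 'Explore PDF export with non-Latin fonts', timeboxHours: 2, status: 'todo' },
  { mission: 'Explore copy/paste from external apps',   timeboxHours: 2, status: 'todo' },
  { mission: 'Explore behavior under low disk space',   timeboxHours: 2, status: 'todo' },
];

// Pick the next unstarted charter; when a session finds nothing interesting,
// mark the current one done and move on to the next.
function nextCharter(list) {
  return list.find(c => c.status === 'todo') || null;
}

const current = nextCharter(charters);
console.log(current.mission); // the session's overall mission statement
```

The point is not the tooling (a sticky note works just as well, as she notes later) but that each charter is small enough to finish and easy to abandon when discovery stalls.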
  • 10:00 Shane: What would a charter look like?

  • 10:02 Lanette: Here's a great example; it's one of my favorite examples of a charter in general. Christopher Columbus, using the ships and the money from the monarchy of Spain, would sail to find a passage to India. And guess what? Along the way, he bumped into a continent.
  • 10:21 Now, if you bump into a continent, you don't ignore it. That's the point, it's the discovery. We don't know what's there. And I said in my workshop that the main obstacle to effective exploratory testing is the belief that there isn't much to find.
  • 10:35 Just like Christopher Columbus was so sure there was nothing between Spain and India that he would just sail straight there. Rather than trust that assumption, we need to actually go on the journey and see. And maybe we're right. Maybe we do end up finding a new passage to India. Great, and we want to report our success if we do. But if we discover something unexpected, that's really the point of the charter: to go on that journey and verify what we find.
  • 11:03 And it may be that our assumptions are correct. And we just validate those or maybe that we bumped into a surprise.
  • 11:09 Shane: So we've got a charter that tells us what we think we're looking for or we suspect might be there or some assumption that we want to validate or invalidate. And now I'm sitting in front of a computer
  • 11:21 Lanette: And you have your setups, so I've made sure you have the right permissions.
  • 11:24 I've set you up. You've got a container. You understand the charter, you have what you need, and as a coach, I'm going to keep track of time for you. I'm going to make sure you have the capability to take notes, so you can just write down anything as you go. I'm going to make sure you know how to grab screenshots and a screen recording if you see something interesting that we want to look into later. Then if you get stuck, I'm going to help you. And if you aren't used to reporting bugs, I'm going to help you.
  • 11:49 I explained in my workshop some basics of how to take performance benchmarks, how to isolate bugs, how to get screenshots. And importantly, how you want to report bugs in neutral language, factually.
  • 12:01 Not a judgment - so you don't want to say, Oh, redraw is bad. Instead, you might say pixel redraw incomplete in this section.
  • 12:08 Because it's people's work, you don't want to tell someone their baby's ugly. But you do want to be very factual about what's happening and provide as much evidence as you can and get it down as simple as you can, and always make sure you're maintaining a good relationship with the team, including giving them what worked well.
  • 12:26 Developers don't get enough positive feedback about what was really fun. What did you do that was cool? So when I'm coaching, I try and bring those facts out as well, so we're not just dumping an ugly pile of bugs on their desk, but also noting what surprised us by being so good.
  • 12:43 Shane: We found some bugs, but wouldn't it be better if everyone saw them?

  • 12:47 Lanette: Sometimes it can be better, and there are several ways I've tried to report that. One way is to lay a workflow out in a flowchart, and I'll make it very basic: green if it went great, a little yellow triangle if there were a few problems but we could get there, and a full-on red stop sign if you're blocked.
  • 13:08 That way, you can get an overview of which areas were successful. It's sort of like a visual map. Sometimes on a remote team, we would have a Pivotal Tracker project that had the charters in it, and everyone could see the charters and prioritize them along with the user stories. And other times we would just share the bug findings in a chat room so that people could see what was found.
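The traffic-light workflow map she describes can be sketched as a tiny structure; the workflow steps below are invented for illustration:

```javascript
// Hypothetical sketch of the traffic-light reporting map: each workflow
// step gets green (worked), yellow (problems, but passable), or red
// (blocked), giving a quick visual overview of where testing succeeded.
const workflow = [
  { step: 'Create document', status: 'green'  },
  { step: 'Insert images',   status: 'yellow' }, // a few problems, but we got through
  { step: 'Export to PDF',   status: 'red'    }, // blocked
];

const symbols = { green: 'OK', yellow: 'WARN', red: 'BLOCKED' };

// Render the map as one line per step.
for (const { step, status } of workflow) {
  console.log(`[${symbols[status]}] ${step}`);
}
```

However it is rendered, the value is the same as the flowchart: anyone can see at a glance which parts of the workflow are healthy and which are blocked.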
  • 13:34 Some teams prefer an email summary. I've done it that way as well. I would just do an outline of what we covered, what worked well and what the bugs were. 
  • 13:41 All of those can be effective, depending on the team. One of my favorite ways to include charters, if the team's doing sprints, is to write charters on sticky notes and then vote for which one's most important when we're doing user story review. Then at the beginning of the sprint, when the developers are, you know, heads down writing features, testers will grab the most important charter and put it in the test column. When it's done, if multiple people are doing it and it's coached, the coach moves them; if it's just one tester at a time doing it, then they move them.
  • 14:16 And then the charters that aren't done, sometimes they're important and they go back into the next story review. Other times we just say we're not going to do them if they don't get picked. Then I like to put those on the backlog somewhere, in a backlog pile, and we see if we made a good decision after it ships, or, you know, whether that's something we should revise next time.
  • 14:34 Shane: Inspect and adapt our testing?

  • 14:36 Lanette: Exactly.
  • 14:37 Shane: Who would have thought?
  • 14:38 Lanette: Yes, it's such a simple thing, and yet I don't see very many people include charters as part of their testing. But it's a great thing for testers to do early in the sprint, when we have the code that we've just validated as functional: have we really validated that it works together, that it's secure, that it's performing?
  • 14:56 It can help us bake a little bit more quality into the stuff we did last sprint.
  • 15:01 Shane: What about including other non-testing team members in doing the exploratory testing? You know, asking our technical friends, the developers, to actually do some of that testing.

  • 15:11 Lanette: I think it's really effective. But one of my favorite things to do, if developers have time, is to have them watch actual customers go through and use the product. That gives some strong motivation to fix bugs, because there's no excuse like, oh, a user wouldn't do that, or that's a tester case, or, my favorite, that's an edge case.
  • 15:31 Shane: I'm an edge.
  • 15:32 Lanette: Yes. Seeing the impact it has on customers can be really positive, it can be that instant feedback that we need, but I've also run exploratory testing sessions with up to 40 people, and it doesn't matter who those people are, as long as they understand the charter and they know enough, you know, whether you need to give them a demo at the beginning so they understand a little more.
  • 15:54 It's totally possible to include anyone, including customers, I've had a business analyst, PO's anyone at all can really participate as long as they care about the quality of the product and they have an open mind.
  • 16:06 Shane: If you're organizing one of those relatively large-scale testing sessions, this is where the coach role, I'm guessing, would be pretty important, to be able to guide and support.

  • 16:17 Lanette: Yes. The main thing you're doing when you're coaching is making sure everybody else is able to focus during the time box. Part of that is, if you're doing it remote, people need to have access to the channel so they can ask questions live and see what other people are finding.
  • 16:33 And that's where one person finds a bug, and another person looks in that area and finds something slightly different, because a lot of times bugs live in nests. Doing it remote, the coach is pretty active in the chat channel. Doing it live, like I have before in a large lab, you're walking around and making sure everyone has what they need.
  • 16:52 Someone finds a bug, they're not quite sure what it is, you're going to pair with them to help isolate it. Especially if they're not testing commonly, it may not come as naturally to them to isolate that bug or to write it up, they may have questions on the process. So you want to be available for that.
  • 17:07 Shane: So in that case, you're not going to be doing much testing yourself. You're providing that support and guidance.
  • 17:12 Lanette: I tend to still do testing, because if someone's having trouble getting started, then I'll pair with them. Even if you're not helping someone get unstuck, you can still help someone get started with their testing ideas by verbalizing what you're doing and testing alongside them.
  • 17:28 It can be hard to just get in there and test, because when we're testing, especially exploratory testing, we have our internal assumption, our mental model, and we're comparing it to what we're seeing. There are a lot of ways to verify our mental model is correct by comparison, and there are also a lot of ways we can miss what we're seeing, where if you pair up with someone else, they may see something you don't see.
  • 17:50 Shane: The pairing activity adds extra value?

  • 17:54 Lanette: You have another pair of eyes and another mental model with you to help you problem solve, and that applies to testing as well as to development.
  • 18:02 Sometimes just verbalizing something when you're isolating a bug, saying what you think is happening, can help you clarify it and help the other person come up with a way to isolate it further.
  • 18:13 And sharing the bugs in the chat room has that effect to some extent, if you're remote. I really like to do exploratory testing in person, but we're doing more remote work in a lot of companies now. So for that it's chat rooms, JIRA tickets, video conferences, that kind of thing.
  • 18:29 When to include exploratory testing is one question that I've been asked a lot.
  • 18:33 I think one very good time to have the whole team do it is before you release, especially if you have a code freeze period, even if it's a short period of time. Just make sure there's nothing embarrassing that you can fix right before you go out. Sometimes you'll find something simple. Sometimes you'll find something serious.
  • 18:52 One of my favorite examples is from back when we were doing exploratory testing across products for the Creative Suite. At one point teams had made different decisions at the product level, and when we went to test the products together, we discovered the PDFs we exported would not open in the version of Acrobat we planned to include, because all the teams had made decisions in isolation, not realizing the impact on the other teams.
  • 19:16 And so we were able to fix that. It would have been really embarrassing if we had gone to beta and we couldn't open our own PDF files, and we were able to avoid that. And that's a simple sanity check of doing a typical customer workflow.
  • 19:30 Sometimes we miss things that are outside of our product. We're so concerned about ourselves that we forget to think about what the user needs. What if they copy and paste from Word? You know, we forget about that. Or when they put this in their email, what happens? These are things where, if we go outside the scope of our product just a little bit, we can find out some things that might really impact our users.
  • 19:52 Exploratory testing to me is going beyond functional testing. We've done a great job of covering our functional testing in many ways, with TDD, with unit tests, with test automation, and now often with UI automation; it's really the things outside of that scope, those gaps, we need to detect.
  • 20:09 Shane: It needs a creative mindset as well.
  • 20:11 Lanette: Absolutely, and that's one thing I think developers have. But they have it focused in a narrow space, so it's really broadening that scope. And then they definitely have what it takes to do good exploratory testing, and sometimes they can just jump into the code and fix what they find right then and there, and that's nice.
  • 20:31 Shane: Reduce the cycle time.
  • 20:32 Lanette: Yes. I used to pair with developers when I was the only tester at a company. When they had new code, before they would check it in, I would pair with them just for an hour, and we would live test and fix the code without writing up bugs. And that would be really effective. I mean, we would fix a lot of bugs that way, and then you don't even have to worry about reporting the bug.
  • 20:51 You just show it, and they're like, oh, I know what that is. But sometimes it would be a tough one and we'd have to put it on the backlog, and that happens sometimes. A lot of times, though, we could fix maybe even 10 bugs in an hour, and that was pretty effective.
  • 21:06 Sometimes we didn't find anything, and then we felt more confident just checking in that code and moving on.
  • 21:11 Shane: Switching tack entirely, because I happen to know that cats are a passion and today is International Cat Day. Tell us a little bit about your cats.
  • 21:20 Lanette: I have two kittens. One, Nevani, is seven months old. She's a rescue and she's a super good hunter. She's a white Siamese cat with tabby points, but she's also a mutt; she's not purebred. My other cat, Adelin, is four months old, and he is a little terror. He's Siamese with seal points, and he is 25% ragdoll, so he's a very snuggly, mouthy cat. The ragdoll part makes them extra snuggly, and he's personality plus. It's been really fun having these two kittens. I've heard from my sister, who's looking after them, that they have been emotional eating while I've been gone; they have been through a bag and a half of cat food while I've been out here at Agile. So I have no idea what size these kittens are going to be. They might be tigers by the time I get back.
  • 22:07 Shane: Lanette, if people want to continue the conversation, where do they find you?

  • 22:10 Lanette: If you like a very active Twitter stream, you can follow me at @lanettecream on Twitter. I basically live tweet my entire life, so it's not just about tech. Or you can email me at lanettecreamer@gmail.com; it's my full name, and the capitalization doesn't matter. I'd be happy to hear from anyone who has questions about exploratory testing. I have posted the slides for my workshop on the Agile website, and there's a PDF on Dropbox. If you want to get it from there, you can just send me an email and I'll send you the link.
  • 22:38 Shane: Excellent. Thanks so much.
  • 22:39 Lanette: Thank you.

More about our podcasts

You can keep up to date with the podcasts via our RSS feed, and they are available via SoundCloud, Apple Podcasts, Spotify, Overcast, and Google Podcasts. From this page you also have access to our recorded show notes. They all have clickable links that will take you directly to that part of the audio.

