Transcript
Losio: We are going to discuss the journey to continuous testing in DevOps: what that means, where it's coming from, what the major challenges are, and where we are going.
We'd like to basically start with a very quick interaction on what we mean by continuous testing, and specifically continuous testing in the world of DevOps. Continuous testing was initially proposed as a way to reduce the waiting time for feedback to developers, by automating our tests to receive some immediate feedback and, of course, to speed up the development lifecycle. That is great in theory, but at the end of the day, teams face different challenges implementing effective test automation. Automation is great, but there are some challenges. We're going to talk about some of the opportunities and some of the challenges of continuous testing in DevOps.
Background, and the Continuous Testing Journey
My name is Renato Losio. I'm a Principal Cloud Architect at Funambol, and I'm also an editor at InfoQ. I'm joined today by four industry experts, industry leaders. I would like to give each of them the opportunity to quickly introduce who they are and their personal journey to continuous testing.
Fletcher: My name is Tina Fletcher. I'm a Senior Software Development Manager at D2L. D2L is a company in the education space; we build an online learning platform. Earlier in my career, I was focused on a lot of different testing-type roles. I've seen a lot of change in the past 15 years or so in terms of what kind of testing happens, when, who's doing it, and why. I think that this evolution is still ongoing. It really depends on where your company's at, what project you're doing, and even who's on your team and what their strengths, skills, and preferences are.
I think you asked, where am I at in my journey to continuous testing? It's not a very linear journey, for some of the reasons that I just mentioned. My current context is that I'm working on something that's brand new, a product that we've built from the ground up in about the past year or so. Certainly, we're focused on testing at all stages of our work. It's not really the same as testing a very well established product that has lots of users and lots of functionality. It's been an interesting experience for me to see just how the overall approach to testing can change depending on what it is that you're doing and who you've got on the team.
Assad: My name is Josh Assad. I'm an engineering manager at CircleCI, on a growth engineering team. It's an interesting space where we run experiments, A/B tests, to help improve our products and measure how that's working out for our customers. I spent a lot of my time as an individual contributor coming up as a tester. Making that transition into engineering leadership has been interesting and fun. There's definitely been an evolution of what testing is and how it's done along the way. When I reflected on your question, I felt that in terms of continuous testing and DevOps, I'm at a point where it's table stakes. It seems like it's the norm. I think a lot of it has shifted towards more automation. I have mixed feelings about that.
Belcher: I'm a founder at mabl, which provides a low-code tool for continuous testing. I've spent most of my career in engineering and product leadership roles. I started as a quality coach at Microsoft. My passion for this topic runs very deep.
Kenyon: My name is Scott Kenyon. I am a QA and release manager at a UK company called FirstPort. They're in the property management sector, looking after large buildings and whatnot. Where I currently am in terms of DevOps and continuous testing: we're going through a digital transformation. It's actually looking at and ripping apart what we currently do, and rebuilding it with this in mind. It's understanding where we can add value and change value. I very much agree with Josh: everyone's talking about automation, but why?
Why Continuous Testing Is Important
Losio: We'll definitely go into the topic of automation. I would like to start with what sounds like a very simple question, but I'm sure it's a tricky one. Why is continuous testing so important? Why do we discuss continuous testing?
Kenyon: I think continuous testing is an offshoot of the agile cycle and the way we work. It's one of those things that has just become the norm; we don't know we're doing it. Continuous testing is all about failing fast, failing early, and telling us about it. You can put it in lots of different ways, but that's where we are. I think we all do it inherently as humans. We're all checking to make sure what we're doing is good, and we're trying to get that validation. It's inherently what we are and what we do.
Belcher: I have maybe two things to add there. One is, I think it's so important that continuous testing allows us to find issues before we leave context. Unless we're finding out about issues in seconds to minutes, we move on to some other task, and then it becomes very expensive to go back and context switch. To me, that's a big part of the power of continuous testing. The other thing is, we have to remember that most of our products now are not things that we're developing alone; we're integrating all these different services, cloud services, third-party libraries, and so forth. Even when we're not changing our software, the systems are changing underneath us. You're running on Amazon, or using Okta for your authentication, or bringing in third-party libraries that get upgraded. The need to ensure the quality of those changes is just as important, and you may not know when they're being changed.
Kenyon: Dan made, I think, a good point about continuous testing: it actually manages your velocity. It means you fail when you run it, rather than moving forward, so it manages your speed quite well, too.
Fletcher: Maybe another angle, or another way of thinking about continuous testing, is that you don't actually have to wait until you have a thing to test, like an actual product. You can be thinking about it in a way where you're testing your ideas before you even start to code. That's the thing that I like to keep in mind. Just this morning, I was at my desk trying to plan out exactly how we were going to approach an architectural refactoring that my team was talking about. Then half an hour into it, I thought, do we even need to do this? Is there another way? I felt like that was a little bit like testing; I was testing the idea that we even need to do this activity. That's maybe another way of thinking about it: it doesn't always have to be testing of code.
The Adoption Curve of Continuous Testing
Losio: I was actually ready to ask you another question, about where we are on the adoption curve. One of the questions I had as a developer was, I'm always curious as to how many are actually doing continuous testing. The standard topic of: is it something that just innovators do, something that early adopters do, or something everyone is doing? From what you've mentioned before, I also have the feeling that many do it without knowing it. I don't know if there's any feedback there, whether the majority are really doing it in some form. Just your feedback about where we are at the moment: is it something that few are still using, and many are hoping to use?
Assad: In the last couple of roles, I've been in scale-up companies. I think what I've seen in that context, and in smaller companies, is that we're either early majority or late majority, or maybe right at that peak. I don't think companies are thinking about moving forward without doing some form of DevOps and continuous testing. To me, it'd be very unusual to find a place that isn't doing that stuff. Maybe just to build a little on what Dan said, I think the reason it's so important is that so many places are doing continuous integration, and the platform that we work on is shifting constantly. You don't really have any choice but to ensure that you're making changes and testing them as you go, because there's no real test environment or dev environment that represents production anymore. I think continuous testing is part of that safety net of making sure that you're not breaking things for your users as you're moving so fast.
Kenyon: I think one of the pinnacles of using continuous testing in DevOps is actually for it to produce, refresh, and refactor your environments constantly. It's not just about looking at the code; more often than not, producing and refactoring the environment itself is one of the main problems and blockers for your speed and velocity going through to production, along with having the ability to uncommit. I think that's what we're doing a lot more of now. Previously, we had static environments that were just built on; with more DevOps and continuous testing, we're actually refreshing, scrubbing, and rebuilding environments every time to get a better integration.
Fletcher: I'd be surprised to encounter a company today that thought, "No, we're just going to save all our testing for the end, and that'll be fine." I think that would be pretty surprising. It's probably more about the variation in how you move from that to truly getting good quality feedback at all stages. Probably every team is doing some amount of continuous testing, but the question is, how well is it working for them? What opportunities could they explore to get better feedback, or even more timely feedback?
Losio: So it's more about the rate of adoption and how well it's done than about doing it or not doing it. Probably, as you mentioned before, people might do a commit and have some automated testing in place. The point is, is that a reliable one? Is it covering something, or is it just there to pretend that you have something in place?
Assad: You can probably think about it in terms of intent, like how intentional is a company about whether or not they're doing these practices? Is it happening totally organically because some folks just feel passionate about it, or is it part of the culture? Is it part of the policy around how the company develops software?
Belcher: This is probably controversial. I'd argue that we got it wrong in the first generation of DevOps, where there was a growing sentiment that we just need to do some unit-level tests: we're going to have a microservices-based architecture, have good contract tests, do staged rollouts, and rely on users to find quality issues. I think that backfired. It caused a lot of teams to struggle with their DevOps deployments.
Assad: I think there's a bit of a spectrum on relying on users, and I can agree with that. When you're the product, when you're on Reddit or you're using Facebook or whatever, "I'm not paying to use this. I'll find bugs for you. That's great." When my bank wants me to find bugs, that's not ok.
Kenyon: I also think there's a cultural demand now, as a consumer: I want it now, I want it working, and I want it good. Consider the old way of putting betas and alphas out. I remember 10 years ago, you'd see a beta or an alpha on a site, and it clearly said, "Give us feedback. We want feedback." We don't really see that much anymore, because it reduces consumer confidence in the software and product. If I say the testing pyramid, everyone knows it: the UI, the API, the pyramid. Actually, I think what we've done is focus on that so much as a peak, where actually it's more of a blob, because everything moves ever so slightly around. It's not as linear as that. The fantastic thing about automation is that it actually frees up time for us to do more exploratory stuff and more TDD. But the more automation we build, the more of a monolith we build, the more maintenance goes into it, and then you hit the seesaw; really, you can then go backwards. There are a lot of companies at the moment, in a lot of fields, pushing forward quite hard and quite fast on DevOps and continuous testing, just trying to automate as much as they can, not understanding that in a year's time they're going to have a monolith to actually maintain, and to still add value to, especially with lots of other integrations changing. To me, it's a slow walk, not a sprint, to get it good, get it well, and get it into the culture.
Is Test Automation Good?
Losio: I'd like to go back to something that a couple of you mentioned during the intro: the mythical word, automation. A few people argue about whether automation is good or not good. Of course, it always depends. What's the feeling here from the panelists?
Fletcher: Certainly, you're going to want some level of automation in order to get that fast feedback. If the only continuous testing you have relies on asking another person to look at something as a human, that isn't going to be a very fast feedback loop. I think a reasonable goal in general is to choose some things that you want to test in an automated way that are going to give you a solid level of confidence about what you've just changed or what you've just built, and maybe stop there. I've always been one to believe in good-enough testing. What I mean by that is, you get to a level where you're pretty confident you've caught most of the things, but don't try to fool yourself into thinking that you can catch all the things before you release your product. I'd rather spend the rest of the time getting comfortable with the idea that I can quickly detect and resolve issues that are happening in production, because there will always be issues, rather than building up a giant automation suite. You can write automation all day long, and you'll always miss something. Those are some thoughts to maybe get us started.
Assad: I definitely agree with the notion that automation is kind of an "everything is ok" alarm. It's not going to go very deep in terms of what's really happening. It's pretty good at proving that your happy paths are working, and maybe some really obvious negative things. I wouldn't say that it makes the quality of your code better, necessarily. It just ensures that the main thing you wrote it to do is probably going to work. If you start refactoring, it's a bit of a safety net to ensure that you haven't changed any of that behavior.
What I've noticed is that automation has started to move more into the responsibility of the developer and less that of the tester, which I actually like, because it frees up the tester to think more about some of the things that Scott has mentioned: being exploratory, thinking about your users and other risks outside of just how the software is built. Traditionally, the automation had been a black box to developers: it'd be a testing team writing some automation, and the developers are like, we don't know how this works, we don't understand why it's failing. Now it's on them to prove, through some tests, that their code at least passes a happy path. I think that's pretty interesting.
Kenyon: I think as we move more to DevOps, with a lot of developers now writing the code and the tests, what we've got to ensure is that you don't mark your own homework, because I feel that's what's happening quite a lot at the moment. If we have more integration between a tester and a developer, we can actually start writing some tests together; the developer can then, with that massive coding knowledge, build them out quite nicely. I'll go back to what Tina said about what automation to look at. Just checking for a status 200 on your environment is automation; just finding that it's alive and kicking is automation, to a point. I think automation is a tool, not a practice, the same way performance testing or security testing is. It's just a thing that we do, and we've got to treat it like that. I think it's been the shining diamond for a couple of years; there's been quite a lot of focus on it, and I think we're coming to the cusp of that now.
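As a minimal illustration of that "status 200" level of automation, the check can be a single HTTP call run from the pipeline. This is a sketch assuming Python with the requests library; the health endpoint URL is a placeholder.

```python
# Minimal "alive and kicking" smoke check, runnable under pytest.
# Assumes the `requests` library; the URL is a placeholder for your environment.
import requests

HEALTH_URL = "https://staging.example.com/health"  # hypothetical endpoint

def test_environment_is_up():
    # A status 200 on the health endpoint is the cheapest form of automation:
    # it proves nothing deep, only that the environment is alive.
    response = requests.get(HEALTH_URL, timeout=5)
    assert response.status_code == 200
```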
Belcher: I do agree strongly with Josh and the idea that testing can't be a black box. I tend to focus the other way: I'd rather find a way to make automation accessible to the millions of testers who don't typically write software, than the inverse. The risk to me with automation is that, because it becomes easy to automate things, we end up with a lack of rigor, and sprawl. We saw that in virtualization. We saw it in cloud computing. Now you see it in case after case: a thing that was hard to automate becomes simple, and then people just do a lot of it. I'd love to see a world where automation is accessible to the whole team, but we're spending most of our time on making sure that we're testing the right things the right way, rather than on the code that drives a browser, or drives an app, or that kind of thing.
Managing Tests to Balance Execution Time against Value
Losio: How do you manage the volume of tests and the size of your test suite to balance execution time against value?
Kenyon: That also touches on overnight batching. I think overnight batching of QA suites is a dangerous road to stay on, because what you're doing then is spending the entire day building something and only finding out that it doesn't work the day after. The best thing to do is to slice the suite as thinly as possible, and try to go MVP, one line. Then, if you need to run it as a block, run it as a block, but to manage it, just thin slice. Make it like a shopping cart: "I only need to do this test and this test to get feedback." Take them off the shelf, run them, put them back. If you're doing overnight batches, they will fail at some point, and you won't know for 8 or 10 hours. If you're on a 24-hour deployment cycle, which a lot of fast-paced internet sites are, that isn't going to go well for you. It's a challenge, but I know it's quite a fun one.
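One concrete way to build that "shopping cart" is to tag tests and select slices per run instead of one overnight batch. A minimal sketch using pytest markers; the marker names and test bodies are illustrative placeholders.

```python
# Thin-sliced suite: each test carries a tag, so a slice can be taken
# "off the shelf" and run on its own instead of in one overnight batch.
# Marker names (smoke, checkout) and test bodies are illustrative.
import pytest

@pytest.mark.smoke
def test_homepage_is_alive():
    ...  # placeholder: hit the homepage, assert a 200

@pytest.mark.checkout
def test_add_item_to_cart():
    ...  # placeholder: exercise the add-to-cart flow

@pytest.mark.checkout
def test_apply_discount_code():
    ...  # placeholder: exercise the discount flow
```

A change to checkout code then runs only `pytest -m checkout` for feedback in minutes, with the full block reserved for when it's genuinely needed (the markers would be registered in pytest.ini to avoid warnings).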
Unit Testing
Losio: Does unit testing get included in automated tests, and so get done regularly?
Assad: I personally count unit tests as part of automation. The cheapest, easiest, fastest part of it, I suppose.
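To make that concrete, here is what that cheapest, fastest layer can look like. A trivial sketch where the function under test, `apply_discount`, is invented purely for illustration.

```python
# The cheapest, fastest layer of automation: a plain unit test that runs
# on every commit in milliseconds, long before any UI or API test.
# `apply_discount` is a hypothetical function used only for illustration.

def apply_discount(price: float, percent: float) -> float:
    """Return the price reduced by the given percentage."""
    return round(price * (1 - percent / 100), 2)

def test_apply_discount_happy_path():
    assert apply_discount(100.0, 20) == 80.0

def test_apply_discount_zero_percent():
    # An obvious edge case: no discount leaves the price unchanged.
    assert apply_discount(59.99, 0) == 59.99
```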
The Future of Continuous Testing
Losio: We discussed automation; I thought it was a hot topic. A few years ago, everything was about automation. I was wondering what we're going to say about continuous testing in 5 years, or 10 years, if you can see that far ahead. Or is it going to be so much a part of the obvious things we have to do that we're not going to talk about it anymore?
Assad: I'm trying to think of anything we talked about in software 10 years ago that's still relevant today. It changes so fast. I agree: it's going to be ubiquitous and expected, or it will have evolved into something else that's ubiquitous and expected.
Kenyon: I think 10 years ago, we had the Bolton brothers and people like that, and we talked a lot more about exploratory testing. Ten years ago, exploratory testing became the thing; then automation took off in the last five. I think we're just going in a big circle. I think we'll go back to more customer-based concerns: what is the end customer, the end product, going to look like? How are they going to interact with it? Accessibility matters at the moment, too. There are laws in place; we have to make things a lot more accessible. There are lots of really good tech companies coming out with tools that manage that and support us, rather than needing a dedicated, niche skill set. There are some really good tools out there that do it for you, and the same goes for automation and low-code tools. I think that's where we're moving: off-the-shelf tools to help us get better outcomes for customers, or users.
Belcher: I agree with the point on accessibility, and I would put it in a broader theme. In my view, five years from now, continuous testing is not only ubiquitous but automatic. We're not spending our time worrying, am I breaking core functionality? That part should be easy. I think what we'll do is make much better use of the data and observations that tests can make, to move to quality engineering: to move beyond functional regressions and think about usability, performance, accessibility, security, all these things that tests can unearth. Those are issues that right now we're having a hard time getting to as an industry, because it's so hard to make sure you didn't just break the core stuff.
Kenyon: If we talk about security, performance, and accessibility, those at the moment, in my view, are very much specialisms, for the incredibly specialized. The same way that, five years ago, automation was a specialism: you had to bring a person in to do it, whereas now automation is more widespread, a lot of people can do it, and the tools are there. I think in the next five years, those specialisms will just become normalisms. The specialists will start filtering down with the tools, and the knowledge will get spread out. There might be a new specialism that pops up; machine learning and AI might come in, for data and analysis.
Assad: I think we've seen that trend with accessibility in particular. It used to be that no one really knew anything about it. I'm starting to encounter more developers and testers who are at least somewhat versed in it, or know where the resources are.
Kenyon: Now it's the law on the majority of public-facing websites. It was a necessity. A friend of mine, Eddie, lives and breathes it, and it's fantastic. We need to spread that knowledge out a lot more.
The Role of a Developer or QA, in the Future
Losio: We just discussed where we see continuous testing in a few years' time. I was wondering if you have any feedback, for example, about where the role of a developer, or of a QA, if it still exists, is going in a few years' time. I know it's a weird topic.
Fletcher: Imagine a world where, all of a sudden, all the testing that we wanted to do before shipping something is really easy, and we're confident that it's being done very well and we're getting good feedback. If we're imagining that all that side of things is good, and we almost don't need anyone to work on it anymore, I would put people with a quality mindset on monitoring what's actually happening in production. Better data analysis of: who's using it? What's going on? What are they encountering? What are ideas for improvements based on the usage patterns that we're seeing? "This thing's going wrong over here, let's quickly switch off the flag for this change that we shipped recently." Analyzing, like A/B testing, and all those kinds of things you can do to understand what's really going on in your product out in the world. That's where I would spend my time if I didn't have to spend any of my team's time on writing automated tests, maintaining them, and all of that.
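That "switch off the flag" move presupposes that recent changes ship behind runtime flags. A minimal sketch of the pattern, where the flag is read from an environment variable purely for illustration; the names (`flag_enabled`, `new_checkout`) are hypothetical, and real systems typically read from a flag service or config store.

```python
# Minimal runtime kill switch: the new behavior ships dark behind a flag
# and can be turned off in production without a redeploy.
# Reading the flag from an environment variable is purely illustrative.
import os

def flag_enabled(name: str) -> bool:
    return os.environ.get(f"FLAG_{name.upper()}", "off") == "on"

def render_legacy_checkout() -> str:
    return "legacy checkout"  # the known-good path

def render_new_checkout() -> str:
    return "new checkout"     # the change that shipped recently

def render_checkout() -> str:
    if flag_enabled("new_checkout"):
        return render_new_checkout()
    return render_legacy_checkout()  # safe fallback when the flag is off
```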
Kenyon: I come from a release and QA area, and I completely agree about the post-release information. At the moment, we just assume that we've got this agile and continuous testing until it goes live, and then we stop. The post-release side, what happens afterwards, means using even simple Google Analytics to find out: the feature we've put forward, how is it actually being absorbed? What's its usage rate? What are our failure rates? Using good log files, or automation at that point, to find out where all the errors are. I think we will move there as the testing in the middle gets easier with tools. It comes down to what the end impact is for the customer; that's what we're trying to improve.
Testing Tools, and having all Eggs in one Basket
Losio: Given your experience, what do you think about testing tools? Should you put all your eggs in one basket, or is it better to spread them around?
Belcher: That's certainly going to depend on your environment. Every team is composed of different skill sets, focus areas, and the like. I do believe that, in a lot of cases, teams get into analysis paralysis on the best-of-breed stuff, and focus maybe a little too much on the tools and frameworks, and not enough on the culture, processes, and expectations.
Culture Shift to Automation
There's another question about how we shift our culture to automation, for the teams that have people who are testers and people who are developers. Personally, I believe the answer is to find ways to get them to collaborate more on quality earlier in the process, as opposed to throwing a tool at some role and saying you're going to be able to solve all the problems now.
Kenyon: We have a solution where we get people around the table in the define phase. Before you even get to build, speak about it in the define phase. Have that tester mindset built in and ingrained. Do pairing. That's how we'll get there, I think, because the only way to evolve the culture forward is to get in as early in the process as possible.
Assad: I would just say, my general wisdom about any tool is: if it's not part of your IP, and you can buy it rather than build it, then you should do that, because you hired people to build your IP, not your framework.
Kenyon: There are some fantastic tools out there; there are hundreds of them. Just use them. If you try to build it yourself, you're burning money that you don't need to.
The Culture Shift to Continuous, Automated Testing
Losio: I see as well that you all agree, and have shifted the topic from tools to culture. Can you share any thoughts on making a culture shift to continuous and automated testing, on that journey from the QA team creating automated tests to the development team being made responsible for writing tests?
Fletcher: One way is to just try it, for whatever circumstance you might find a team in, or maybe you could manufacture a circumstance. For example, a team that I was managing previously had its only two tester-specific people both happen to leave at the same time. Then it was like, "Here we are. We're a team of developers. How about, for a little while, we just try having these team members do the things that were previously done by the people who had test in their title?" Just see how that goes, and try it. If you don't happen to have that situation on your team, maybe just take the testers away from a particular team. Some of these shifts can take so long and be so gradual, and it's so easy to fall back into the old ways, unless you have to do it differently. That's something that I've seen work really well: just go for a period of time with no test-specific role on your team, and see what you learn, see where you struggle. You're going to have some failures and some things that don't go well. Sometimes that's the best and quickest way.
Kenyon: On the back of what Tina said, I think that's a good way of doing it: frying pan, fire. Another good way to look at it, if you are going to remove that separate testing role, is to keep your testers separate and have them define the outcomes of the tests. Don't say what they're going to test, just define the outcomes. Then the developers can take that and write the tests to match the outcomes. You might get a good hybrid there. You might move a little bit quicker.
Assad: Manufacturing the situation is a good way to put it. I've been at two companies now that manufactured the situation by not hiring testers. The culture is that the developers have to worry about quality, and about automation, and all of that stuff. The one insight I wanted to share, as I've seen these companies evolve, is not that their wheels are falling off, although sometimes I wonder. The testing that a mindset-driven tester used to do is still happening, but now it's being done by the product manager or the designer, because there's a gap. Those two roles have a default customer-facing concern: when they get something that the team of developers has put out, their first thought is, how is this actually going to work for a person? They're actually kicking the tires, trying some of that stuff themselves, and finding really good bugs. Clearly, the answer is not to just get rid of testers.
Kenyon: That's where I am at the moment with the division I'm in, because the testing resource is quite light. It's falling back to product owners and BAs. That resource, although it is fantastic to have, I think is better utilized on their specialty. It's a cost-benefit world: manufacture a scenario, have a product owner define an outcome, and then write tests on the back of it.
Measuring Success with DevOps
Losio: Given the way you talk about continuous testing and how to adopt it, how does continuous testing change the way you measure success with DevOps and quality in general? It's great to say we're doing this shift, but how do I measure, in a reasonable way, whether what I'm doing is good or not good?
Kenyon: One of the ways I'm looking to measure at the moment, because I'm currently writing that metric, is using Jira tickets: how many times has a ticket moved? How many times has a ticket been flagged? How many times do we re-evaluate this thing? Get a benchmark, then move in a slice of automation, a little bit, then do the same again. I don't think it's something you can measure over a short period of time. I think it's a 6, 12, 18-month metric, because it is a culture shift, and that takes time.
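As a sketch of the kind of metric Scott describes, the number of status moves per ticket can be pulled from the Jira issue changelog. This assumes Jira Cloud's REST API v2 with the requests library; the base URL, credentials, and issue key are placeholders, and changelog pagination is omitted for brevity.

```python
# Count how many times a ticket has changed status, as a rough signal of
# rework and churn. Assumes Jira Cloud REST API v2 with basic auth; the
# base URL, credentials, and issue key are placeholders, not real values.
import requests

JIRA_BASE = "https://yourcompany.atlassian.net"  # placeholder
AUTH = ("you@example.com", "api-token")          # placeholder credentials

def status_moves(issue_key: str) -> int:
    response = requests.get(
        f"{JIRA_BASE}/rest/api/2/issue/{issue_key}",
        params={"expand": "changelog"},
        auth=AUTH,
        timeout=10,
    )
    response.raise_for_status()
    histories = response.json()["changelog"]["histories"]
    # Each changelog entry lists the fields it changed; count the
    # entries that include a status transition.
    return sum(
        1
        for history in histories
        for item in history["items"]
        if item["field"] == "status"
    )

if __name__ == "__main__":
    print(status_moves("PROJ-123"))  # hypothetical ticket key
```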
Fletcher: If you are trying to measure how well the entire team feels responsible for and feels ownership of quality, and if, like my team, you do typically have a person who has tester in their title, what happens when that person goes on vacation for a week? How does the team run? How are things going when that person steps away for a short period of time? It can be a good, quick measurement of the overall understanding and ability to do the quality-related activities.
Kenyon: I think it's also about sitting in refinement sessions, or planning sessions, or defining sessions, and actually listening to the context and language individuals use. You can normally tell quite well when it's not just the tester in the corner, barking up now and again, but the same verbiage coming from the whole team. That's normally quite a good measure of success, because if the tester goes off for a couple of weeks, the mindset has spread, and we all still get good quality.
Belcher: I think we need executive support, in a lot of cases, to make the investments required to achieve the right quality bars. We published a report a couple of weeks ago that showed a strong correlation between test coverage and customer satisfaction, and between test coverage and a team's ability to deploy changes very quickly, like a fix to a regression. I wonder if you start with the business outcomes and the measures of those, and then work back into: what are the SLOs, or the metrics, that we think are highly correlated with those from a quality perspective? Then let's focus on those.
Assad: I agree with Dan on that point. My mind was going to the idea that continuous testing, and having this whole pipeline in your build machine, frees you up to not worry so much about measuring code things and to start looking at your business outcomes. Pick a metric that really means something to your customer, and see if you're shifting it in the right direction.
Losio: At a much higher level, not simply the number of bugs.
Assad: Absolutely.
Kenyon: The number of bugs, or number of tests, or time to execute tests, still exists in the world; there are still quite a lot of executives who see those as metrics. The reason I say this is that I've got that metric, and I disagree with it. I think as we develop more continuous testing over the next couple of years, we'll then look at business outcomes. Look at how many service outages we have, or service downs. That's where the final outcome should come from, rather than: have you got 1000 tests? Because if that's the metric, we can all sit here and write 1000 tests. The values are relevant, but that's also a culture thing, from the top down.
Losio: So it's not just the engineering team culture; at that point it should be at a much higher level, a company culture.
Kenyon: It's an investment. Investing in continuous improvement and DevOps is an investment cycle. You don't see massive improvement straight away. It's a long-term thing to get it right and get it done well.
The Requisite Skill Sets for Teams Adopting Continuous Testing
Losio: What do you see as the skill set required from teams that are adopting continuous testing? When you build a team, or when you hire someone, what are you looking for? What do people need to have?
Assad: The critical thing for me in all this, because we're talking about continuous integration in a DevOps context, is that teams have to be able to adapt to change. Change is going to be constant, almost daily or weekly. You need to be able to just let stuff go and move on to the next thing based on what your current information is telling you. I think that's the fundamental skill set I'm looking for in a team I'm building today.
Kenyon: Mine is neurodiversity: the most neurodiverse team I can have available. People who are autistic or on the spectrum, as many different minds as we can incorporate. Openness, honesty, and neurodiversity are the three things I'm interviewing for, the things I look for and try to pull out. Because I know full well that if something's not going well, I will get that feedback early, whether it's a process, a procedure, or the actual software. I think over the last couple of years we've got quite a lot of people in IT who are insular, especially working from home at the moment. To get past that, I look to neurodiverse people. I've been called a lot of things, and I enjoy it all.
Fletcher: It sounds like so far we're describing traits or qualities of people more than actual skills. Maybe that's an interesting point in itself: there isn't a technical skill or a tool or a framework that comes to mind for all of us. I was going to say something along the lines of people who question things, who question the value of what you're doing every day. Don't keep writing more automated tests because some time ago we said we should have more automated tests. Tell me, or tell someone, when you start to feel like we're not getting value out of this time anymore. Tell me if you've heard about something new or better that we could be doing instead. I just never want to work with people who do the same thing day after day because that's what they did yesterday. I always want to be thinking about what's next.
Kenyon: It's innovators: people who will think, is this the right way of doing it? Innovators, and people who push. I always say, you can teach someone a tool, but it's really hard to teach them a mindset. Come with that mindset, and you'll learn it all, not the other way around. That's what I look for when I employ.
Automatic Testing on Mobile Devices
Losio: What about automatic testing on mobile devices?
Kenyon: There are lots of companies out there that let you do it through tunneling and VPNs. Don't do it yourself; use a separate company to do it. It is very difficult to do it yourself, because of security and the three or four different operating systems that are on mobile phones. When I've previously done it, I used a company on this side of the world; you just hook it up to their device.
Losio: I see as well that the cloud providers offer some services to help with that, too.
Kenyon: Think about why you want to do it as well. A lot of mobile automation is mostly about different operating systems and the double-tap, tap, 2-finger stuff, and you can get most of that done just in dev tools. Doing it yourself is very expensive.
Belcher: Don't skip over mobile web and responsive. If you're building web apps, I feel like that's a mistake so many teams make, especially with automation: they fail to consider what the application does across the wide diversity of devices and resolutions that their users are going to access it from.
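One low-cost way to act on that is to repeat the same basic check at several viewport sizes. A sketch assuming Playwright for Python; the URL, the breakpoint list, and the `nav` selector are illustrative placeholders.

```python
# Re-run a basic check at several viewports so mobile web and responsive
# layouts aren't skipped. Assumes Playwright for Python is installed
# (pip install playwright && playwright install); the URL, breakpoints,
# and selector below are illustrative.
from playwright.sync_api import sync_playwright

VIEWPORTS = [
    {"width": 1440, "height": 900},  # desktop
    {"width": 768, "height": 1024},  # tablet
    {"width": 375, "height": 667},   # small phone
]

with sync_playwright() as p:
    browser = p.chromium.launch()
    for viewport in VIEWPORTS:
        context = browser.new_context(viewport=viewport)
        page = context.new_page()
        page.goto("https://staging.example.com")  # placeholder URL
        # The most basic responsive check: the page renders and the
        # primary navigation is visible at this resolution.
        assert page.locator("nav").is_visible()
        context.close()
    browser.close()
```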
Kenyon: A tester once told me, if you ever get given a device to test, the first thing you do is turn it. If it fails at that stage, give it back, because that's the most basic test: just turn it.