Video Transcript
0:00 | Joe Colantonio
All right. Let's see if people are joining before we start up. I'm going to go into the chat. You're all here? Let me know where you're all from. I'm from Tennessee, so really excited about this event. While we kick it off, though, I just want to make sure everyone's here and ready to go. So just type "okay" or "yay" in the chat or something, just let me know you're all here while we get started. It should be a great event. Good session, great roundtable. Of course, yeah, we're experts with three different perspectives. All right, Cara, thank you, Cara's here. Awesome. Good to see you. All right, I see some people jumping in. "Yay," says Ted. Brittany's here. It's Anita. All right, cool. Let's get into it. So welcome, everyone, to our Champion Automation: How Quality Leaders Can Drive Organizational Support roundtable. I'm really excited because we have three experts with three different angles of perspective. Today, we have Chris Huff, CTO of Rent Group, who will share how to align automation with core business objectives, providing the executive stakeholder perspective. So, hey, Chris, good to have you. Awesome. We also have Li joining us, who is an engineering leader at Weight Watchers, a really cool gig. She's going to provide insights into managing change effectively and also draw from her awesome experience leading a global engineering team. So welcome, Li.
1:16 | Li Rajaraman
Thank you. Hi, everyone.
1:18 | Joe Colantonio
Great to have you. And John is here; he's chief storyteller at Next IT, and he's going to offer techniques on crafting compelling business cases to garner further investment in automation. So John, good to have you. Thanks for having us. Awesome. I also just want to tell the audience: if you have any questions at any time, just jump in the chat, and if it's relevant, I'll try to interject it throughout the roundtable. So let's get into it. I guess the very first question is this: I don't know if you've had this experience, but I went to an onsite event recently, and a lot of the vendors told me, hey, a lot of the people coming here have no automation whatsoever. I was kind of surprised by that, so it got me thinking: how do you justify the investment in automation over traditional engineering time to executives? I guess we'll go around the room really quick. We'll start with John, as I see you, and then Li, and then Chris. So John, any thoughts?
2:12 | Jon Robinson
So, I mean, for me, I think the big thing is understanding why you're doing automation in the first place. When you say test automation, what flavor of it are you talking about? Unit testing, functional testing, security testing, smoke testing, performance testing? Test automation comes in a number of flavors, and it's really a question of what you mean when you say automation and how you want to justify it. And I think you go in understanding the objectives: this is what I'm trying to accomplish, this is what automation is going to bring me, this is the value that I am going to bring to the organization with it. Starting there and understanding what that is before you start trying to justify it, that's probably the first thing you should consider, before you ever even think about a tool or anything else: why would we do this?
3:02 | Joe Colantonio
Absolutely. Li, I'm just curious to know, from a global engineering perspective, how do you justify it, or how do you get buy-in from your different teams all across the organization?
3:12 | Li Rajaraman
Our position has been that automation is not always the answer. If there is a root problem to solve where automation would bring that return on investment, then we try and see how we can implement it. In our organization, we've been trying to implement the RFC process, request for comments: we consult on what the problem is and what the solution should be. And if it turns out the answer is some type of automation, whether it's integration testing, unit testing, or UI testing, we build consensus, figure out different approaches, and provide a solution. That's the stance we've been taking recently.
4:01 | Joe Colantonio
Awesome. And Chris, any thoughts on that you can add to the conversation around this?
4:05 | Chris Huff
Yeah. I have to say, John, I thought every comment was going to be a story. So I was…
4:10 | Joe Colantonio
What now?
4:12 | Jon Robinson
Joe knows well enough that I can do that. He'd want me to.
4:17 | Joe Colantonio
I’m trying to restrain myself.
4:20 | Chris Huff
No, no, I'm just messing. So look, every business, especially high-growth businesses, is all about speed to market, nimbleness, and being able to respond to what seems like instantly changing business needs. So this is where I bring automation and quality engineering into that conversation. For those of you who are old enough to remember, and unfortunately, I'm old enough to remember when muscle cars first got their start, right? Like, you had 300 horsepower; I had a 350-horsepower Cutlass Supreme, and the seat belt was tucked up at the top of the door, right? Like, this is not supposed to be used. In fact, I never used it because I wouldn't have to fold it back into the console. So, you know, you had all of this speed but no safety, right? And so people were dying. It was actually a national emergency at one point, to come up with better regulations, airbags, all these different things. So when I talk to leadership about quality engineering, it's all about going super fast but doing so in a safe way. From a CTO perspective, that's how I think of automation and how I think of quality engineering.
5:34 | Jon Robinson
Engineering in general.
5:36 | Joe Colantonio
Love it. So just to remind everyone: if you have any questions, drop them in the Q&A, and I will be monitoring them. So based on that, once you get buy-in: I think you all made the point that automation isn't necessarily always the answer, but also, what type of automation are we talking about? Do we also try to push maybe DevOps automation? Is it always functional automation? Do you notice that sometimes teams miss out on benefits of automation because they're only focusing on the functional testing aspect of it?
6:05 | Jon Robinson
I literally had this conversation 15 minutes ago. One of the challenges is that a lot of times people start to equate automation with testing specifically, not really considering the fact that there are a lot of other things automation can do for you, bringing time savings and safeguards to the table that aren't the actual testing itself but bring a lot of value. And I think that's something people need to start remembering: it's not just test automation, it's automation in general. We're talking about this whole quality landscape that allows us to really be effective, and I think that's an important factor to consider.
6:50 | Chris Huff
Yeah. I mean, again, just from my seat, it's a little different vantage point in some cases, but I've had QA teams report directly to me at earlier stages of my career as well, and I have a healthy respect for the discipline. I guess how I think about automation is: if I think about the anti-pattern, and I love talking about anti-patterns with my team, so indulge me a little bit, the anti-pattern is you just have a bunch of things that are always at 100 percent. It's always green, or it's finding the same issues that come up every time. And so if you think about signal-to-noise ratio, it's really not signaling anything; it's just noise for the team. Every time, the automation is basically telling the same story. So, aside from code coverage, one of the biggest things I push the team on is the effectiveness of the automation. I mean, if it's not finding bugs, then why are we doing it? Right? And so that's kind of how I think of it. An important aspect is to make sure that everything you're doing in automation is actually adding value and not just testing things that are painfully obvious.
8:02 | Jon Robinson
I’ll throw in there. A failed test is not always a bad thing. In fact, it’s oftentimes a good thing because it tells you your automation is working. I mean that’s exactly what you want out of your outcome.
8:14 | Li Rajaraman
Yeah, absolutely. If I can speak to a real-world example that we are just responding to today: we think of automation as integration with our tools, like the tool we use to monitor crashes, which is integrated with Slack. We have a channel where we monitor it, and it alerts when crashes spike. So that type of automation is also important, where we are responding to real-world issues, not just responding to automation failures within our own ecosystem, but automation integrated with the outside, real world that informs us of what's going on so we can respond to it. So that automation is also necessary, along with having checks in place to review, look at, and respond to those types of information as they come in.
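The kind of crash-spike alert Li describes could be sketched roughly like this. This is a minimal illustration, not her team's actual setup: the webhook URL, spike factor, and crash counts are all made-up placeholders, and the real signal would come from a crash-monitoring tool's API rather than hard-coded numbers.

```python
import json
import urllib.request
from statistics import mean

# Hypothetical configuration: a real setup would load these from the
# monitoring tool and a secrets store, not hard-code them.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"
SPIKE_FACTOR = 3.0  # alert when crashes exceed 3x the recent average


def is_crash_spike(recent_counts, current_count, factor=SPIKE_FACTOR):
    """Flag a spike when the current crash count far exceeds the recent average."""
    baseline = mean(recent_counts) if recent_counts else 0.0
    # The max(..., 1) guard avoids alerting on a single crash after a quiet period.
    return current_count > max(baseline * factor, 1)


def alert_slack(message, webhook_url=SLACK_WEBHOOK_URL):
    """Post an alert message to a Slack incoming webhook."""
    payload = json.dumps({"text": message}).encode("utf-8")
    req = urllib.request.Request(
        webhook_url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


# With a recent average of ~11.5 crashes per interval, 80 is a spike.
if is_crash_spike([12, 9, 14, 11], current_count=80):
    print("spike detected")  # in production this would call alert_slack(...)
```

The point of the check, per the discussion, is that the automation only speaks up when something real changes, keeping the Slack channel signal rather than noise.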
9:09 | Joe Colantonio
Absolutely, great. So: have the right automation at the right place at the right time. I love that. As I mentioned at the beginning, you all have different perspectives based on where you are in your organization, so I have handcrafted questions for each of you to get your insights. The first one: we were kind of teasing John about being a storyteller, but John, how do you craft a compelling narrative to executives, I mean, for investment?
9:34 | Jon Robinson
Yeah. So one of the things you've got to really start to look at is: what is the story you're trying to tell? You can't really craft a narrative if you don't know what the actual plot of that narrative is, right? What are you actually trying to convey? One of the biggest aspects of it, I think, and this is why I talk about it all the time and I'll harp on it till I'm blue in the face, is really having a good strategy in place for what your automation is going to bring to the table, so that you can go into it with a clear picture: okay, this is where we're trying to get to. But it's also really important to make sure that whatever that automation strategy is, it aligns with the actual organization's goals and objectives. If you're not directly contributing to those goals and objectives, then your story is going to fall apart right away. Right? The twist or reveal that happens in your traditional story is going to be: well, the twist here was it's not actually adding any value. Sorry, that didn't land for me. So many people forget that doing automation for automation's sake is not useful, right? If you can't articulate why you're doing automation and the value it's actually bringing, you need to stop, because you're just wasting everybody's time. Now, it may be that you have something of value in there, but you have to understand what you're trying to accomplish with that automation first before you'll understand: do I have anything that's useful, or am I just wasting everybody's time? And, you know, that sounds blunt and brutal, but I went into a place one time and we threw away seven years of automation scripts because they were doing absolutely nothing. They were running, to Chris's point: we had a whole board full of green check marks, and they were putting in a heck of an effort to keep those things running over those seven years. But what they were actually getting, value-wise? It didn't mean anything. So we just scrapped them, threw them away, started over.
11:36 | Joe Colantonio
So, Chris, as a CTO, how does someone pitch something to you? Like, how do you keep informed of what's happening? Do people come up to you and say, hey, we need to change this here? How does someone pitch to you in a compelling way to get your buy-in, to get you to say, okay, all right, go do that?
11:51 | Chris Huff
I mean, it's generally about solving problems, right? Unfortunately, I guess the downside of my role is that I end up having to gravitate toward where the problem areas are, right? The firefighter, just diving into areas that need help. Oftentimes it's not this space at all, but oftentimes it is. I will say, there's been this perception emerging, it's been around for a little while: hey, engineers write automation, they can do all of this stuff, let's just let the devs do all of the quality engineering work. And I really don't love that approach. Sure, the developers absolutely can write the code, but the value-add is the people who understand the craft of quality engineering, people who understand that discipline and are adding to it and solving business problems with it. So the way I look at it, and how I've managed this with my teams, is: you're either just doing QA for QA's sake, or you're leaning into that calculus of solving business problems and creating value through that. If you do that, you become irreplaceable, because devs are not doing that. I was a developer, and I did not like to write my own test scripts, right? It's not something they want to think about. They want to think about their code working; they don't want to have to go in and think about all these different edge cases and stuff like that. Quality engineers love that. They love breaking code. So you have to continue to push the envelope. There's always this tension of speed, performance, quality. You've got to lean into that tension and problem-solve. I'll give you one example from one of my last jobs. We had broken into product teams, and we had QA folks embedded in each team. I love that model. But we were hitting a bottleneck in terms of actually writing the test cases: what are we trying to test? Because we were building a lot of new features, to your point. One of the teams came up with: hey, we need to go and implement specification by example. This will be a way we can move fast, because we're actually testing features with customers as we build them. It was a great way to get the product managers in on the game of actually automating some of this stuff, by using the Gherkin language to go and write these things. And so that actually became a huge problem solved. We had a little bit of repetitive work going on, right? Product managers weren't enjoying writing requirements down, and the QA analysts had to go in, interpret them, and basically duplicate that work. So that's the kind of thing that I love, because it creates value not just for the team but for the business as a whole.
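To make the specification-by-example idea concrete: a product manager writes the scenario in Gherkin-style Given/When/Then language, and the QE team binds each step to code. Real teams would use a framework like Cucumber, behave, or pytest-bdd for this; the toy interpreter below, with an invented shopping-cart scenario, is just a self-contained sketch of the mechanism.

```python
import re

# A product manager writes the scenario in plain business language...
SCENARIO = """
Given a cart with 2 items at $10 each
When the customer applies a 50% discount
Then the total should be $10
"""

# ...and the QE team binds each step phrase to executable code.
STEPS = []

def step(pattern):
    """Register a step function under a regex pattern."""
    def register(fn):
        STEPS.append((re.compile(pattern), fn))
        return fn
    return register

@step(r"a cart with (\d+) items at \$(\d+) each")
def given_cart(ctx, n, price):
    ctx["total"] = int(n) * int(price)

@step(r"the customer applies a (\d+)% discount")
def when_discount(ctx, pct):
    ctx["total"] *= (100 - int(pct)) / 100

@step(r"the total should be \$(\d+)")
def then_total(ctx, expected):
    assert ctx["total"] == int(expected), ctx["total"]

def run(scenario):
    """Execute each line of the scenario against the registered steps."""
    ctx = {}
    for line in scenario.strip().splitlines():
        text = line.split(" ", 1)[1]  # drop the Given/When/Then keyword
        for pattern, fn in STEPS:
            match = pattern.search(text)
            if match:
                fn(ctx, *match.groups())
                break
    return ctx

print(run(SCENARIO))  # the Then step asserts, so reaching here means it passed
```

The payoff Chris describes is exactly this split: the product manager's plain-language scenario is the test, so nobody has to interpret requirements and duplicate them as separate test cases.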
14:48 | Jon Robinson
I'll throw a quick follow-on to yours. One of the things with a lot of people, especially developers, when they're testing: they're testing to make sure they built it correctly and built it to specification, as opposed to, does it do what the customer actually needs it to do and the end user expects it to do? And while we think, on the surface, those two things should line up all the time, that is not often reality, because the mental models of how do I build something versus how do I use something are totally different. And I love that you used your example, because it's exactly why you have the disconnect between developers and QAs a lot of the time: they're literally looking at it through two different lenses, and their perspectives are just totally different. So. Sorry.
15:43 | Joe Colantonio
Yeah, great. So Li, how do you keep your teams, your global teams, aligned? I used to work on a team with eight sprint teams, and that was a disaster. You're global, worldwide. How do you even do it: keep everyone on board, make sure everything's matching up with your core business objectives, and get things done in an efficient manner?
16:01 | Li Rajaraman
I'll answer that, and also follow up on what Chris said, right? We've gotten to a place where quality is not just owned by the quality engineering team; quality is everybody's ownership, right? So we are not that one team trying to break the code anymore. From product owners to product managers to engineering managers to the engineering teams, everybody owns quality. And that's how we are managing change in our ever-changing world. If you are able to clearly articulate, like John said, your root problem and say, I am able to solve this with automation, then we take that route. Otherwise, we are always in shift-left mode: how quickly we can get things tested, how early we can get things tested, and continuing to iterate on that testing, whether it's automation or manual testing. And if you've built up automation frameworks, to John's point again, and you've worked on them for almost three years and you realize that use case is not working for you anymore, that approach is not working for us anymore, be willing to look at it and say, okay, I'm ready to change my approach. Don't be married to the idea of, I've worked on this for three years, I don't want to give up on it just yet. Look at it and say: it is not working. How do we change the approach? How do we get buy-in from engineering teams and engineering leadership to change the approach, evolve the approach, so that it continues to work for the business needs? I think those have been some of our ways of responding to the changes we are bringing in.
18:01 | Jon Robinson
A great analogy that I love using with people: oftentimes they're like, we've got this hole, this problem that keeps getting worse and worse. How do you get out of a hole? What's the fastest way to get out of a hole? Stop digging, right? If you keep digging, it's just going to get deeper. So just stop. It's okay. It's a sunk cost at this point. Let it go.
18:26 | Joe Colantonio
Great point. So you did mention everyone owns quality. I've worked at maybe dysfunctional companies where everyone owns quality but no one owns quality; the quality team still owns quality. How do you actually get there? Because a lot of people see QA, or QA activities, as a cost center, not necessarily as developers adding to the bottom line. So how do you change the culture? Like, how do you get there?
18:52 | Li Rajaraman
Put processes in place that work, right? As simple as responding to a production issue: it used to be that we would hear about it, somebody would toss it to QE, and we would try to reproduce it and respond, right? It's not that anymore. The customer center hears about it; they bring engineering, the product owner, and the quality engineering team into the conversation, and everybody is looking at it together. When more eyes are on the problem, you have more perspective. Somebody knows what change could potentially have broken it, and somebody from the QA team can say, I tested it one way and missed testing it this other way; that's why you are seeing it in production, right? So put processes in place: identify the problem, analyze the root cause, bring in the expert to fix it, and work together with the quality engineering team to put the fix out there, right? It's about having checks in place, implementing the process, and socializing the process. It's not just happening off somewhere in a deep hole; it's out there. The engineering team knows how the process works and how QA is involved. Bring everybody in, socialize the process, and keep at it, right? It's not that you follow the process for a few weeks or a few months and then forget about it. Keep at it, keep improving it, learn from it, make changes. How we adopt the process and how we socialize the process is how we bring everybody in. Thank you.
20:45 | Jon Robinson
It kind of sounds like you’re talking about setting expectations for everybody. It’s such a novel idea.
20:52 | Li Rajaraman
And accountability, right?
20:54 | Jon Robinson
Accountability, Yes. If you don’t have accountability, expectations are useless.
20:59 | Li Rajaraman
Yes, absolutely. Yeah.
21:02 | Joe Colantonio
So, Chris, as a CTO, how do you balance the approach? Yes, you want quality, you want a quality culture, but, you know, your management's coming down saying, we need this out the door, this is a money-making feature, something like that.
21:15 | Li Rajaraman
With that.
21:16 | Chris Huff
Yeah. I mean, there are definitely some upper-level leadership issues there that can lead you down a bad path, right? If I say, hey, go build feature A, and then the team is heads-down building feature A, it's hard to then start talking in terms of a common currency around what the impact of that is, right? Then I've got to surface the risks; I've got to bring that to light. One of the things I want to lead my teams around is outcome orientation, right? What are the goals we're looking for? Because that's ultimately what matters: I want more MAUs, I want happier customers. That levels the playing field, and it becomes much easier to have those trade-off conversations. If I'm achieving the goals, the customers are happy with the product and the performance of the product, and it's achieving what I needed it to do, then I'm good. I mean, you want to keep improving it, but I don't want to hold up value and effort for the sake of something that's not going to create value. So it creates this currency that the QE team can then exchange on, right? Because now I can talk about value creation, or risk to that value. If I've got something out there that's broken and QE has identified it, the status of that issue, especially if it's impacting critical outcomes, becomes massively higher. So that's generally how I approach it. It's a little harder if you're in more of a feature shop, where you're talking in terms of the nouns; it can get a little more difficult. But I think of QE teams as really managing risk for the business. And it's never all or nothing; it's all about balance.
23:09 | Jon Robinson
Balancing the two, yeah. I think having QA people and QEs who have a voice, and feel like that voice can be heard, is very important in an organization, especially when you're asking them to be that voice of quality or the gatekeepers of quality. Having people who are willing to stand up and use their voice, but also having an organization that's willing to listen to those people, is really important.
23:36 | Joe Colantonio
How’s that done? John? Sorry, were you going to say something?
23:39 | Li Rajaraman
Yeah. And the best way I've seen that empowerment for the QE org is by owning release management. From my perspective, it's not just about saying you're good to go, but actually owning the release process along with the engineering team. That has been really empowering: to own the value that we get out of it and truly understand what we should stop something for, and what we can say, okay, this is fine, I know from my experience this is good to be out there.
24:17 | Jon Robinson
I would say, to answer your question, Joe, along the lines of what Li was saying, one of the best ways to do that is just taking ownership in general, right? Well, quality is everyone's responsibility, but at the end of the day, somebody has to own it, take ownership of it, and be that voice of it. It doesn't take much to do that: just being the person who's willing to stand up and say, this isn't good enough, we have to do something different. Sometimes that's enough. But you have to have people who are willing to do that, and the only way they're willing to do that is if, the first time it happens, somebody listens to them. Right? If you give them an excuse and kind of push them to the back, and, for lack of a better word, gaslight them and say, no, you're not seeing reality here, this is actually something different, we don't have to worry about that, then you're lessening the chance of that happening in the future. Even if you don't understand the reasons why they pull the plug and say, we can't do this yet, or this isn't good enough, take the time to understand why. Listen to them, and then come back and have a conversation about it if you have to. But give them that voice, and allow them to be willing to pull that emergency stop and say we have to do something different. I think that's important.
25:40 | Joe Colantonio
Nice. So, I guess in order to pull this all together, you need to have some sort of heartbeat of how the teams are doing and how things are going in production, especially when it comes to automation. I know this is kind of a tricky topic, but metrics: a lot of people care about metrics and ROI. So do you have any metrics for measuring success in automation, especially for how your team's doing with this? Chris, we haven't heard from you in a bit, so we'll start with you on this one.
26:04 | Chris Huff
Yeah, certainly. Code coverage and the effectiveness of the automation are top of mind; I think that's one of the ways to measure it. The way I look at measuring it is: how often is the automation surfacing bugs, creating valuable work to be fixed? Again, if it's telling me the same thing that no one cares about, that's less valuable.
26:31 | Jon Robinson
And are you fixing actual bugs or are you fixing broken tests? Those are different things. And it is something you need to factor into that conversation.
26:40 | Chris Huff
Yeah. A little bit different, I guess, in my approach: I kind of have what I call dashboard metrics and windshield metrics. For me, dashboard metrics are like, if you're driving down the road and you're running out of gas, you probably need to know about that; if your engine's overheating, you probably want to know about that. But if you're staring at the dashboard, you're probably going to run off the road at some point, or at least not get to your destination. The windshield metrics, for me, are always around the consumer, the client, the customer-facing work that we're doing. And I think part of the challenge is that QA and QE teams get obfuscated from that, and that's only to the team's detriment, right? So, as much as possible, take ownership. Don't be afraid to raise your hand and say, hey, you know what, we want this NPS metric as part of how you grade our team, right? Those kinds of things are really cool. It goes back to the first question you asked us, around how do I justify the cost, the value. Well, it's like: hey, the last five customer-facing issues that were core to this NPS metric we were targeting were found by this team. So I definitely look at those first, and then I have a smaller set that I look at that I call dashboard metrics.
28:11 | Jon Robinson
Awesome.
28:15 | Joe Colantonio
Li, any thoughts on how you measure your teams? Any metrics you use that you find helpful?
28:20 | Li Rajaraman
I think it's the issues that we catch: early detection of issues. One thing, as a people leader, I'm always concerned about is time, how much time my quality engineering team can spend on something. In the world of experimentation, A/B tests, features having multiple variants, we just don't have enough time to always test both control and variant. So if my automation can help find impact on the control while we spend manual testing time on the variants, then that's a good metric for me, to be able to say that I'm saving time and can spend my people's time over in these areas. Time, yeah: my people and where they spend their time.
29:13 | Jon Robinson
I would say, for me, a lot of the things Chris mentioned are super useful and important. The one that I care about the most, typically: I don't care how many we find before we go to production. I want to know how many escape into production, and over time, does that improve or degrade? That's what I'm looking at, because it's great that we find stuff before production, but I want to make sure we're finding the things that are going to cause loss of revenue if we get into production with them. I want to make sure we're hitting the right areas before we get into production. So I look for escaped issues, things that actually got past the QA process, and understand what holes we need to plug to make sure we don't miss those kinds of things in the future. That, to me, is my measure of quality: are we getting better at putting out something that is stable, consistent, and bug-free? Because you're never going to be 100 percent.
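The escaped-defect trend John describes can be reduced to a simple number per release. A minimal sketch, with entirely hypothetical release data; the key design choice, matching the later discussion, is to count only high-severity (P0/P1) escapes rather than cosmetic issues:

```python
def escape_rate(found_before_prod, escaped_to_prod):
    """Share of all detected defects that slipped past the QA process."""
    total = found_before_prod + escaped_to_prod
    return escaped_to_prod / total if total else 0.0

# Hypothetical release-over-release numbers; only P0/P1 escapes are counted,
# since cosmetic escapes don't indicate a hole worth plugging.
releases = [
    {"name": "1.0", "found": 40, "escaped_p0_p1": 6},
    {"name": "1.1", "found": 35, "escaped_p0_p1": 4},
    {"name": "1.2", "found": 42, "escaped_p0_p1": 2},
]

rates = [escape_rate(r["found"], r["escaped_p0_p1"]) for r in releases]
# The question John asks is not the absolute number but the direction:
# is the escape rate improving or degrading release over release?
improving = all(a >= b for a, b in zip(rates, rates[1:]))
print([round(r, 3) for r in rates], "improving" if improving else "degrading")
```

Pairing this with a root-cause analysis per escape, as suggested below, tells you which holes to plug, while the trend tells you whether the plugging is working.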
30:18 | Joe Colantonio
Maybe I have seven bugs in there, but they don't impact value or risk. So how do you not use that as a perverse measurement? Like, ah, you got seven that escaped, but…
30:29 | Jon Robinson
Yeah. But I think it also comes down to understanding how those bugs tie back to both your revenue and the business-critical function of the system. Are they impacting your users? Chris made this point earlier, and I think it's very important: I don't care if certain types of bugs get out, right? If we've got a font that's different on this page versus that page, or a button is a different color here versus here, well, are those bugs? Technically, yeah. But are they causing revenue loss? Are they causing any brand harm? No. Okay, that's not the kind of stuff I'm talking about. I want to classify them, but they're going to be low on my priority list. What I'm talking about are those P1 or P0 issues that get out and cause revenue loss, brand harm, or some other business-impacting problems. Those are the things that I care about. And all of them should get some sort of RCA, right? You want to know what's getting out, why it got out, how it got out. All those things are important. But the things that I care about are those P1s and P0s, because that's what's costing me money.
31:45 | Joe Colantonio
Awesome. All right, we have our first user question. Carlos Hernandez wants to know: QA should not only be accountable but also should be seen as a revenue-making entity for the organization. Good, best quality will improve revenue generation; bad quality and buggy releases will impact revenue. We'd like to hear your thoughts on that. Anyone have any strong thoughts around this sentiment?
32:13 | Jon Robinson
I think it's possible for that to be true, but it goes back to the very first thing I said early on: have you aligned your quality and automation initiatives to business goals and objectives, where you can actually tie your efforts back to revenue, increase, decrease, whatever, and to actual business outcomes? If you can say that definitively, great. Making it a global, overarching blanket statement is really hard, because it's hard to quantify; there's no way to prove that. But if you're actually tying it to business objectives and business outcomes, I absolutely love that.
32:56 | Chris Huff
Yeah. And do engineers take the time to understand the business? Like I said, I kind of manage my team this way, and one of the things I've observed repeatedly is that developers love to complain, and I'm lumping QA in with that as well. Developers love to complain about their lack of empowerment: no one asked me if this was a good feature. And then, as soon as we moved to this model, you still get the same comments, like, hey, this was their decision. I'm like, why? You guys should be deciding this together. It's interesting that there's a tendency to want to lean back, and that means not understanding the business.
33:37 | Jon Robinson
You’re nailing exactly what I was talking about earlier: that voice, and being willing to stand by it and say, this isn’t good enough. If you believe that, say it. Put your hand up and say, guys, this does not make sense.
33:48 | Chris Huff
Yep. And look, in my career, the things I love the most are moments like this: when I’m in a team meeting and they’re talking about an issue, and I get a product manager explaining why we need to pause to go pay down tech debt, I’m like, oh man, this is awesome. Similarly, if I hear a QE leader say, these defects don’t matter, right? I love that they understand how the product is used and the impact on metrics. To be able to inform me with confidence like that is just awesome. But it doesn’t come for free, and I think it’s not the gravitational pull for a lot of engineers, myself included back when I was one.
34:35 | Jon Robinson
It’s a personality type; some people aren’t willing to insert themselves into that conversation in that way. They don’t like to put that stake in the ground and call something good or bad. It’s just not something that comes naturally to a lot of people in that field.
34:53 | Joe Colantonio
Nice. Lee, any strong thoughts?
34:55 | Li Rajaraman
While I’m listening, all I was thinking about was app reviews in the mobile world: how high quality your product is directly corresponds to the app reviews. That’s how QE, quality engineering teams, can feel that they are part of the bigger picture, right? Their contributions directly show up in the app reviews.
35:26 | Jon Robinson
Well, one of the things I tell QA people and people in the quality space, whether you’re on the development side of it or not, and it’s especially true in the mobile space: are you creating something that you yourself would actually want to use? Are you willing to let something go out that you would not use? You’ve got this need, and you would not use this product, or you would not like using this product, yet you’re saying it’s thumbs up, good to go? You should at least be raising and highlighting the fact that, well, it technically works, but the user experience here is not great. Guys, really be thinking about that.
36:10 | Chris Huff
Yeah, that’s a great point. I’ll sometimes say this, and it’s kind of a harsh statement, but it gets to the point of what you’re saying, John: users suffer through what we build, right? They don’t have to. So sometimes you just need to sit and read those reviews. Sometimes they’re over the top, but when you read them, you can feel the frustration that your software created. So it’s a great point, Lee, and I think it’s good to keep that top of mind.
36:42 | Jon Robinson
I’ve seen so many places where the developers never use the finished product. They never actually use the product to see how it works. So even if the QA people are raising it, there’s still this disconnect: I hear what you’re saying, but this is actually how it’s built. No, you’re not actually using the product, so you don’t realize it doesn’t work the way you think it works; the way users actually use it is this other way. So yes, you may have technically checked every box and built it exactly to specification, even in an over-engineered way. It doesn’t matter. If people can’t actually use it to do what it’s supposed to do, it doesn’t matter how well you built it. It could be the most amazing piece of crafted artistry you’ve ever built in your entire career, and it could be completely useless to the end customers.
37:37 | Joe Colantonio
Love it. The chat is blowing up. Sunita says, great point. Jason says, eat your own dog food! I love it. So, on that theme of getting the user experience and being aware of how your application is actually being used and perceived in the wild: do you have any suggestions, maybe maturity models or roadmaps, for folks who are at different parts of the journey with automation? When I speak to vendors, a lot of times they either see super hyper-automation users or people who haven’t done it at all. Do you have some sort of mental model you can give folks to help them on their road to automation?
38:18 | Jon Robinson
I feel like you set me up for this question.
38:20 | Joe Colantonio
Yeah, I did.
38:24 | Jon Robinson
For exactly the reason you just talked about, over the years I’ve started to put together this very high-level model, mostly targeted at leadership in larger organizations, to help them understand where they’re starting from a QA standpoint and where they’re trying to get to. It goes from: you don’t have any QA or testing, to manual QA, to some automation, to a bit more automation where it’s roughly a 50/50 split, to some CI/CD going on, and finally to a full CI/CD, continuous delivery, continuous deployment model. That’s great, but within that there are seven different phases you have to go through, and each one has its own life cycle of success and failure that you have to work through. It’s about understanding where you are in there, and that the end of one stage does not mean the end of your journey; you’ve got more you can do to help the organization, whether that’s the organization as a whole or just your team. Understanding that there is a continuum there, that it’s not a black-and-white answer.
39:30 | Joe Colantonio
How do you make sure all the teams in your organization are up to the same level? I assume different teams will be at different levels. Is that something you monitor or have to keep a close eye on?
39:41 | Jon Robinson
I mean, part of it is having, again, some sort of strategy in place to set goals and objectives for the organization. And being at different levels may be entirely okay, right? It may be that some teams are more advanced than others and ready to move on to different levels. But having something that is your guiding light or north star to keep you on track, regardless of what team you’re on, is important, because otherwise you’re just going to flounder. You’ll say, well, they’re doing this, so why aren’t you guys doing that? Well, because we’re not there yet. We’re not ready for that, we don’t have the skills for it, or the product isn’t at that stage yet. You’ve got to have something that can give that some context, and I think that’s important.
40:23 | Joe Colantonio
Lee, any thoughts?
40:25 | Li Rajaraman
Like John said, it’s all right to be at different levels. What matters is being one engineering team. You can have pillars and domains within the engineering team, but as one engineering team with some sort of standardized process, you can then fit people in wherever they are, whether manual testing is still needed or not. Some teams might be doing really well with a shift-left process; some, like with personality types, right, might not be there.
40:58 | Jon Robinson
They’re at the far end of the right side of the spectrum and they’re only testing in production, baby. That’s it, that’s all we’re doing.
41:03 | Li Rajaraman
Yes. So, some form of standard process that solves the why, the root problem of being one team. And another thing we’ve experienced: don’t make decisions in a silo as a QE organization. That helps also. You can be at different levels, but don’t make decisions on tech stacks or processes in a silo.
41:32 | Chris Huff
And Chris? Oh, yeah. I mean, with these experts here, I agree with all that.
41:40 | Joe Colantonio
Love it. All right, we have three minutes left, so a random question: AI, overhyped, underhyped, or properly hyped? Lee, what do you think on AI and how it can maybe help QA, QE?
41:53 | Li Rajaraman
I’m personally in research mode; my team is in research mode. We are looking at how it can help, and we are slowly starting to get into that area. But at this point, very much research mode. I can’t say whether it’s overhyped or rightly hyped. Not yet.
42:18 | Chris Huff
Yeah. So I think it was overhyped for a long time, and now it’s starting to deliver. It feels like it should be a step-function change, but I don’t think we’ve seen that step-function change yet. Like with anything, the capability is there, but now the solutions that actually meet consumers’ needs have to be built, and I think that’s happening in small steps. One point on this: with anything, when things commoditize or automate, you can either fight it, push back against it, or embrace it to create value. And the teams that embrace it to create value are the teams that win out, right? If you fight it, you’re going to get displaced by it. If instead you say, great, I don’t have to do this anymore, like my teams using Copilot for unit tests and even some other kinds of broader tests, then the attitude becomes: awesome, how do I push the work I was doing into this automated, lower-level realm so that I can do higher-order work? Because if you’re layering in higher-level work, then you can do what we were talking about at the beginning: you can create value and demonstrate that value more. Otherwise, you’re just going to be seen as being displaced by this technology.
43:59 | Jon Robinson
To me, to Chris’s point, it was overhyped for a long time. And now that it’s here, there are a lot of things you can do to get value out of it. I challenge organizations that don’t have QA leadership to use it to ask: how do I do QA better? It’s actually quite good at that. How do I create automation? It can actually help you build that automation and help non-technical people quickly start to contribute. I think those things are incredibly valuable. And like Chris said, the people who embrace it are going to be successful, and the people who fight it will likely end up finding out that they should have embraced it.
44:36 | Joe Colantonio
Awesome. Thank you, John. Thank you, Lee. Thank you, Chris, for joining us today. And thank you, Kobiton, for putting together this excellent event and for having us cover these important topics. Thank you, everyone, for joining us, and hopefully we’ll see you here next time. As always, test everything and keep it good. Cheers.