
Championing Automation: How Quality Leaders Can Drive Organizational Support
In this presentation from the Mobile Testing & Experience Summit, Joe shares insights from his latest book, Automation Awesomeness. Drawing on interviews with over 500 top Test Engineers, Joe uncovers the key secrets to mastering automation testing and advancing your career. Learn practical tips and actionable strategies you can implement right away to enhance your testing skills and achieve success in the fast-evolving world of automation.
Automation Awesomeness: 260 Actionable Affirmations to Improve Testing Skills
Unlock the secrets to mastering automation testing and advancing your career with insights from Joe’s latest book, Automation Awesomeness, and interviews with 500 top Test Engineers.
0:00 | Adam Creamer
and we are in the home stretch here. So we have a final keynote from Joe Colantonio over at Test Guild. I know his name has come up a little in some earlier sessions today, and I think I saw him poking around the chat as well. So I'm going to play him in right now, and then we will have some live Q&A at the end. So feel free to be active in the chat, drop in questions as they come, and we'll chat to you guys in about 26 minutes, I believe.
0:50 | Joe Colantonio
Hey, welcome to my session today. I want to share with you some of the insights I've learned from writing my latest book, Automation Awesomeness: 260 Actionable Affirmations to Improve Your QA and Automation Testing Skills. In this session, I'm going to share some frameworks based on key insights from some of the smartest end-to-end testers, including folks from Kobiton, whom I've had the privilege of interviewing on my Test Guild podcast. For those that don't know, I'm Joe Colantonio, founder of Test Guild, an online education platform dedicated to helping testers succeed with all things automation testing. Also, four years ago I decided to sell everything I had in Rhode Island and start fresh on a hobby farm in East Tennessee. So to make things fun and memorable, I want to tie some of the concepts from this book, and what I've learned from hobby farming, to key concepts that can help you with your automation and your career. But first, a special message by Bowie… more about him later. So this is one of my favorite quotes of all time from my podcast, and that is: nothing is more soul-crushing than having a test suite that is all over the place and just randomly fails. Anyone else here feel that way? Be honest. I know I have. So I want to share with you a framework I think is going to help you feel this less often. And this groundbreaking framework is called the PIGS framework. I'm going to break down each area of the framework. In the first phase of the PIGS framework, think about the GOATS method: Getting the whole team involved, Operationalizing a framework, Anticipating and setting expectations, Testability, and Selecting the right tools. A common theme throughout many of my interviews on the automation podcast, and in the book, is this whole-team approach.
And the quote that probably sums it up best is by Lisa Crispin, who says in episode 218: we need a whole team, we need diverse skill sets and perspectives, we need to work together. And it sounds simple, but this is one of the hardest things I've seen over my 25-year career: getting the whole team involved. Not just saying this one person is responsible for testing or automation, but getting the whole team to take responsibility for all aspects of software development, with testing being one of the most critical pieces. And once you have your whole team on board, you want to start getting them involved with planning the high-level part of what your framework is going to be. This isn't necessarily the nitty-gritty of the tooling yet; it's more the principles: who's going to be involved, what the process is going to look like, who's going to be responsible for remediation if a test fails, what teams will be involved, what the expectations are. And I think Gumesh says it best: when starting or defining a framework, you should set up some guiding principles of what it's all going to look like when automation is finally implemented. Hey, did you know that chickens also make great testers? Why? Because they always find bugs.
3:55 | Joe Colantonio
Another quote that summarizes a key piece of advice I would give anyone is to get your boss to invest in having an automation architect. This is someone who sits outside of your sprint teams, who doesn't necessarily do all the automation, but who drives the initiatives. They look at your teams, they see what each team needs: whether they need education, whether they need the right tooling, what that tooling looks like, and also the processes and best practices that fit your teams. Having an automation architect who drives that from the top down to all the teams is really beneficial. Rather than having, say, eight sprint teams with each team doing their own thing, you really want a high-level architect who is driving the vision for what the effort is going to look like and managing it from the top, making sure everyone's on board and that any roadblocks a team member faces are addressed and brought to the top, so the automation architect can work it out with everyone and make sure automation thrives in your environment. So here's my wife, Shale. On a hobby farm, you really need someone who knows all the animals and what their needs are, who can tell you: here's the food they should be fed, this is when they should be fed, this is how you clean out all the stalls, this is when they need their shots. Without my wife, this would not happen. She's pretty much the architect of the farm. She doesn't necessarily do all the work; she'll delegate to me. But I go to her to know what the best practices are and what I should be doing to make sure all the animals' needs are being met. And this is one thing that drives me nuts, and where I see a lot of teams fail: they try to build their own frameworks, their own solutions, rather than looking at a vendor-based tool that already has it packaged and ready for them to go.
Especially if you have a team full of testers, not necessarily developers. And even if you have developers, they don't necessarily know how to create a framework that's going to be successful for automation. This is one thing a lot of studies show as well: building a custom testing solution is labor-intensive, and the World Quality Report suggests 63 percent of companies struggle to quantify their return on investment because of this. So don't spend your time developing tools; look to solutions. It doesn't have to be vendor-based. It could be an open source solution that is already built and ready to go for you. Another key aspect of making automation successful in your organization is testability. The problem is when your software is not designed from the beginning to be both testable and automatable, when developers aren't thinking about testability from the start as they write the code. You really need your developers to have this mindset from the beginning: how can I make this testable? How can I make this automatable? Can I create a unique ID that's going to make it easier to automate later on? Is there an API I can expose to my testers so they can quickly understand and populate data, or get things in the right state before they automate, to make automation quicker, faster, and more scalable? Developers need to be thinking about that while they're writing the code from the requirements, so that when it finally gets to a tester or an automation engineer, testability and automatability are already built in. I can't highlight this enough. This is key to any successful automation project, and hopefully you'll pay attention to it as well. Some other key advice I've heard over and over again from my guests on the podcast is: don't jump to selecting an automation solution right away.
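As a minimal sketch of the unique-ID idea above, assuming a web app where developers bake a dedicated test attribute into their markup (the `data-testid` attribute name and the helper function are illustrative, not from the talk):

```python
# Sketch: a stable "data-testid" hook baked in by developers makes locators
# trivial and resilient. Attribute name and helper are illustrative.

def by_test_id(test_id: str) -> str:
    """Build a CSS selector for an element tagged with a unique test ID."""
    return f'[data-testid="{test_id}"]'

# A developer writes markup with the hook from the start:
checkout_button = '<button data-testid="checkout-submit">Buy now</button>'

# The automation engineer never depends on brittle layout-based selectors:
print(by_test_id("checkout-submit"))  # [data-testid="checkout-submit"]
```

Because the hook is independent of styling and page structure, a redesign doesn't break the locator.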
Consider which solution can bring the most robustness to your automation, at a speed that aligns with your goals and your team's needs. I always get asked: Joe, what's the best automation tool I should use? Is it a vendor tool? Is it open source? The answer is always different, because each team and each technology stack is completely different. So where Selenium may be a great fit for one team, it may not be a great fit for another. If the people doing your automation are mostly JavaScript front-end developers, maybe Cypress is a better choice for them. If you have more testers with Java experience, maybe Selenium with the Java bindings is best. Maybe you have Python developers, and using pytest is the better approach. You really need to look at your tech stack and your team's skills before you decide on a tool. And obviously all this applies to all types of automation frameworks, including mobile app testing frameworks. Since this is the Mobile Testing & Experience Summit, here's a piece of advice that applies to frameworks for mobile testing as well. It's by Shannon Lee from Kobiton, who said it's critical to choose a mobile app testing framework based on your unique needs and your app's key aspects, including its performance on iOS and Android. All frameworks have pros and cons: native frameworks like XCUITest and Espresso are fast but can be unstable and challenging to maintain, while Appium is ideal for cross-platform testing but tends to be slower than the native options. So you want to evaluate the pros and cons of both types to make an informed decision. Okay, so here's an example of what happened when we didn't think through our plans and said yes to adopting a calf, which is similar to a team jumping at a tool or process before really thinking it through. A few years ago we adopted a calf named Beatrice. Her mother had rejected her, and the farmer was moving and had to sell all his cows.
So he asked us: hey, do you want to adopt her? Unfortunately, without thinking, my wife and I immediately said yes. Keep in mind, neither of us was raised on a farm. We both grew up in a city, and we had no idea what taking care of a cow would involve, but we fell in love with this calf. We fed her with a bottle and built her a makeshift stall. B was cute and intelligent, and we sometimes wondered if she believed herself to be a dog rather than a cow. She also loved kids. Every time my nephew and niece would visit, B would run up to them excited and want to play, to everyone's amusement. The problem was: Beatrice grew.
9:50 | Video Voice
How did you get up here?
9:55 | Joe Colantonio
And grew and grew. The once-cute playfulness turned dangerous, especially for my young niece and nephew, as B's weight shot up to about 500 pounds and a 500-pound cow was trying to play with them. Adding to our problem, because we never planned to get livestock, our property wasn't fully fenced. This became an open invitation for B to occasionally wander off our property into the streets and become a real nuisance to our neighbors. Our neighbors still say: oh, you're the couple with the pet cow. As she grew, herding her back home became more and more difficult. So, faced with these challenges, we had a tough decision. It was clear we lacked the infrastructure to accommodate a full-grown cow. Selling B was out of the question; she was a pet, not destined for the slaughterhouse. Luckily, we found a rancher seeking a cow for breeding with his bulls, and he promised not to eat her. So we conceded, comforted by the thought that she would have a fitting home. So what does this have to do with automation? Well, as I mentioned, the point is to be careful before choosing a tool. We chose an animal for our farm without looking at our situation, or even seeing if it actually made sense for us to have a cow. So the next phase is the Initiate phase, and when you think initiate, think tractor. When it comes to test selection criteria, here are the two things you should focus on first: risk and money. Focus your automation efforts on the parts of your application that, if they were to fail, would cause your customers harm or would cause a loss in business or money. Those are the areas you want to focus in on. A lot of times, what I know is a red flag is when I talk to a team and they say, we have 20,000 Selenium tests, and you look at them and they're all testing pretty much the same thing. They're not tied to risk and money.
If you're not focusing on the areas of your application that are risk-based, or the areas that make your company money, then your automation testing effort is not going to be successful. So why is that a red flag? Because when someone tells me they have 10,000 Selenium tests, I know they're not focusing on the risk and money areas of their application. Quick story: a few years ago, I thought, hey, it'd be a good idea to get my wife an egg incubator. And what do you think happened? You're probably right: she went on a duck-hatching frenzy. It's just like automation testing. Sometimes people get a tool, they start off with one test, and then they go crazy trying to automate everything without even asking: is this adding the company value? If this test fails, is it really giving me a signal that something is wrong? Is this focusing on risk? Is this focusing on money areas? Without doing that, they just end up creating tests for the sake of creating tests, just like my wife did with these ducks. If you don't know, ducks are the most disgusting animal on the farm. Their pen is nasty and hard to clean. It's just like tests: when you have too many automation tests, it turns into a maintenance nightmare, so you want to avoid this as much as possible. And just like the ducks: the more tests, the more code, the more maintenance. So this is one of the only ducks that was ever born in the wild; my wife hatched every other one of them. And like the picture kind of demonstrates, you also want to keep your tests atomic, or as small as possible. This also goes to, I know it's overdone, the test pyramid: a majority of your tests you want to be unit tests, which are the smallest. Then only once you say, okay, we can't test this at the unit level, do you go to an integration test, which gets a little bit bigger.
You say, okay, can I cover this with an integration test? If so, stop there. If you can't, then move on to the higher-level end-to-end test flow. You're not going to get rid of all end-to-end tests; you need them. But if you can handle it at a lower level, with a smaller, quicker type of test, you should go with that before you develop a full-blown end-to-end automation test for it. And once again, I hear this on my podcast all the time: you want to make sure your tests are atomic, meaning they're independent and don't have any dependencies on other tests. You want to write tests that are atomic, simple, and fast. You want to make your tests as independent as possible, so that no matter where you run them or in what environment, each one is self-contained and can run on its own, because it is designed to be resilient. Another thing that gets testers in trouble when creating automated tests, which I hear from a lot of guests on my show, is tool usage. You want to use the right tool for the right job. If I need to do a full-blown performance testing effort, I would not use Selenium; I would probably use more of an enterprise performance testing solution. And a lot of times I'll use an API testing tool to populate data rather than a functional UI tool. So any time you're writing a test, make sure you're using the right tool at the right layer. If an API call can get the job done, don't use Selenium to do it; use something like REST Assured or Postman, something that was meant to do API testing. So one of the reasons we chose these sheep (this is Dolly and Dutch) is that Katahdin sheep, this particular breed, don't need to be sheared. They just rub up against trees and their fleece comes off on its own. We built our farm to be as self-maintaining as possible, so that we don't have to do this ourselves; it's built into the farm. Same thing with your tests.
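The right-layer idea above, seeding data through an API call instead of clicking through the UI, might be sketched like this; `FakeUserApi` is a hypothetical stand-in for whatever data-setup endpoint your app exposes, not a real service from the talk:

```python
# Sketch: arrange test data at the API layer (fast, reliable) so the test
# only exercises the behavior under test. FakeUserApi is an in-memory
# stand-in for a real data-setup endpoint.

class FakeUserApi:
    """Stand-in for an app endpoint that creates users."""
    def __init__(self):
        self.users = {}

    def create_user(self, name: str, role: str) -> int:
        user_id = len(self.users) + 1
        self.users[user_id] = {"name": name, "role": role}
        return user_id  # the test can reference this user immediately

def test_admin_user_exists():
    api = FakeUserApi()
    # Arrange through the API instead of driving a signup form in a browser.
    user_id = api.create_user("pat", role="admin")
    # The UI step would now only verify the one behavior under test.
    assert api.users[user_id]["role"] == "admin"

test_admin_user_exists()  # atomic: no dependency on any other test
```

Because the test creates everything it needs, it can run in any order and in any environment.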
You want to make them as maintainable as possible, and build maintainability into your framework as much as you can. Another way to do that is to narrow the scope of what you're trying to test. You want to divide things up as much as you can; this is one of the things that allows you to have more maintainable tests. Here's a quote from Kevin Lamping, advice a lot of people have given me: if you're not using things like page objects, the screenplay pattern, or other such patterns, figure out what they are and start using them. They'll make your tests immensely better, more readable, and more maintainable. Another way to make your tests more maintainable is making them as readable as possible. This is one of the most overlooked parts of automation, I think. Readability is huge, and a lot of people on my show have mentioned it as well. This one is from Cory House, who had great advice: when you're writing code, write with the reader in mind. Don't write your code for the computer; write your code for your coworkers. Also, I've noticed in my career that I spend most of my time not writing tests but maintaining tests, troubleshooting tests, debugging tests, and one way to help speed that up is to make them more readable. Now, this might be controversial, and you've probably heard it a lot: hey, automation code is software development. That's true, but you have to keep in mind it's not production code. There's a big difference. The way you design production code is completely different from the way you would design automation code. You want to follow best practices and make things as readable as possible, but you don't have to over-engineer your automation the way you do production code.
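The page-object advice above might look like this in practice; `FakeDriver` is a stand-in for a real Selenium WebDriver so the sketch runs without a browser, and the locators are hypothetical:

```python
# Sketch of the Page Object pattern: one class owns a page's locators and
# actions, so tests read like sentences and locator changes touch one file.

class FakeDriver:
    """Minimal stand-in for a WebDriver: records what was typed/clicked."""
    def __init__(self):
        self.actions = []

    def type(self, locator, text):
        self.actions.append(("type", locator, text))

    def click(self, locator):
        self.actions.append(("click", locator))

class LoginPage:
    """Tests never touch raw selectors; they call intention-revealing methods."""
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "#login-button"

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.type(self.USERNAME, user)
        self.driver.type(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)

# The test reads like a sentence; if a locator changes, only LoginPage changes.
driver = FakeDriver()
LoginPage(driver).log_in("joe", "s3cret")
```

With a real driver you would swap `FakeDriver` for a WebDriver instance and map `type`/`click` onto its element calls; the test itself would not change.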
And Ramona really highlighted this in one of the podcasts when she mentioned that test code isn't production code, yet it holds equal value. That's not saying it doesn't matter; however, conventions like the don't-repeat-yourself (DRY) principle are not always suited for test code, and over-abstraction can really compromise readability. So aim for clarity; don't over-engineer your automation tests. Also, when you're running your tests, here are four key areas to focus in on, the four big causes of failures. The first is not having a good locator strategy. The second is not understanding synchronization. The third is test data: not having the proper test data for your tests. And the fourth is environments: not having the right environments to run your tests in. Another piece of great advice came from Jonathan Wright on one of my podcasts, where he rolled out what he called an automation scorecard. This is how you can tell how you're doing with your automation: is it maintainable? Is it relevant to the business? Does it have clear traceability? Is it reusable? Is it manageable and scalable? And can it be accessed across the company? We don't have time to go over the whole automation scorecard, but to make it easier, I created a free calculator for you to actually calculate how well you're doing and what areas you may need help with, along with tips. All you need to do is go to testguild.com, go to tools, and click on the automation score quiz. It will walk you through each area of the scorecard so you can rate yourself, and at the end you'll get a score you can monitor over time to see if you're getting better, plus tips on how to achieve a better score based on how you answered in each area. All right, let's dive into the third stage, which is the donkey method, and we're going to go over each of these areas as well to help you with the execution piece. So you really want to make your tests.
So they should act almost like safety nets. When a developer checks in code, they have the confidence to know that if a test fails, it's failing for a real reason, and that they'd better look at the code they checked in and fix it before it actually gets committed to production. If your developers don't have this confidence, if testers don't have this confidence, then this is something you need to look at. Here's a quick example. I know at the beginning of last year there was a run on eggs; prices soared by 60 percent. We have a bunch of chickens that free-range, and we get eggs every morning. So while this was going on and people were complaining, all I had to do was walk out my door, grab an egg, and cook it. Basically, these chickens act like a safety net: if we ever have a shortage, I don't have to worry, because I have a plan in place and they're giving me good value and good results. Same thing with your tests. They should give you a feeling of safety: if something fails, it's failing for a real reason. You also want to observe and listen to your tests, so that if something fails, you pay attention to it. If for some reason you have a test that's failing all the time and everyone ignores it, I would just delete it, because obviously it's not adding value. Your tests should be adding value, so that when they fail or are flaky, you go: oh, let me look into that, because that's probably highlighting a real issue. I don't know if you've ever seen an alpaca, but any time an alpaca is nervous, it starts humming, and I always listen and go: okay, there's something going on that's making them nervous. Or these guinea fowl: when the guinea fowl are in the yard with the chickens and they see a hawk or a fox or something, they start making the loudest noise in the world, and the chickens stop, take cover, and hide.
If the chickens didn't listen to the guineas, probably a lot of them would die to a predator. But because they listen to the guinea fowl and take the alarm seriously, it saves their lives. Same thing with tests: when a test fails, there needs to be enough confidence in it that your developers and your testers take it seriously. So listen to your tests when they fail. That is one indicator of whether your tests are really adding value: if they do fail, everyone listens, because they know the tests add real value and something is probably really wrong. Another area of the execution phase a lot of people miss out on is the ability to include non-functional tests in your automation suite. A lot of times people focus just on end-to-end functional tests, and that's great, but you're missing out on probably 60 percent of the value of other types of tests you should be running: performance testing, security testing, accessibility testing, resilience testing, chaos engineering. All these other tests you should be including in your test suites as well. And a lot of companies now are coming out with solutions that leverage some of your UI tests to add extra functionality, so a functional test can also give you performance data or security information. You also want some way to measure your team, not something arbitrary like 90 percent automation, something ridiculous like that. You need measurements that demonstrate the right things for your organization. All organizations have different metrics, different things that matter to them. Same for you: not all test business intelligence means the same thing to all people.
So make sure you also have really actionable KPIs that encourage the right behaviors for the things your organization needs to measure, to let you know how you're doing from sprint to sprint, build to build. It really should be used to make efficient, guided business decisions, not create overhead, overwhelm, or unnecessary data. Another thing people struggle with, which I hear over and over again on my podcast, is environments. A lot of times people try to run a test on an environment that isn't like production, or that doesn't have the test data, or they're using an environment that is shared across teams. So the sales team is doing a demo in an environment that developers are pushing to and testers are testing on, which causes chaos. Save yourself a lot of time and invest in the right environments for your tests to run in. One way to do that is to use ephemeral environments. Ephemeral environments solve the "it works on my machine" problem. So one suggestion I highly recommend, if you're not doing it already, is using something like an ephemeral environment that can spin up and spin down. A lot of times people have a known good state of their infrastructure: when an automated test run starts, it spins up a new instance of these ephemeral environments, the tests run and use the data, and when the run is done, it tears the environment down. When the next test run comes, it gets a fresh environment from a known good state. This will save you a lot of time with maintenance and reliability issues as well. Having on-demand environments helps you with a lot of things: scalability, environment consistency, isolation, version control and rollback, and simulating different scenarios. You can also use these environments for destructive testing without impacting anyone else.
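The spin-up and tear-down cycle just described can be sketched as a context manager; in practice the body would shell out to something like Docker or Terraform, so the environment here is simulated as a plain dict to show the shape of the idea:

```python
# Sketch: an ephemeral test environment. Each run gets a fresh copy of a
# known good state and is always torn down, even if the test fails.

from contextlib import contextmanager

@contextmanager
def ephemeral_environment(known_good_state: dict):
    env = dict(known_good_state)      # fresh copy from a known good state
    env["status"] = "up"              # spin up (simulated)
    try:
        yield env                     # the test runs against this instance
    finally:
        env["status"] = "torn_down"   # tear down, even on test failure

KNOWN_GOOD = {"db": "seeded", "version": "1.4.2"}

with ephemeral_environment(KNOWN_GOOD) as env:
    assert env["db"] == "seeded"      # every run starts from the same state

# Nothing leaks into the next run: the instance is gone after the block,
# and KNOWN_GOOD itself is never mutated.
```

The `try`/`finally` is the important part: tear-down happens whether the test passes or raises, which is what keeps the next run's environment clean.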
So if you're doing chaos engineering or reliability engineering type activities, this is a great solution for you. And examples of how to get a good ROI on your tests are to look at speed, cost savings, enhanced productivity, and improved test coverage. All right, the last phase of the PIGS framework is the RAKE method, which focuses on Refactoring, Analyzing, Keep learning, and Enjoying automation awesomeness. Here's a common theme on the podcast as well: make sure you're refactoring or deleting tests that aren't adding value. Any time you're writing a test, you should ask yourself: is this test adding value? Is this older test still adding value, or is it just a checkbox? Ask yourself: is this actually adding business value? If it's not, delete it, because it's only going to add more noise and more maintenance effort. So make sure you delete anything that's not adding value, and always ask yourself, any time you're running or writing a test, is this adding value? Melissa Tandy mentioned as well: always look for ways to improve. So you should always be analyzing your tests. A lot of times I see companies put a process in place and never change it. It's static, but processes are meant to grow: you learn from production, build that learning back into the sprint teams, and slowly change things to get better and better over time. The only way to do that is to analyze and always look for ways to improve. And if you find yourself repeating a task, say adding data manually or testing the same type of scenario over and over again, take a few seconds to ask yourself: is this something that can be automated? Improving little things over time, automating little things over time, is going to save your team a lot of time in the long run.
So it's not just about automating functional tests; anything that makes your software development lifecycle quicker and faster, I highly recommend automating as well. And as testers, you know we always need to keep learning. Angie Jones highlighted this, and many other people on my podcast have highlighted it too: if you're a tester, staying current and up to date on your skills is really critical. Make an effort to try out different tools and techniques to keep your knowledge current. You can experiment with new tools and techniques during your nine-to-five job, or create personal projects in your free time. Great advice from Angie Jones. And just by watching this session and this event, I already know you're in the top one percent of people who are always learning, so keep it up. So that is the PIGS framework: Plan, Initiate, Go, and Streamline. As a next action, hopefully you enjoyed this session; if you did, I highly recommend you check out the book, Automation Awesomeness. You can get it on Amazon at theautomationbook.com, or if you don't have Amazon in your country, you can get a PDF version at testguild.com/automationbookpdf. Thanks again for joining. And as always: test everything, and keep the good. Cheers!
27:04 | Adam Creamer
That was a great session. I loved all the analogies and dad jokes thrown in there. Gotta love it. Yeah, as you kind of touched on at the end, if anybody is interested, in the Docs tab here we do have a link to Joe's book as well as some other Test Guild resources, so please check those out. Joe has been working with us, and I've worked with him in past lives at other companies as well, and there's some good stuff over there. So Joe, I see one question in here so far, and if any more questions come through, please pop them in the Q&A and we'll be sure to get those answered. From your buddy Chris: testability is a huge topic, so do you have any words of wisdom for getting buy-in to allow time for that, especially when there are layers of integration that are needed? Yeah, you know, the
27:51 | Joe Colantonio
biggest way to do it is to demonstrate how much value it's going to give. I think of when I was working for a company that had a really old tech stack with a mainframe in the back. I just made friends with a developer and got him to open up an API to do something, to make one little piece easy for me to automate. And once I created a test to demonstrate how that let me automate things and made their lives easier, he started adding more and more hooks into the back-end code to make it more testable. So do a small POC, or get a quick win to show people, because that's the hardest thing to do with the culture: if you can show someone how you can help them out and show that value, then in my experience it ends up really opening things up. So I know it sounds low-tech, but make friends with your developers, try to get some buy-in, and show them the value; show them that it's not only benefiting you but benefiting them as well.
28:46 | Adam Creamer
Awesome. Yeah, I think that kind of goes back to your whole farm analogy as well: sometimes it helps to start with little things first. You started with a couple of animals here and there and kind of built up.
28:59 | Joe Colantonio
Yeah, exactly. Great point. And also I wanted to point out that a lot of this also relates to mobile testing. This is the summit for mobile testing, but this applies to any kind of testing; any of these principles would still apply even if it's mobile testing.
29:12 | Adam Creamer
Yeah, absolutely. And I know you have a nice library of a bunch of people you spoke to there. So if people do check out the book, and I know Chris looked like he shouted it out in the chat, there are a lot more tips in there, I'm sure. Excellent. Any other questions from the team? I know this was a great session, and Joe never disappoints.
29:37 | Joe Colantonio
Thank you for that. Well, thank you, Woody. Thank you, Nafseef. Thank you, Matt. Good crowd. Adrian, awesome. Cristano, Derek, good to hear from you all. Any questions? Let me know. But like I said, there’s a lot of stuff in this book, and what surprised me is that a lot of it is soft skills. People always think technical when they’re dealing with anything, but a lot of times the answer, surprisingly, and I’ve spoken with people at all levels, tends to be more about culture or getting the team involved. So one of the key findings was that it’s not necessarily a technical issue that sinks people, but a lot of times the culture or the team that’s involved in the effort.
30:18 | Adam Creamer
Yeah, that was actually something that came out of the Bluebeam session we had earlier during the breakouts too: they were able to figure out a solution for their support issue because their culture allowed them to collaborate and be innovative like that. So yeah, those soft skills and that ability to work together are super important.
30:38 | Joe Colantonio
Absolutely. Often overlooked, for sure. I was also surprised by some of the sessions. I’d have to go with session one, I think it was the mobile testing trends one, and how they talked about how a lot of people think manual testing will go away with AI. I was just surprised by that. Interesting. So on things like AI, once again, we have a lot of people like Jason Arbon who talk about AI, and even though AI is great, I don’t think it’s going to replace anyone. I know that’s a common question. You probably get asked it too, Adam, when you do these types of sessions.
31:07 | Adam Creamer
Yeah, absolutely. I know that’s something, I mean, Frank could talk me under the table about AI all day, but it’s one of those things where it’s going to turn you into a super tester rather than take over, because AI is really good at tiny little tasks. But when you tell it, hey, go look at this whole app, it has challenges. So I think taking some of the mundane nature out of those tasks can help you focus on more interesting testing scenarios.
31:36 | Joe Colantonio
Absolutely. So Adam, I may have missed it: are there any cool AI features in Kobiton that are coming out or that are going to help specifically with mobile testing?
31:46 | Adam Creamer
Ooh, yeah, you’re setting me up here. Yeah, there are definitely a few features on our side. One we actually released previously is the self-healing functionality. I know that’s been big in the testing world for a while, especially when you’re trying to run across multiple different devices. If you’re testing on a Galaxy and then you run that test on a Pixel 6, there’s a different screen resolution and the element may move around, so it’s about being able to identify that. And then, like I said, Frank and our SEs can probably talk about this better than I can, but using AI to do some of the validation work while you’re doing a manual test is beneficial as well. If anybody was able to join the tips and tricks session earlier from Billy, he touched on accessibility, but there’s also functional testing and performance testing and things like that which AI can help run for you simultaneously while you’re going through your normal manual session. And I know there’s a lot out on the horizon as well.
32:52 | Joe Colantonio
Oh, that’s cool. So you’re doing exploratory testing in the background to give an insight: hey, I noticed this accessibility issue or something like that. Sounds very nice.
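[Editor’s sketch] The self-healing idea Adam describes can be illustrated with a minimal, hypothetical locator: when the primary attribute recorded at capture time (such as an element id) no longer matches on a new device, fall back to other recorded attributes. All class, field, and attribute names here are invented for illustration; this is a sketch of the general technique, not Kobiton’s actual implementation.

```python
# Minimal sketch of a self-healing locator strategy (illustrative only).
# Idea: record several attributes when the test is captured; at runtime,
# try them in priority order until one uniquely identifies the element.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Element:
    attrs: dict  # attributes as seen on the current device/page

@dataclass
class SelfHealingLocator:
    snapshot: dict                    # attributes recorded at capture time
    priority: tuple = ("id", "accessibility_label", "text")

    def find(self, page: list) -> Optional[Element]:
        # Try each recorded attribute until exactly one element matches.
        for key in self.priority:
            want = self.snapshot.get(key)
            if want is None:
                continue
            matches = [e for e in page if e.attrs.get(key) == want]
            if len(matches) == 1:
                return matches[0]
        return None  # nothing healed; report a genuine failure

# Usage: the id changed on the new device build, but the accessibility
# label survived, so the locator "heals" instead of failing outright.
page = [
    Element({"id": "btn_42", "accessibility_label": "Submit", "text": "Submit"}),
    Element({"id": "btn_43", "accessibility_label": "Cancel", "text": "Cancel"}),
]
loc = SelfHealingLocator(snapshot={"id": "btn_submit", "accessibility_label": "Submit"})
found = loc.find(page)
print(found.attrs["id"])  # prints btn_42
```

Real tools weigh many more signals (position, neighboring elements, visual similarity), but the fallback-in-priority-order shape above is the core of why a test keeps working when a Galaxy layout shifts on a Pixel 6.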
33:00 | Adam Creamer
Exactly. Yeah. There’s Billy in the chat: self-healing scripts. Love it.
33:05 | Joe Colantonio
Love it.
33:06 | Adam Creamer
Excellent. I know we have about a minute and a half here, and I saw you poking around some of the chats earlier today as well. Any other takeaways in general from the event, or feedback on any of the other sessions you saw?
33:20 | Joe Colantonio
No, just like I said, I was surprised by some of the trends with AI. It’s good to see that in mobile. It always seems like mobile is lagging a little behind general functional automation, but based on what I’ve seen in a lot of these sessions, it looks like mobile automation is catching up. That was nice to see, and I’m happy to see that you all are invested in it, because I think it’s something a lot of people struggle with.
33:41 | Adam Creamer
For sure, yeah, absolutely. And I think that’s something, just from looking around the industry, you’re definitely seeing a lot more people dipping their toes into mobile. Being able to do that well is always important and tricky, as Frank mentioned, but hopefully it’s becoming easier.
33:57 | Joe Colantonio
For sure.
33:59 | Adam Creamer
Excellent. Well, thank you again, Joe. If any other questions do pop up, I know you’re hanging out in the chat, and feel free to jump around any of the booths as well. We have our closing remarks coming up here, and, yeah, couldn’t be happier to have you with us, Joe.
34:14 | Joe Colantonio
Thank you. Good to be here.
34:15 | Adam Creamer
Appreciate it. There you go. Love the soundboard. Awesome. Thanks, Joe.