Webinar

Shifting Left from Anywhere: Testing Mobile Apps Faster with a Dispersed Team

Abstract

Watch this on-demand session as Li Rajaraman, Sr. Manager of QA at WeightWatchers, joins Kobiton’s Matt Klassen to discuss how to successfully shift left in QA—even with a globally dispersed team.

Learn how Li’s team overcame the unique challenges of mobile app development, such as accessing real devices, to deliver higher-quality apps faster. Discover practical strategies for fostering collaboration, optimizing workflows, and achieving excellence in QA, no matter where your team is located.


Video Transcript

Matt Klassen: I'm super excited for this session, "Shifting Left from Anywhere." Again, I'm Matt Klassen, I run marketing at Kobiton, and we've had a great lineup today, including this session. We're going to learn a lot about WeightWatchers' progression in their journey around testing and test automation for mobile. I think it's going to be really interesting, because they've had a shift as an organization. If you think about WeightWatchers, they've had a shift in business strategy that has created not only an opportunity but a criticality around mobile for their business, and in fact it's shifting the demographic of their customers and consumers. We're going to dig in on that. So before we get into the meat of the discussion, Li, could you introduce yourself and your role at WeightWatchers?

Li Rajaraman: For sure, and thanks Matt for that intro. I'm Li, from WeightWatchers. I'm part of the quality engineering org at WeightWatchers. I own quality for a lot of the member experiences in our applications, mobile and web, and for the coaching experience, which is part of a service that our members consume. I lead a team of highly technical, full-stack QEs who work on both front end and back end: UI testing, integration testing, and all of that good stuff. So that's my role at WeightWatchers.

Matt: Thank you for that intro. You said something that really interested me, and I didn't catch it when we were preparing for this. Today's event is called the Mobile Testing and Experience Summit, and when you talked about what you own, you said "member experience" and "coaching experience." Talk a little bit about what that word "experience" means to you from a QA perspective at WeightWatchers. I'm curious, so I'm going to dig in on that a little.

Li: My understanding is that the experience is what our members go through in our applications. It's mostly a black box for them; they don't know what happens behind the scenes. What they want is to use the app in their day-to-day as part of their journey toward better health and fitness. They're tracking their activity, their food, their weight, and all of those areas of their life, and they want a good experience tracking that and moving toward their personal goals. That's what our mobile applications do: provide features to keep track of those things and move members toward their goals.

The coaching experience is a lot of internal tools. We have workshops, both in-person and virtual, and our coaches use these platforms to create the events, see which members will be joining their workshops, and so on. So it's a lot of internal applications: the members are interfacing from the mobile applications, and the coaches are responding to those requests and events from web experiences. Two different worlds.
Matt: So talk a little bit about how mobile is at the heart of WeightWatchers' transformation. I think you called it a digital transformation. We're getting a picture of your constituents, your users: you have coaches and you have members. But what is the shift that's occurring digitally, and how is mobile part of that?

Li: A lot of our experiences, the new features, are built for mobile. When a product owner is thinking about building something for members, they think of building it in the mobile applications. It has become mobile-forward, iOS or Android depending on where the demographic is, but the first choice has become mobile for WeightWatchers. And from my understanding, engagement with the WeightWatchers experiences happens through mobile. You can sign up and enter the program through web, but most of the engagement and retention happens in the mobile experiences.

Matt: Is that also partly due to the demographic? WeightWatchers has an image that's been shifting, and I think you're shifting that image, but has the demographic shifted too, in terms of the age of the participants or members? Is that something you're interested in?

Li: Yes, somewhat, and ease of use. Most people today have an iPhone or an Android phone. It's easy to get on the program and experience it while you're in between other things, instead of going to a laptop, opening it up, and doing it on the web. So I think it's ease of use: you can set reminders, you get notifications and act on them, and so on. It's a combination of moving toward that demographic and ease of use.

Matt: That makes sense. So how crucial is mobile app testing to ensuring the success of your digital initiatives, as you shift to this new experience through mobile?

Li: Like I said, we think of mobile as our first choice whenever we're building something new, so it's definitely important for us to deliver a high-quality experience: high usability, easy-to-access features, all of that. We're also moving toward an A/B testing approach, making small tweaks to existing features or launching new features as an A/B test. So it's even more important for us to ensure we're not breaking the control experiences, and that the variant, if it succeeds, performs well and doesn't cause any issues. That's how quality comes into play: we're always working with two different experiences within a feature set, so that means more QE, more eyes on quality, more hands-on QE.
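Editor's note: Li mentions that variants are largely tested manually while automation protects the control experience. One common way to keep automated runs anchored to the control is to pin the app to it at launch. The sketch below is illustrative only and is not WeightWatchers' actual implementation: it assumes an Appium/XCUITest setup, and the launch-argument name the app reads is hypothetical.

```python
# Illustrative only: pin automated runs to the control experience of an A/B test.
# The launch-argument name ("-experimentOverride") and how the app consumes it are
# hypothetical; the real mechanism depends on the app's feature-flag setup.
from appium import webdriver
from appium.options.ios import XCUITestOptions


def control_variant_driver(server_url: str, app_path: str) -> webdriver.Remote:
    """Create an iOS driver whose app session is pinned to the A/B control variant."""
    options = XCUITestOptions()
    options.platform_version = "17.4"   # example OS version
    options.device_name = "iPhone 15"   # example device
    options.app = app_path
    # XCUITest can pass launch arguments to the app under test; here we use a
    # hypothetical flag the app would read at startup to force the control variant.
    options.set_capability(
        "appium:processArguments",
        {"args": ["-experimentOverride", "control"]},
    )
    return webdriver.Remote(server_url, options=options)
```

A nightly job could instantiate a driver like this for the business-critical suite so results are not skewed by in-flight variants, while variants themselves stay in manual, exploratory hands.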
Matt: That makes a lot of sense. I think one of the things a lot of organizations struggle with as they try to scale QA, especially for critical mobile apps, is finding the right approach around automation and around shifting left. There are a lot of decisions: methodologies, frameworks. Can you describe how WeightWatchers approaches mobile app testing today?

Li: We're getting to a phase where we're moving more toward a shift-left approach, and we've been formalizing and standardizing it since the beginning of the year. It was always shift left in a way, in that QE would start early in the cycle; now we have more functions, like engineering, project management, and product owners, also contributing to QE. Especially engineering: they're more involved in the QE process, and they test their changes before those changes even get merged.

We use a combination of manual and automated testing. Like I said, with A/B testing we don't want to invest in writing automation for variants that may not succeed, so those are primarily tested manually, and we have good automation coverage across the board, whether it's back end or front end. We follow the test pyramid approach: test more at the component level early on, and do more end-to-end testing toward the end of the cycle, when features are stable and integrated. So more automation at the component level, plus UI tests and integration tests, is the approach we're moving toward, with very low-maintenance UI tests that only capture business-critical use cases. Tests that will find issues in the mission-critical, business-critical use cases: that has been our approach.

Matt: There are a couple of things I want to touch on. We had talked before about potentially doing a poll; I'm not going to be able to do it, just FYI, so we don't interrupt the flow. We're going to skip that. But there are two things I want to touch on, and the first one is shift left. As we were planning this session there were a lot of aspects we could have put in the spotlight in the title, but we ended up selecting shift left because I think it's a really important part of your story. What does that really mean? Dig in a little bit more on what the shift-left approach is, and maybe get into the impact of it as well.

Li: We were in a phase where you have to test everything, always: regression testing was needed, only because the mobile experiences were evolving and we were trying to group them under pillars. Manual regression testing was the case until we got to this phase. Now that we're a more stable, evolved technical organization, the framework has come to be that developers also test their code changes. Quality is owned by all; quality is not just QE's ownership, the QE org's ownership. Everybody should worry about quality, and that's part of the engineers' work stream: when you make a code change, you don't break anything while merging it into the big picture. So developers test their code at the feature level, targeted testing to ensure nothing is broken, and it's checked through code review and the merge process. Once the code is merged and the change is available, QE goes in and performs surgical manual testing, again understanding what the impact is and where QE is needed, because in most organizations the QE org is much smaller than the engineering org. There's only so much bandwidth to spread across the areas you want to cover, so be very intentional and conscious about where that effort is spent. That's what our shift-left approach is. Write tests that matter, that will find issues, P0 issues, that will reduce manual testing spent on something. Make an impact: if you're writing code, even a single line, it has to make an impact. If you're performing manual testing, the same applies: use your time only where it matters, not just because we as QEs learned that regression testing, exploratory testing, you have to test everything all the time or you'll be blamed if something breaks. That's not the case anymore. Use your time well and make an impact: that's what our shift-left approach has been. And we leverage our automated tests toward the same goal: whether it's a code change made by an engineer or QE doing the work, we run the mission-critical use cases and let our automated tests do the same for us.
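Editor's note: to make the "low-maintenance UI tests that only capture business-critical use cases" idea concrete, here is a minimal sketch. The transcript does not name the framework, so this assumes an Appium plus pytest setup with a pre-configured `driver` fixture, and the accessibility IDs and screen flow are hypothetical.

```python
# A deliberately small, low-maintenance UI check for one business-critical flow.
# The accessibility IDs ("track_food_button", etc.) are hypothetical; stable,
# accessibility-based locators are what keep a test like this low maintenance.
import pytest
from appium.webdriver.common.appiumby import AppiumBy


@pytest.mark.p0  # tag business-critical coverage so it can be selected on its own
def test_member_can_track_a_food_item(driver):
    """P0 flow: a signed-in member searches for a food item and logs it."""
    driver.find_element(AppiumBy.ACCESSIBILITY_ID, "track_food_button").click()
    search = driver.find_element(AppiumBy.ACCESSIBILITY_ID, "food_search_field")
    search.send_keys("banana")
    driver.find_element(AppiumBy.ACCESSIBILITY_ID, "search_result_0").click()
    driver.find_element(AppiumBy.ACCESSIBILITY_ID, "log_food_button").click()
    # One assertion on the outcome that matters to the member, nothing more.
    confirmation = driver.find_element(AppiumBy.ACCESSIBILITY_ID, "food_logged_toast")
    assert confirmation.is_displayed()
```

Keeping each test to one user-visible outcome, with no incidental assertions, is what lets a small suite stay reliable enough to run on every build.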
Matt: Can you talk a little bit more about the automation side? What do those automated tests look like? Anything you can share around frameworks, around executing against simulators and emulators versus real devices, what that environment looks like, and how you've scaled automation?

Li: Sometime in 2020 or 2021 is when we started looking at how we leverage our automation frameworks, the iOS and Android mobile automation frameworks, and we went through a sort of transformation in how we build the framework, who is going to use it, and who is going to contribute to it. Even before writing tests, we looked at what that framework should be, how it should serve us, and where and when we run our tests. We usually write automation against physical, local devices, but when we execute them in our nightly runs or for app release sign-offs, we leverage Kobiton for iOS and another platform for Android. We use device farms to expand our device and OS matrix and find those issues that are unique to particular OS versions or device form factors.

The automation frameworks are integrated within the dev repositories, so both engineering and QE have access to the automation frameworks and the code repositories; they don't live separately. That's something we consciously decided: we will not write a framework that sits outside the repository, where engineering has no visibility into why something is broken or how to root-cause an issue. We're integrated within the code repositories, and we run our tests against the CI/CD pipeline. We started that early on as we built the framework; it's one of the things we took care of. We use GitHub Actions, and we set up our tests to run against the nightly builds. When we have a release candidate, we execute our automated tests against that release candidate as a final seal, saying we have confidence in this build to release it to the App Store and Play Store. So we were very conscious about how we built the framework and where and when we run it, to give us maximum gains.
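Editor's note: with an Appium-style framework, pointing the same tests at a provider-hosted real device for nightly and release-candidate runs usually comes down to swapping the server URL and device capabilities. The sketch below is a minimal illustration, not WeightWatchers' configuration: the environment variable names, hub URL, and capability values are placeholders to be taken from the device-farm provider's documentation.

```python
# Minimal sketch: run the same Appium tests on a cloud device farm instead of a
# local device by swapping the server URL and device capabilities.
# KOBITON_USER / KOBITON_API_KEY and the hub URL are placeholders; the exact
# endpoint and capability names come from the provider's docs.
import os

from appium import webdriver
from appium.options.ios import XCUITestOptions


def device_farm_driver(device_name: str, platform_version: str, app_url: str) -> webdriver.Remote:
    """Create a driver bound to a provider-hosted real device for nightly or release runs."""
    user = os.environ["KOBITON_USER"]
    api_key = os.environ["KOBITON_API_KEY"]
    hub_url = f"https://{user}:{api_key}@api.kobiton.com/wd/hub"  # placeholder endpoint

    options = XCUITestOptions()
    options.device_name = device_name            # a device/OS combination from the test matrix
    options.platform_version = platform_version
    options.app = app_url                        # nightly build or release-candidate artifact
    return webdriver.Remote(hub_url, options=options)
```

Because only the driver construction changes, a nightly GitHub Actions job and a local run can share the exact same test code, which is what keeps the framework inside the dev repositories useful to both engineering and QE.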
Matt: That makes sense. When you talk about the device farm, how did you figure out which device configurations, and how many, hit the sweet spot for your organization? Let's focus primarily on iOS, though you can talk about Android too.

Li: We leverage our data, member usage data, to decide what to run and what to test with. Our data tells us to run our tests against the most used, most popular device and OS combination; the lowest supported version, only because it is supported and we do have users on it; and somewhere in between, maybe a problematic OS version or the second most used one, things like that. We look at analytics data every single time. When we're talking with Kobiton about upgrading an OS version on a device, we look at that data and decide based on when a version's adoption will increase or has decreased, and then we make a call on updating our devices. So it's data.

Matt: That's good. I want to go back to shift left a little, because it sounds like you've been successful there. It sounds like from the very beginning you knew your approach would only succeed if the quality organization worked together with dev, and you've made some purposeful choices that obviously seem to be working. But when we were talking, you mentioned there's still skepticism here and there in the dev organization, whether it's "why am I doing testing," not wanting to do testing, or doubting it will make an impact. Talk a little bit about your experience there and how you overcame that and worked with dev to move past it.

Li: Two things came up. As part of the shift-left discussion, we piloted the approach with a combination of crews, one on mobile and one on web and back end, to see where it was working well and what changes we should make before formalizing it more widely. As part of shift left, we wanted to integrate validation tests, a small subset of our test suites, into the merge queue process: they run against any PR that's getting merged, and if the tests fail, the PR is kicked out of the queue and the author has to look at why that happened. The question from the mobile engineering team was, why should we do that? How do I even know the failure matters?

When we started running our nightly tests regularly, we started tagging the issues we found from failures caused by the automated tests, and we deliberately didn't build any automation to create issues; to this day we create those issues manually. Somebody on the QE team looks at the failures and says, okay, this is a genuine failure, I'm going to log a defect against this function, this area. So we had that data going into the discussion, to show that these are the issues found over six months that we found and fixed, each with an associated code change. The framework is working; it's finding issues for you. If you integrate it with your merge process and for some reason the tests fail, there's a real possibility it's a genuine failure that your code change caused somewhere else. That data helped us have that conversation and get buy-in for integrating our tests and moving testing earlier in the cycle, even before code gets merged into our feature and develop branches. Automation, and our process of tagging defects, drove that conversation.
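Editor's note: the merge-queue gate Li describes can be reduced to "run the tagged validation subset against the PR build, and block the merge if anything fails." The sketch below is one way that could look with pytest; the marker name, the custom `--app-build` option (which would need a matching `conftest.py`), and the report path are illustrative, not WeightWatchers' setup.

```python
# Sketch of a merge-queue gate: run only the tagged validation subset against the
# PR build and block the merge (non-zero exit) if any of those tests fail.
# The "p0" marker, the --app-build option, and file paths are illustrative.
import subprocess
import sys


def run_validation_subset(app_build: str) -> int:
    """Run the business-critical subset against a PR build; return pytest's exit code."""
    result = subprocess.run(
        [
            "pytest",
            "-m", "p0",                      # only the tagged, business-critical tests
            "--maxfail=1",                   # fail fast so the PR author gets quick feedback
            f"--app-build={app_build}",      # hypothetical option a conftest.py would consume
            "--junitxml=validation-report.xml",
        ],
        check=False,
    )
    return result.returncode


if __name__ == "__main__":
    # A CI job in the merge queue would call this with the freshly built PR
    # artifact and propagate the exit code to accept or reject the merge.
    sys.exit(run_validation_subset(sys.argv[1]))
```

The months of manually triaged, defect-tagged nightly failures are what make a gate like this credible to engineers: a red run is treated as a likely real regression rather than noise.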
Matt: That's awesome. Data is key; I think we all know that, but data speaks for itself most of the time. I'm going to move our discussion forward a little, because we have about five minutes left and there are a couple of questions. So to net it out: what are some of the key KPIs? How do you know you're moving in the right direction? What do you measure? You don't have to share the actual numbers, but what are you measuring, and what kind of impact are you making as you progress through this transition?

Li: In what context, while using our frameworks?

Matt: If you were just to look at your organization, how do you measure your organization's success? Let's put it that way, really simply: the QE org.

Li: I'll start with our automation frameworks. The KPIs that matter to us roll up under the tech org's KPIs as well. We want to be agile, nimble, flexible. If we release something and it fails, we fail fast, roll back, and repeat; that's our approach to what we deliver. If we want to do that, then the quality process has to follow the same principles. We're not too rigid about the processes when using the frameworks, or about how we use the data we get. What we write should make an impact. If our frameworks are stable and reliable and run every single day for any code change, then we're only responding to genuine failures, and that rolls up into how engineering responds to the issues we find. We're on a two-week release cadence, so we have to move fast: every other week we're releasing, so our sign-off process has to be quick. And we need to make sure we're not disrupting engagement in the member experience with what we deliver; that's key for us. Our tests should go toward protecting the normal P0 flows. So those are the metrics, roughly: stable, reliable, move fast, and write code for what matters.

Matt: I think that's a really good summary of best practices. Again, if you have any questions, I'm going to pause for a minute; go ahead and put them in the Q&A section. There was a question, but I think it was answered during the discussion, so I don't think we need to ask that specific one. I will ask a more general one in that area: what security considerations do you have as you're thinking about mobile quality engineering? Are there any particular frameworks or compliance requirements you look for that are especially important in how you address security?

Li: I'd say we don't use our frameworks for security testing as such, but we do take precautions. For example, if we're using the public devices on the Kobiton platform, we make sure we don't leave behind any test-environment data; that's something we're very conscious about. We're also conscious about production data: we don't run our tests against production data, or anything that would consume production data, PII and so on. So when we set up our frameworks, we're conscious about which environments we test against and what ends up out there when we use external platforms to extend our test execution capabilities. Does that answer it?
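Editor's note: the two hygiene points Li raises, no production data or PII in test runs and nothing left behind on shared cloud devices, are often handled together with synthetic test data and an explicit teardown. The sketch below assumes a pytest fixture and an Appium `driver`; the provisioning helper and the bundle identifier are hypothetical stand-ins, not part of WeightWatchers' framework.

```python
# Sketch: keep runs PII-free and leave nothing behind on shared cloud devices.
# create_test_account() and the bundle id are hypothetical stand-ins; the point
# is synthetic data in, wiped data out.
import uuid

import pytest


def create_test_account(email: str, password: str) -> None:
    """Hypothetical stand-in: provision a throwaway account in a *test* environment."""
    # Replace with the framework's real provisioning call; intentionally a no-op here.


@pytest.fixture
def synthetic_member(driver):
    """Provide a synthetic, non-production member and wipe app state afterwards."""
    email = f"qa-{uuid.uuid4().hex[:12]}@example.test"   # no real member data involved
    password = uuid.uuid4().hex                          # throwaway credential
    create_test_account(email, password)
    yield {"email": email, "password": password}
    # Teardown: end the session and remove the app (and its locally stored data)
    # before the shared cloud device is released back to the pool.
    driver.terminate_app("com.example.memberapp")        # bundle id is a placeholder
    driver.remove_app("com.example.memberapp")
```

Generating credentials per run also keeps tests independent of any shared, long-lived test accounts, which tends to reduce flaky failures on top of the privacy benefit.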
Matt: That's a good answer. I made the question semi-generic; I took one that was more specific and generalized it, and I think those are good considerations. Here's another question, and you can choose how much of it you can answer: is there any particular reason you use separate device services for iOS and Android?

Li: The short answer is that the Android engineering team wanted to stay within the Google Android ecosystem. When we were doing the tech selection for device farms, we wanted to be on the same platform for both, but our process showed us that one service worked really well for one platform versus the other. What we got out of test execution, reporting, and dashboarding showed us it was okay to be on different platforms, so we chose what worked best for each platform, iOS and Android.

Matt: The last question I have isn't necessarily specific to Android versus iOS, but is there a case to be made for testing against real devices versus just emulation and simulation?

Li: As a QE, I would say yes, because that's where our users are going to be; they're not going to use the application on simulators and emulators. It depends on how the experience is implemented, and that's how we decide, but the QE in me wants to say: even if it's hosted in the cloud, test with real devices. So yes.

Matt: Great. Thank you very much; that's all the time we have, and people are transitioning to the next session, but I really appreciate your time. I think it was a very good session and an education for everyone, so thank you.

Li: Thank you, Matt. Thanks a lot for the conversation.

Ready to accelerate delivery of your mobile apps?

Request a Demo