Spring OnlineTestConf 2020 - Program

Speakers

Sessions & Schedule

May 19th

Sessions are in EDT (USA)

When is this in my timezone? 

 

Alan Page and Brent Jensen have been talking about “Modern Testing” for nearly 5 years.

 

It began as a way to talk about what they were seeing as they led and worked with software teams – but the stories they told, and the Modern Testing Principles were quickly recognized by many as a name for things that were already happening.

 

In this talk, Alan will share some of these stories – some will be his own, but most will be from those who have discovered Modern Testing in one way or another and made the most of it when improving their own organizations.

 


 

Alan Page has been improving software quality since 1993 and is currently the Director of Programs for Monetization Services at Unity. Alan spent over twenty years at Microsoft working on a variety of operating systems and applications in nearly every Microsoft division, and also spent two years as Microsoft’s Director of Test Excellence. Alan blogs at angryweasel.com, rants on Twitter (@alanpage), hosts a podcast (w/ Brent Jensen) at angryweasel.com/ABTesting, and on occasion, he speaks at testing and software engineering conferences.

 

When is this in my timezone?

 

I work at a unique company, where quality is at the core of our development process. Since we serve the emergency healthcare industry, we can’t afford bugs, as they could be dangerous to real humans. What is our secret to maintaining high quality in a fast-paced start up atmosphere where product areas are broad and include mobile, web, and a public API? Communication. In a healthcare emergency, poor communication can literally mean death. In software testing, poor communication can mean a drastic decrease in product functionality and usability.

 

Effective communication across teams is key to quality. In this session, I will present three actionable ways individual testers can increase communication at their companies.

 


 

Rachael Lovallo is a software tester driven by a passion for using technology to create positive social change. She currently puts her skills to work as a Senior Test Engineer at Pulsara, a software company that connects healthcare teams to improve patient outcomes and bring acute healthcare into the 21st century.

 

Rachael is a tester to her core, persistently seeking software bugs and bullet-proof steps to reproduce them. She finds daily fulfilment collaborating with Pulsara’s development team to build a healthcare communication system that safely handles patient data, is intuitive to use, and truly improves patient outcomes.

When is this in my timezone?

 

In past Olympic Games, we watched as Michael Phelps did something no other Olympian had ever managed to accomplish: he won more gold medals than anyone before him. We spent weeks watching as he won gold medal after gold medal.

 

It’s easy to imagine that athletes such as Phelps are born winners. It’s easy to think that such greatness is in their DNA. What many fail to realize is that while Phelps showed us his masterful swimming for a few weeks, he prepared for those events over many months and years. He spent day after day practicing, refining his techniques, modifying his strategies, and improving his results.

 

Being an expert tester is no different. While the art and craft of testing and being a thinking tester is something built within you, simply going to work every day and being a tester is not always enough. Each of us has the opportunity to become a “gold medal tester” by practicing, studying, refining our skills, and building our craft.

 

In this presentation, we will evaluate extracurricular activities and practices that will enable you to grow from a good tester to a great tester.  

 

Key Takeaways:

1. Inputs from the testing community on how they improve their skills

2. Suggestions for online training and materials which should be studied

3. How to leverage social media to interact with the testing community

4. Contributions you can make to the testing community to build your brand

 


 

Mike Lyles is a Director of QA and Project Management with over 25 years of IT experience in multiple organizations, including Fortune 50 companies. He has exposure in various IT leadership roles: software development, program management office, and software testing. He has led various teams within testing organizations: functional testing, test environments, software configuration management, test data management, performance testing, test automation, and service virtualization.

 

Mike has been successful in career development, team building, coaching, and mentoring of IT & QA professionals. Mike has been an international keynote speaker at multiple conferences and events, and is regularly published in testing publications and magazines. His first published motivational book, “The Drive-Thru is Not Always Faster”, was released in 2019.

 

You can learn more about Mike at www.MikeWLyles.com, where you can also find his social media links and connect with him.
His book site is at www.TheDriveThruBook.com .

When is this in my timezone?

 

Lisa, Gitte, Alex and Lena sit down to discuss their different experiences of growing a culture that allows people, software, and business to flourish.
Join us for a panel discussion around topics like continuous learning, psychological safety, having a growth mindset, and modern leadership.
The discussion will start with a few prepared questions and then continue with questions from the audience.

 


 

Lisa Crispin is the co-author, with Janet Gregory, of More Agile Testing: Learning Journeys for the Whole Team (2014), Agile Testing: A Practical Guide for Testers and Agile Teams (2009), the LiveLessons “Agile Testing Essentials” video course, and “Agile Testing for the Whole Team” 3-day training course offered through the Agile Testing Fellowship. Lisa was voted by her peers as the Most Influential Agile Testing Professional Person at Agile Testing Days in 2012. She is a testing advocate working at mabl to explore leading practices in testing in the software community. Please visit www.lisacrispin.com and www.agiletester.ca for more.

 

Lena Wiberg has been in the IT industry since 1999, when she got her first job as a developer. In 2009, after a decade of code, she found her calling in testing. Since then she has worked in most testing-related roles, from lone tester in a team to building and leading testing organizations. She believes continuous improvement is something we should all strive for by keeping up to date and always challenging ourselves, our assumptions, and the way things are done.
She is one of the directors on the board of the Association for Software Testing and an avid blogger, speaker, and workshop facilitator.
Lena lives outside of Stockholm, in a big house filled with gaming stuff and books with her family. 
She is currently working as an Engineering Manager at Blocket, Sweden’s largest marketplace.
You can find her at http://www.pejgan.se or on twitter as @LenaPejgan

 

Alex Schladebeck is a passionate tester whose favourite topics are quality, agility and humans. She is CEO and Head of Quality at Bredex GmbH. In these roles, she supports colleagues, customers and teams on their journey to better quality – be it in products, in processes or in their communication. 

In previous roles, she was responsible for enabling teams and growing quality. Now she enables others to do that work, and works on nurturing a system in the company where everyone can flourish. 

Alex views the world through the curious eyes of a tester and loves learning new things. She shares her knowledge and experience in workshops, coaching sessions and as a speaker or keynote speaker at conferences. 

Find her on Twitter: @alex_schl

 

Gitte Klitgaard is an agile coach, hugger, friend, and much more. She lives and loves agile. She took the oath of non-allegiance. Why fight over methods when we can use the energy to help people? Gitte wants to change the world by helping people. Her preferred tools are listening, intuition, and caring. And: the retrospective. Inspecting and adapting is essential.

She has a great interest in how people function, psychological safety, courage and vulnerability, how the brain works, what motivates us, how we can feel better about ourselves, and how to be perfect in all our imperfections.

She is a geek and passionate about a lot 🙂

Find her on Twitter: @nativewired

 

May 20th

Sessions are in CEST (EU)

When is this in my timezone?

 

In this hands-on track, you’ll explore very simple systems to discover surprising emergent behaviours. You’ll find dynamic systems that (under some conditions) resonate, explode, jitter or simply die. We’ll look at the ways that systems produce such behaviours, how to trigger them, how to observe them, and how you might be able to regulate them in the complex systems we build. You’ll find this track useful if your system is more than the sum of its parts.

Bring a device with a browser, and have fun with systems analysis.

Suitable for: anyone who works with systems, and particularly suits those who build or adjust systems.

Takeaways

* direct experience of complex behaviour emerging from simple parts 
* insight into damping and amplification of behaviours 
* recognition that these simple systems are displaying their nature, not a pathology

 



James Lyndsay is an independent consultant, specialising in systems testing, helping people to make informed and practical decisions about their testing. Teams use him to find surprises, to adapt their approaches, and to keep their testers interested. Organisations use him to build communications between testers and the board.

A regular keynote speaker and teacher at international events, and an active participant in a variety of testing communities, James has written award-winning papers, built the Black Box puzzles, kicked off the TestLab, and run the London Exploratory Workshop in Testing. He received the 2015 European Tester Excellence award.

Twitter: @workroomprds 
Site: https://workroom-productions.com

When is this in my timezone?

 

Sometimes you’re asked to start testing in a context that is not ideal: you’ve only just joined the project, the test environment is broken, the product is migrating to a new stack, the developer has left, no-one seems quite sure what’s being done or why, and there is not much time. 

Knowing where to begin and what to focus on can be difficult and so in this talk I’ll describe how I try to meet that challenge.

I’ll share a definition of testing which helps me to navigate uncertainty across contexts and decide on a starting point. I’ll catalogue tools that I use regularly such as conversation, modelling, and drawing; the rule of three, heuristics, and background knowledge; mission-setting, hypothesis generation, and comparison. I’ll show how they’ve helped me in my testing, and how I iterate over different approaches regularly to focus my testing.

The takeaways from this talk will be a distillation of hard-won, hands-on experience that has given me

* an expansive, iterative view of testing
* a comprehensive catalogue of testing tools
* the confidence to start testing anything from anywhere

When is this in my timezone?

As testers, we typically focus on metaphors from engineering and manufacturing and learn from the related disciplines, yet there is so much to gain by learning about other industries and disciplines.

 

The aviation industry has had a lot of bad press recently, yet there is still much we can learn from it in terms of quality culture. Aviation has a history of continually learning and improving through the use of checklists, black box recorders, blameless culture, cockpit re-design, and Crew Resource Management (CRM).

 

In this talk, the stories of the origins of these innovations will be shared, based on the findings of two significant books that have focused on the aviation industry: The Checklist Manifesto by Atul Gawande and Black Box Thinking by Matthew Syed.

 

– The relationship between culture and quality
– Willingness to learn from other industries and disciplines, and apply the lessons learned
– Key learnings from the aviation industry that you can apply in your testing role

 


 

Conor Fitzgerald is a Quality Advocate with 15 years of experience. He is passionate about whole team testing and working with teams on quality improvements. Currently, he is working as Head of Testing for Poppulo in Cork, Ireland. He has spoken at a number of conferences in recent years, including SoftTestDublin, TestBash, and RebelCon.

Conor is an active member of the test community and is a Co-Founder of the Ministry of Testing Cork.

Previous positions included Test Consultant, Test Lead/Manager and SDET. These positions were held in a wide variety of industries from embedded systems to financial systems with companies ranging from startups to large multinationals such as Intel.

Occasionally blogs at conorfi.com

When is this in my timezone?

 

How can we improve our testing, automated or not, by looking at trends and patterns?

We’ll look at some interesting findings, and at how root cause analysis could make the impossible feasible.

 

– Insights from looking at your test results over time
– Common anti-patterns
– Trends to look for and what to do about them

 


 

Lena Wiberg has been in the IT industry since 1999, when she got her first job as a developer. In 2009, after a decade of code, she found her calling in testing. Since then she has worked in most testing-related roles, from lone tester in a team to building and leading testing organizations. She believes continuous improvement is something we should all strive for by keeping up to date and always challenging ourselves, our assumptions, and the way things are done.
She is one of the directors on the board of the Association for Software Testing and an avid blogger, speaker, and workshop facilitator.
Lena lives outside of Stockholm, in a big house filled with gaming stuff and books with her family. 
She is currently working as an Engineering Manager at Blocket, Sweden’s largest marketplace.
You can find her at http://www.pejgan.se or on twitter as @LenaPejgan

 

May 21st

Sessions are in AEST (AUS).

When is this in my timezone?

In this session, Jason details the existing quality challenges and problems at Freelancer, the world’s largest freelancing and crowdsourcing marketplace.

 

He will present three main strategies that transformed quality engineering in the organisation, elaborating on them through case studies. Attendees will leave knowing how to apply the three strategies for changing the quality engineering mindset and culture within 30 days, and will learn to identify key initiatives in transforming quality engineering in a web-domain organisation.

 

Attendees will also learn about the challenges faced when rolling out these transformation strategies in the organisation and how to overcome them. Finally, they will be able to apply the three strategies to transform quality engineering in their own workplace within a short period of time: 30 days.

 


 

Jason Lee has been practising in the software testing field for more than 14 years, both on the research and industry sides. He started his career as a Software Test Engineer with Motorola Technology and iWOW Pty. Ltd. He then took up a PhD research scholarship to further advance his passion in software testing at The University of Melbourne. His PhD thesis focused on spectral debugging, a white-box testing technique that uses test coverage information to help programmers locate bugs effectively.

He joined Dolby in 2011, where he was involved in leading QA efforts and hands-on test automation development. He worked on mobile application and embedded software testing, especially set-top boxes for Amazon and Google.

He recently joined Freelancer as Director of QA, overseeing the quality transformation for the organisation. He leads a team of 26 quality engineers across the organisation and actively provides quality and testing best practices to development, product teams, and all key stakeholders.

When is this in my timezone?

 

What is testability?
Why should you consider testability?
Why is it difficult to improve the testability of your application, especially for E2E test automation?

 


 

Takuya Suemura is a speaker, blogger, and test automation specialist at Autify, the E2E test automation platform. He is passionate about removing as much complexity as possible from E2E and cross-browser testing, and about helping all agile teams automate their acceptance-level tests from day one of development.

When is this in my timezone?

So the mantra goes: in today’s development world, testing is the responsibility of the whole team. But for 90% of organizations (or more), this does not really translate into practice, and testing is still mostly the responsibility of the testers.

 

There are many reasons for this, but one of the most common is that people who have never been testers do not really know how to test. Even when they want to test, they still need someone to manage the process, guide their testing work, and orchestrate the testing process for the project or team.

 

Testing can be everyone’s responsibility, but Test Management still needs to be the responsibility of a Test Architect or Test Specialist.

In this session, Joel will review some of the main responsibilities of, and differences between, managing testing in a traditional team versus the work needed in a whole-team testing approach.

– How to create testing artefacts for the different types of team members
– Differences in infrastructure needed to facilitate whole-team testing
– Recommended practices for ensuring testing tasks are done by all team members
– Common pitfalls of all-team testing and how to work around them

 


 

Joel Montvelisky is the chief solution architect and QA manager at PractiTest, where among other things he works with hundreds of organizations worldwide to improve their testing and most importantly their testing results. During the past twenty years he has been a tester, QA manager, and consultant/trainer for companies in Israel, Europe, and the United States. Joel is the cofounder of a number of cool testing-related projects such as the Annual State of Testing Report and the OnlineTestConf, and publishes his thoughts on QA and testing on his QABlog.practitest.com.

 

 
Have you ever thought of flipping your corporate desk and going solo? Choosing your own hours, your own clients, and not having to answer to anybody? Maybe making some of that sweet consulting money while funding your very own startup?
 
That’s what I did, and I did it for three years. I started work as a consultant and also became an artist, a designer, a business analyst, an app developer, an entrepreneur, a trainer, a speaker, an evangelist, a salesperson, a marketer and so many more things. There were times when I thought this was the easiest job ever, and times when I was scared to look at my bank account. It was a time of epic wins, colossal failures, plenty of hustle, a side of romance, and a lot of valuable lessons learned.
 
Let me tell you my whole story, and the lessons that have stayed valuable to me to this day.

 


 
Trish Khoo is a technologist working as an engineering manager at Clipchamp in Brisbane, Australia. She has worked in software for more than 20 years, including Google and Microsoft. She has a global reputation of expertise in software testing technology and practices and delivers keynotes, talks, training, and mentoring to other technologists around the world. When not doing stuff like this she’s doing a million other things, but mainly she’s taking pictures of her cats and putting them on the internet.

Learn more about Trish at her website: http://trishkhoo.com


Quality Advocacy: The Next Generation of Testing Excellence

In this talk, we will explore the inevitable shift from traditional Quality Assurance to Quality Advocacy. This evolution moves beyond defect detection, driving a proactive and strategic approach to quality at every stage of the development lifecycle. We’ll discuss how automation and Continuous Integration are key drivers of this revolution, and how T-shaped QA professionals are becoming architects of digital excellence, focusing on delivering value-driven services rather than just software. This new paradigm advocates for quality as a collaborative, organization-wide effort that aligns with the modern enterprise’s needs.

Are You Having Cheese or Steak?

Building an automation solution that will support our teams for years to come can be a challenge. But sometimes what we hope to milk turns out to be a rodeo. What are some early indicators that the solution we are working on might not work, and when would we be better off shooting it?

Model-Based Testing: A Powerful Way to QA

Model-based testing is a novel technique that makes QA teams more powerful. It focuses on intended system behavior, and then automatically derives test plans and scenarios from it. The intended behavior is visualized and verified against regulations/policies, helping early detection of requirement errors.

The Future of QA: Integrating AI for Intelligent Test Management

In this session, we will explore the transformative potential of AI in Quality Assurance, particularly how it can be leveraged for intelligent test management. We will discuss practical implementations, the benefits of AI-driven testing, and strategies for integrating these tools into existing QA processes. Attendees will gain insights into the future of QA and how to stay ahead in an increasingly automated landscape.

Automate Smarter, Not Harder: GitHub Copilot, Your AI Test Buddy

Explore how GitHub Copilot streamlines test automation by generating and optimizing scripts, from unit to API tests. Learn how it reduces development time, suggests best practices, and improves code quality. This session is essential for those aiming to boost efficiency and maintainability in test automation.

This one is for our audience in Australia

Part One

Introduction and Greetings

Test Automation: Friend or Foe?
by Maaret Pyhäjärvi

Break

Break

And here's for our audience in the Americas

Part Two

Introduction and Greetings

Taking Your IT Leadership to the Next Level
by Mike Lyles

Break

Break

A Holistic Approach to Testing in an Agile Context

In the software world, we talk a lot about quality. Business leaders say they want the best quality product, though they often fail to understand how investing in quality pays off. Customers have their own views of what quality means to them, which may surprise the business. Delivery teams are concerned with code correctness and the many types of testing activities.

With so many different perspectives, it’s no wonder organizations get confused about how to deliver a product that delights their customers. The Holistic Testing Model helps teams identify the levels of quality they need for their product. It helps them plan the types of testing activities they need all the way around the continuous software development loop. Using this holistic approach to agile development helps teams feel confident in delivering changes frequently. Lisa will share her experiences with this whole-team approach to quality and testing. 

Key learnings: 

  • A holistic quality and testing approach throughout the continuous loop of software development, using the Holistic Testing Model
  • How to apply the Holistic Testing Model to create an effective test strategy
  • The importance of bug prevention and value injection over bug detection
  • How to plan and fit testing activities at all levels into short agile iterations with frequent delivery, and continuous delivery

Gil Zilberfeld

Has been in software since childhood, writing BASIC programs on his trusty Sinclair ZX81. He is a trainer and mentor working to make software better.
With more than 25 years of developing commercial software, he has vast experience in software methodology and practices. From unit testing to exploratory testing, design practices to clean code, API to web testing – he’s done it all.
Gil speaks frequently at international conferences about testing, TDD, clean code, and agile practices. He blogs and posts videos on these topics at testingil.com and on his YouTube channel. Gil is the author of “Everyday Unit Testing”.
In his spare time, he shoots zombies, for fun.

Lisette Zounon

Is an award-winning tech executive, serial entrepreneur, and engineering leader with two decades of experience helping people and companies improve the quality of their applications, with solid tools, a simple process, and a smart team. She firmly believes that industry best practices including implementing agile methodologies, DevOps practices, and leveraging Artificial Intelligence are invaluable to the success of any software delivery.

Lisette was responsible for leading and managing high-performing quality-testing teams throughout all phases of the software development testing cycle; ensuring that all information systems, products, and services meet or exceed organization and industry quality standards as well as end-users requirements. This includes establishing and maintaining the Quality strategy, processes, platforms, and resources needed to deliver 24×7 operationally critical solutions for many of the world’s largest companies.

Lisa Crispin

Is an independent consultant, author, and speaker based in Vermont, USA.  Together with Janet Gregory, she co-authored Holistic Testing: Weave Quality Into Your  Product; Agile Testing Condensed: A Brief Introduction; More Agile Testing: Learning  Journeys for the Whole Team; and Agile Testing: A Practical Guide for Testers and Agile  Teams; and the LiveLessons “Agile Testing Essentials” video course. She and Janet co-founded a training company offering two live courses worldwide: “Holistic Testing:  Strategies for Agile Teams” and “Holistic Testing for Continuous Delivery”. 

Lisa uses her long experience working as a tester on high-performing agile teams to help organizations assess and improve their quality and testing practices, and succeed with continuous delivery. She’s a DORA Guide for the DORA community of practice. Please visit: https://lisacrispin.com, https://agiletester.ca, https://agiletestingfellow.com, and  https://linkedin.com/in/lisacrispin/ for details and contact information.

Suzanne Kraaij

With almost 15 years in the field of testing, Suzanne has gained a lot of experience with various clients in various industries. In the company where she works, Suzanne has a pioneering role as a core member within their testing community. In this position, she is actively involved in knowledge sharing and further development of the field of software testing and quality engineering.

Mike Lyles

Is an international keynote speaker, author, and coach. He is the Head of IT with Maxwell Leadership, an amazing company founded by leadership expert, author, and speaker, John C. Maxwell. Mike has over 30 years of experience in IT, coaching, mentoring, and building successful teams with multiple organizations, including Fortune 50 companies. As a Maxwell Leadership Certified coach and speaker, Mike’s “purpose” is to inspire others with value-based leadership and growth principles and to serve others in their journey toward significance and success. Mike has traveled to dozens of countries and hundreds of events to share his experiences with thousands through keynotes, workshops, and other special events. Mike is the author of the self-help motivational book, “The Drive-Thru Is Not Always Faster”.

George Ukkuru

Is a performance-driven technocrat with over two and a half decades of experience in Test Engineering, Product Management, and User Experience. He specializes in optimizing costs, improving market speed, and enhancing quality by deploying the right tools, practices, and platforms. Throughout his career, George has worked with several Fortune 500 companies, delivering impactful solutions that drive efficiency and innovation. Currently, he serves as General Manager at McLaren Strategic Solutions, where he continues to leverage his expertise to lead teams and projects that align with business goals, ensuring high-quality outcomes and strategic growth.

Esther Okafor

Is a Quality Assurance Engineer at Storyblok, bringing unique experience in API testing and a strong passion for building high-quality software. She has previously worked with renowned companies like Flutterwave, Renmoney, and Venture Garden Group. Over her four years in the tech industry, Esther has trained and mentored over 100 women in tech through initiatives such as She Code Africa and Bug Detective. Her perspective offers valuable insights into the world of QA, and she is committed to helping others succeed. Additionally, she has authored several blog posts that provide essential guidance to Quality Assurance professionals, helping them excel in their day-to-day roles.

Michael Bar-Sinai

Software engineer by training, with a mid-career PhD in formal methods and requirement modeling. Created various information systems for NGOs, ranging from work accident tracking to geopolitics information systems. Worked on data science tools at Harvard’s IQSS. CTO and Co-Founder at Provengo, a start-up creating model-driven software engineering tools. Married, 3 kids, sadly no dogs.

Tim Munn

Is a technical test leader with almost 20 years of experience in the field. He has automated apps and led international technical QA teams in areas from pharma to fintech. Currently a Senior SDET at Spotlight.

Joel Montvelisky

Is a Co-Founder and Chief Product Officer at PractiTest, and has been in testing and QA since 1997, working as a tester, QA Manager and Director, and consultant for companies in Israel, the US, and the EU. Joel is a Forbes Council member and a blogger, and regularly presents webinars on a number of testing and quality-related topics.
In addition, Joel is the founder and Chair of the OnlineTestConf, the co-founder of the State of Testing survey and report, and a Director at the Association of Software Testing.
Joel is a seasoned conference speaker worldwide, with appearances at the STAR Conferences, STPCon, JaSST, TestLeadership Conf, CAST, QA&Test, and more.

Maaret Pyhäjärvi

Is an exploratory tester extraordinaire and Director, Consulting at CGI. She is a tester, (polyglot) programmer, speaker, author, conference designer, and community facilitator. She has been awarded prestigious testing awards, Most Influential Agile Testing Professional Person 2016 (MIATPP) and EuroSTAR Testing Excellence Award (2020), Tester Worth Appreciating (2022), and selected as Top-100 Most Influential in ICT in Finland 2019-2023.

Francisco Di Bartolomeo

An experienced Test Discipline Lead with over ten years of expertise across diverse testing areas. Passionate about cultivating a quality-driven culture, he excels in coaching and mentoring teams both technically and professionally. He has led numerous quality assurance initiatives, advocating for risk-based testing and shift-left practices to integrate quality at every stage of development. Guided by the belief that “Quality is a habit,” Francisco is dedicated to making quality a constant practice, establishing himself as a visionary in software testing.

API Test Planning LIVE

How do you come up with cases for your APIs? Is it enough to check they return the right status? No.

APIs are complex, so even a couple can overwhelm us with options. But the options are good. We want the ideas so we can prioritize based on our needs. We just need to understand our system and come up with the right ones.

History has proven that the best way to come up with ideas is collaboration. So that’s what we’ll do.

This is an interactive session on API Test Planning. Given only two APIs (and a semi-sane moderator), we’ll come up with creative ways to test them.

Sounds easy? APIs are complex, and in this session we’ll see just how complex, and how to think about the different aspects of APIs when testing them.
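To illustrate the kind of thinking this session invites, here is a minimal sketch (not from the session itself) of checks that go beyond “did the API return 200?”. The response payload, field names, and header expectations below are all hypothetical, chosen only for illustration.

```python
def check_response(status, headers, body):
    """Collect findings from several angles, not just the status code."""
    findings = []
    if status != 200:
        findings.append(f"unexpected status: {status}")
    # Contract: the fields we depend on must be present and well-typed.
    if not isinstance(body.get("id"), int):
        findings.append("missing or non-integer 'id'")
    if "name" not in body:
        findings.append("missing 'name'")
    # Headers matter too: content type, caching, rate-limit hints.
    if headers.get("Content-Type") != "application/json":
        findings.append("unexpected Content-Type")
    return findings

# A response that passes the naive status check but fails other checks:
issues = check_response(
    status=200,
    headers={"Content-Type": "text/html"},
    body={"id": "42", "name": "widget"},
)
print(issues)
```

Even this toy example surfaces two problems a status-only check would miss, which is exactly why collaboratively brainstorming test ideas pays off.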

Search for a Tool Is Like Dating​

Choosing the right toolset for your ecosystem can be a lot like dating. You need to know what you’re looking for, be prepared to make a list, and be willing to check off those boxes to find the right match. In this lightning session, we’ll explore the similarities between choosing a toolset and choosing a date, and learn how to make the best decisions for your needs.

You will learn the following:

  • What should be on your list of requirements when evaluating new toolsets, and how to approach the search to ensure you find the right match for your ecosystem
  • Why compatibility, communication, and trust matter in both dating and tool selection, and how to use these principles to make the best decisions for your organization

 

By the end of this session, you’ll walk away with a clear understanding of how to approach the tool selection process like a pro, and how to make the best decisions to support your organization’s success.

OSRS: Your New Test Strategy Multitool​

Have you ever inherited a testset and wondered about its worth? Whether it’s complete? Whether it’s good? A lot of test strategy approaches focus on new projects where you start from scratch, but in reality you’ll often inherit an existing testset for an already running project. So how do you evaluate the value of that testset? How do you see whether you are missing something or overtesting? How do you choose what to automate? What do you put in a regression testset? What do you test first when under time pressure?

This approach can be applied to all of these questions and helps give the whole team insight into the potential value of testing. It will open up the conversation about what you will and won’t be testing with evidence to substantiate those choices.

Taking Your IT Leadership to the Next Level​

What does a day in the life of YOU look like at work? Do you struggle to complete projects on time? Are there issues that seem to pop up with every deliverable? Does your team respect you and your contributions? Does your boss understand what you are trying to accomplish? Do your stakeholders appreciate the outcomes that you and your team provide?

 

It’s very likely that one of these questions resonated with you when you read it. In fact, there is a chance that ALL of them did! We live in a fast-paced world where it’s easy to get caught up in a routine where every day looks the same.

 

There seems to be minimal time allowed for growing, improving, and building connections.

 

Imagine a new world where you focus daily on personal growth. A world where you engage effectively with your boss, your peers, and your subordinates. A world where you go from just “showing up” or “keeping up” to “growing up” and improving not only your workplace but your personal life.

 

Join Mike Lyles as he shares decades of experiences in IT and leadership roles and how he has used these experiences to help him grow as a leader in IT.

 

Key Takeaways:

  • Key learnings from years of IT experiences
  • Suggestions for how to move from “communicating” to “connecting”
  • Tips to move from leading “followers” to leading “leaders”

Test Automation: Friend or Foe?

Supporting evidence does not teach us as much as opposing evidence. We are people who support and oppose on principle, in search of knowledge. We are balancing the perspectives of friends and foes, professionally, all day long.

We’ve been at this test automation thing for quite a while, and three decades have given me the space to arrive at a principle that helps my projects succeed slightly more often with test automation: time spent warning about test automation is time away from succeeding with it. We know from the romance-novel genre that tropes beginning with friends or with foes both end up in love, and we could adult up and improve our communication toward more purposeful change.

A year ago, we set up a panel conversation to seek ideas for ending up well with test automation. In this talk, we lean on the tension of past disagreements to enable learning, combined with stories from real projects.

Maybe at this time of AI and getting computers to ‘act humanly’, we need to team with an old enemy to make sense of how we peacefully coexist with the new, with healthy boundaries keeping the friendship in check.

Key takeaways:

  • When the stakes are high, we work with friends and foes
  • How to increase the odds of results we can be happy with from the tools
  • How to navigate the recruitment trap and the in-company growth plains

Not Too Little, Not Too Much: How to Test Just the Right Amount​

As business demands for a shorter time to market continue to rise, testing teams can struggle to find the right balance between adhering to these demands and maintaining sufficient coverage to ensure the released products meet the desired quality standards.

Using the power of AI, we offer a model that combines the value each test provides with the current time constraints to determine which tests are best to execute.