Spring OnlineTestConf 2020 - Program
Speakers
Sessions & Schedule
May 19th
Alan Page and Brent Jensen have been talking about “Modern Testing” for nearly 5 years.
It began as a way to talk about what they were seeing as they led and worked with software teams, but the stories they told and the Modern Testing Principles were quickly recognized by many as a name for things that were already happening.
In this talk, Alan will share some of these stories – some will be his own, but most will be from those who have discovered Modern Testing in one way or another and made the most of it when improving their own organizations.
Alan Page has been improving software quality since 1993 and is currently the Director of Programs for Monetization Services at Unity. Alan spent over twenty years at Microsoft working on a variety of operating systems and applications in nearly every Microsoft division, and also spent two years as Microsoft's Director of Test Excellence. Alan blogs at angryweasel.com, rants on Twitter (@alanpage), hosts a podcast (w/ Brent Jensen) at angryweasel.com/ABTesting, and on occasion speaks at testing and software engineering conferences.
I work at a unique company, where quality is at the core of our development process. Since we serve the emergency healthcare industry, we can't afford bugs, as they could be dangerous to real humans. What is our secret to maintaining high quality in a fast-paced start-up atmosphere where product areas are broad and include mobile, web, and a public API? Communication. In a healthcare emergency, poor communication can literally mean death. In software testing, poor communication can mean a drastic decrease in product functionality and usability.
Effective communication across teams is key to quality, and in this session I will present three actionable ways for individual testers to increase communication at their companies.
Rachael Lovallo is a software tester driven by a passion for using technology to create positive social change. She currently puts her skills to work as a Senior Test Engineer at Pulsara, a software company that connects healthcare teams to improve patient outcomes and bring acute healthcare into the 21st century.
Rachael is a tester to her core, persistently seeking software bugs and bullet-proof steps to reproduce them. She finds daily fulfilment collaborating with Pulsara's development team to build a healthcare communication system that safely handles patient data, is intuitive to use, and truly improves patient outcomes.
In the past Olympics, we watched as Michael Phelps did something that no other Olympian had ever managed to accomplish. He won more gold medals than ever before. We spent weeks watching as he won gold medals time after time.
It's easy for someone to imagine that athletes such as Phelps are born winners. It's easy to think that it's in their DNA to experience such greatness. What many fail to realize is that while Phelps showed us his mastery of swimming for a few weeks, he prepared for these events for many months and years. He spent day after day practicing, refining his techniques, modifying his strategies, and improving his results.
Being an expert tester is no different. While the art and craft of testing and being a thinking tester is something that is built within you, simply going to work every day and being a tester is not always enough. Each of us has the opportunity to become a "gold medal tester" by practicing, studying, refining our skills, and building our craft.
In this presentation, we will evaluate extracurricular activities and practices that will enable you to grow from a good tester to a great tester.
Key Takeaways:
1. Inputs from the testing community on how they improve their skills
2. Suggestions for online training and materials which should be studied
3. How to leverage social media to interact with the testing community
4. Contributions you can make to the testing community to build your brand
Mike Lyles is a Director of QA and Project Management with over 25 years of IT experience in multiple organizations, including Fortune 50 companies. He has held various IT leadership roles across software development, the program management office, and software testing. He has led various teams within testing organizations: functional testing, test environments, software configuration management, test data management, performance testing, test automation, and service virtualization.
Mike has been successful in career development, team building, coaching, and mentoring of IT & QA professionals. Mike has been an international keynote speaker at multiple conferences and events, and is regularly published in testing publications and magazines. His first published motivational book, “The Drive-Thru is Not Always Faster”, was released in 2019.
You can learn more about Mike at www.MikeWLyles.com, where you can also find his social media links and connect with him.
His book's site is at www.TheDriveThruBook.com.
Lisa, Gitte, Alex and Lena sit down and discuss their different experiences regarding how to grow a culture that allows people, software, and business to grow.
Join us for a panel discussion around topics like continuous learning, psychological safety, having a growth mindset and modern leadership.
The discussion will start with a few prepared questions and then continue with questions from the audience.
Lisa Crispin is the co-author, with Janet Gregory, of More Agile Testing: Learning Journeys for the Whole Team (2014), Agile Testing: A Practical Guide for Testers and Agile Teams (2009), the LiveLessons “Agile Testing Essentials” video course, and “Agile Testing for the Whole Team” 3-day training course offered through the Agile Testing Fellowship. Lisa was voted by her peers as the Most Influential Agile Testing Professional Person at Agile Testing Days in 2012. She is a testing advocate working at mabl to explore leading practices in testing in the software community. Please visit www.lisacrispin.com and www.agiletester.ca for more.
She is one of the directors on the board of the Association for Software Testing and an avid blogger, speaker, and workshop facilitator.
Alex Schladebeck is a passionate tester whose favourite topics are quality, agility and humans. She is CEO and Head of Quality at Bredex GmbH. In these roles, she supports colleagues, customers and teams on their journey to better quality – be it in products, in processes or in their communication.
In previous roles, she was responsible for enabling teams and growing quality. Now she enables others to do that work, and works on nurturing a system in the company where everyone can flourish.
Alex views the world through the curious eyes of a tester and loves learning new things. She shares her knowledge and experience in workshops, coaching sessions and as a speaker or keynote speaker at conferences.
Find her on Twitter: @alex_schl
Gitte Klitgaard is an agile coach, hugger, friend, and much more. She lives and loves agile. She took the oath of non-allegiance. Why fight over methods when we can use the energy to help people? Gitte wants to change the world by helping people. Her preferred tools are listening, intuition, and caring. And: the retrospective. Inspecting and adapting is essential.
She has a great interest in how people function, psychological safety, courage and vulnerability, how the brain works, what motivates us, how we can feel better about ourselves, and how to be perfect in all our imperfections.
She is a geek and passionate about a lot 🙂
Find her on Twitter: @nativewired
May 20th
In this hands-on track, you’ll explore very simple systems to discover surprising emergent behaviours. You’ll find dynamic systems that (under some conditions) resonate, explode, jitter or simply die. We’ll look at the ways that systems produce such behaviours, how to trigger them, how to observe them, and how you might be able to regulate them in the complex systems we build. You’ll find this track useful if your system is more than the sum of its parts.
Bring a device with a browser, and have fun with systems analysis.
Suitable for: anyone who works with systems, and particularly those who build or adjust systems.
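The track itself is hands-on in the browser, but the flavour of "surprising emergent behaviour from very simple systems" can be seen in the short Python sketch below. It is my own illustration using the well-known logistic map, not material from the track: a single feedback rule whose behaviour dies out, settles, oscillates, jitters chaotically or explodes depending on one gain parameter.

```python
# Illustration only (not the track's material): the logistic map, a one-line
# feedback system x[t+1] = gain * x[t] * (1 - x[t]) whose long-run behaviour
# changes dramatically as the single 'gain' parameter is varied.

def simulate(gain, x0=0.2, steps=60):
    """Iterate the map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(gain * xs[-1] * (1 - xs[-1]))
    return xs

if __name__ == "__main__":
    for gain, label in [(0.9, "dies out"),
                        (2.8, "settles to a fixed point"),
                        (3.2, "oscillates between two values"),
                        (3.9, "jitters chaotically"),
                        (4.2, "explodes")]:
        tail = ", ".join(f"{x:.3f}" for x in simulate(gain)[-4:])
        print(f"gain={gain}: {label:30s} last values: {tail}")
```

Changing one number flips the system between qualitatively different behaviours, which is exactly the kind of emergent effect the track explores in richer systems.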
Takeaways
A regular keynote speaker and teacher at international events, and an active participant in a variety of testing communities, James has written award-winning papers, built the Black Box puzzles, kicked off the TestLab, and run the London Exploratory Workshop in Testing. He received the 2015 European Tester Excellence award.
Twitter: @workroomprds
Site: https://workroom-productions.
Sometimes you’re asked to start testing in a context that is not ideal: you’ve only just joined the project, the test environment is broken, the product is migrating to a new stack, the developer has left, no-one seems quite sure what’s being done or why, and there is not much time.
Knowing where to begin and what to focus on can be difficult and so in this talk I’ll describe how I try to meet that challenge.
I’ll share a definition of testing which helps me to navigate uncertainty across contexts and decide on a starting point. I’ll catalogue tools that I use regularly such as conversation, modelling, and drawing; the rule of three, heuristics, and background knowledge; mission-setting, hypothesis generation, and comparison. I’ll show how they’ve helped me in my testing, and how I iterate over different approaches regularly to focus my testing.
The takeaways from this talk will be a distillation of hard-won, hands-on experience that has given me
* an expansive, iterative view of testing
* a comprehensive catalogue of testing tools
* the confidence to start testing anything from anywhere
As testers, we typically focus on metaphors from engineering and manufacturing and learn from those disciplines, yet there is so much to gain by learning about other industries and disciplines.
The aviation industry has had a lot of bad press recently, yet there is still much we can learn from it in terms of quality culture. Aviation has a history of continually learning and improving through the use of checklists, black box recorders, a blameless culture, cockpit re-design, and Crew Resource Management (CRM).
In this talk, the stories of the origins of these innovations will be shared, based on the findings from two significant books that focus on the aviation industry: The Checklist Manifesto by Atul Gawande and Black Box Thinking by Matthew Syed.
– The relationship between culture and quality
– A willingness to learn from other industries and disciplines, and to apply the lessons learned
– Key learnings from the aviation industry that you can apply in your testing role
Conor Fitzgerald is a Quality Advocate with 15 years of experience. He is passionate about whole team testing and working with teams on quality improvements. Currently, he is working as Head of Testing for Poppulo in Cork, Ireland. He has spoken at a number of conferences in recent years, including SoftTestDublin, TestBash, and RebelCon.
Conor is an active member of the test community and is a Co-Founder of the Ministry of Testing Cork.
Previous positions included Test Consultant, Test Lead/Manager and SDET. These positions were held in a wide variety of industries from embedded systems to financial systems with companies ranging from startups to large multinationals such as Intel.
Occasionally blogs at conorfi.com
How can we improve our testing, automated or not, by looking at trends and patterns?
We'll look at some interesting findings and at how root cause analysis could make the impossible feasible (a small illustrative sketch follows the list below).
– Insights from looking at your test results over time
– Common anti-patterns
– Trends to look for and what to do about them
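As a small illustration of the kind of trend analysis the talk is about (my own sketch with invented data, not material from the session), the Python snippet below scans a test-result history for two common anti-patterns: tests that fail on every run and flaky tests that flip between pass and fail.

```python
# Illustration only, with invented data: mine a test-result history for two
# common anti-patterns -- consistently failing tests and flaky tests that
# flip between pass and fail from run to run.

history = {
    "login_works":     [True, True, True, True, True, True],
    "checkout_totals": [True, False, True, False, True, False],    # flaky
    "report_export":   [False, False, False, False, False, False]  # always red
}

def summarize(name, results):
    pass_rate = sum(results) / len(results)
    flips = sum(1 for a, b in zip(results, results[1:]) if a != b)
    if pass_rate == 0:
        verdict = "consistently failing -> likely a real defect or an abandoned test"
    elif flips >= len(results) // 2:
        verdict = "flaky -> investigate timing, test data or environment"
    else:
        verdict = "stable"
    return f"{name:16s} pass rate {pass_rate:.0%}, {flips} flips: {verdict}"

for name, results in history.items():
    print(summarize(name, results))
```

Looked at over weeks rather than single runs, even simple counts like these point root cause analysis at the right place: the product, the test, or the environment.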
She is one of the directors on the board of the Association for Software Testing and an avid blogger, speaker, and workshop facilitator.
May 21st
In this session, Jason details the existing quality challenges and problems at Freelancer, the world’s largest freelancing and crowdsourcing marketplace.
He will present three main strategies that transform quality engineering in the organisation, and elaborate on them using case studies. Attendees will leave knowing how to apply the three strategies for changing the quality engineering mindset and culture within 30 days, and will learn to identify key initiatives for transforming quality engineering in a web-domain organisation.
The audience will also learn about the challenges faced when rolling out these transformation strategies in the organisation and how to overcome them. Finally, attendees will be able to apply the three strategies to transform quality engineering in their own workplace within a short period of time – 30 days.
Jason Lee has been practising in the software testing field for more than 14 years, both on the research and industry sides. He started his career as a Software Test Engineer with Motorola Technology and iWOW Pty. Ltd. He then took up a PhD research scholarship to further advance his passion in software testing at The University of Melbourne. His PhD thesis focused on spectral debugging, a white-box testing technique that uses test coverage information to help programmers locate bugs effectively.
He joined Dolby in 2011, where he was involved in leading the QA effort and in hands-on test automation development. He worked on mobile application and embedded software testing, especially on set-top boxes for Amazon and Google.
He recently joined Freelancer as Director of QA, overseeing the quality transformation for the organisation. He leads a team of 26 quality engineers across the organisation and actively provides quality and testing best practices to development, product teams, and all key stakeholders.
What is testability?
Why should you consider testability?
Why is it difficult to improve the testability of your application, especially for E2E test automation? One small example of the kind of change that helps is sketched below.
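The answers are the speaker's to give, but as one small, hedged example of a change that improves testability for E2E automation (my own sketch, assuming a hypothetical page that exposes a data-testid attribute and a standard Selenium setup), compare a brittle layout-based locator with a dedicated test hook:

```python
# Illustration only: a dedicated test hook (data-testid) versus a brittle,
# layout-dependent locator. Assumes a hypothetical app at example.test and a
# local Chrome/chromedriver installation.

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.test/checkout")  # hypothetical app under test

# Brittle: breaks whenever layout, styling or component order changes.
# driver.find_element(By.CSS_SELECTOR, "div.row > div:nth-child(3) button.btn-primary")

# More testable: the application deliberately exposes a stable hook for tests.
driver.find_element(By.CSS_SELECTOR, '[data-testid="checkout-button"]').click()

driver.quit()
```

The improvement lives in the application code rather than the test code, which hints at why testability is hard to retrofit: it needs the whole team, not just the test automation.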
Takuya Suemura is a speaker, blogger, and test automation specialist at Autify, the E2E test automation platform. He is passionate about removing as much complexity as possible from E2E and cross-browser testing, and about helping agile teams automate their acceptance-level tests from day one of development.
So the mantra goes: in today's development world, testing is the responsibility of the whole team. But for 90% of organizations (or more), this does not really translate into practice, and testing is still mostly the responsibility of the testers.
There are many reasons for this but one of the most common is that people who were never testers (in the present or past) do not really know how to test. Even when they want to test, they still need someone to manage the process, guide their testing work, and orchestrate the testing process for the project or team.
Testing can be everyone’s responsibility, but Test Management still needs to be the responsibility of a Test Architect or Test Specialist.
In this session, Joel will review some of the main responsibilities involved in managing testing in a traditional team, and how they differ from the work needed in a whole-team testing approach.
– How to create testing artefacts for the different types of team members
– Differences in infrastructure needed to facilitate whole-team testing
– Recommended practices for ensuring testing tasks are done by all team members
– Common pitfalls of whole-team testing and how to work around them
Joel Montvelisky is the chief solution architect and QA manager at PractiTest, where among other things he works with hundreds of organizations worldwide to improve their testing and most importantly their testing results. During the past twenty years he has been a tester, QA manager, and consultant/trainer for companies in Israel, Europe, and the United States. Joel is the cofounder of a number of cool testing-related projects such as the Annual State of Testing Report and the OnlineTestConf, and publishes his thoughts on QA and testing on his QABlog.practitest.com.
Learn more about Trish at her website: http://trishkhoo.com
Follow us on Twitter!
For continuous updates and sneak peeks at what’s to come