
Tutorials

Begin your experience by attending half- or full-day tutorials. Please note that you must register for the tutorial(s) you want to attend as space is limited and many sell out quickly.

MA A Rapid Introduction to Rapid Software Testing
Michael Bolton, DevelopSense
Monday, April 29, 2013 - 8:30am - 4:30pm

You're under tight time pressure and have barely enough information to proceed with testing. How do you test quickly and inexpensively, yet still produce informative, credible, and accountable results? Rapid Software Testing, adopted by context-driven testers worldwide, offers a field-proven answer to this all-too-common dilemma. In this one-day sampler of the approach, Michael Bolton introduces you to the skills and practice of Rapid Software Testing through stories, discussions, and "minds-on" exercises that simulate important aspects of real testing problems. The rapid approach isn't just testing with speed or a sense of urgency; it's mission-focused testing that eliminates unnecessary work, ensures that the most important things get done, and constantly asks how testers can help speed up the successful completion of the project. Join Michael to see how rapid testing focuses on both the mindset and skill set of the individual tester, who uses tight loops of exploration and critical thinking to help continuously re-optimize testing to match clients' needs and expectations.


Large-scale testing projects severely stress “normal” testing practices, often with less than optimal results. A number of innovative ideas and concepts have emerged to support industrial-strength testing of large and complex projects—some successful and others not so successful. Hans Buwalda shares his experiences and the strategies he has developed over the years for testing large projects. He describes the possibilities and pitfalls of outsourcing test automation. Learn how to design tests specifically for automation and how to successfully incorporate keyword testing. The automation discussion includes virtualization and cloud options, how to deal with the numerous versions and configurations common to large projects, and how to handle the complexity added by mobile devices. Hans’ information is based on his nineteen years of worldwide experience with testing and test automation on large projects, with test cases executing continuously for many weeks on multiple machines.

MC Fundamentals of Risk-based Testing
Dale Perry, Software Quality Engineering
Monday, April 29, 2013 - 8:30am - 4:30pm

Whether you are new to testing or looking for a better way to organize your test practices and processes, the Systematic Test and Evaluation Process (STEP™) offers a flexible approach to help you and your team succeed. Dale Perry describes this risk-based framework—applicable to any development lifecycle model—to help you make critical testing decisions earlier and with more confidence. The STEP™ approach helps you decide how to focus your testing effort, what elements and areas to test, and how to organize test designs and documentation. Learn the fundamentals of test analysis and how to develop an inventory of test objectives to help prioritize your testing efforts. Discover how to translate these objectives into a concrete strategy for designing and developing tests. With a prioritized inventory and focused test architecture, you will be able to create test cases, execute the resulting tests, and accurately report on the quality of your application and the effectiveness of your testing. Take back a proven approach to organize your testing efforts and new ways to add more value to your project and organization.

MD Managing Application Performance: A Simplified Universal Approach
Scott Barber, PerfTestPlus, Inc.
Monday, April 29, 2013 - 8:30am - 12:00pm

In response to increasing market demand for well-performing applications, many organizations implement performance testing programs, often at great expense. Sadly, these solutions alone are often insufficient to keep pace with emerging expectations and competitive pressures. Scott Barber shares the fundamentals of implementing T4APM™ including specific examples from recent client implementations. T4APM™ is a simple and universal approach that is valuable independently or as an extension of existing performance testing programs. The approach hinges on applying a simple and unobtrusive "Target, Test, Trend, Tune” cycle to tasks in your application lifecycle—from a single unit test through entire system production monitoring. Leveraging T4APM™ on a particular task may require knowledge specific to the task, but learning how to leverage the approach does not. Scott provides everything you need to become the T4APM™ coach and champion, and to help your team keep up with increasing demand for better performance, regardless of your current title or role.

ME Leading Change—Even If You’re Not in Charge
Jennifer Bonine, tap|QA, Inc.
Monday, April 29, 2013 - 8:30am - 12:00pm

Has this happened to you? You try to implement a change in your organization and it doesn’t get the support that you thought it would. And, to make matters worse, you can't figure out why. Or, you have a great idea but can’t get the resources required for successful implementation. Jennifer Bonine shares a toolkit of techniques to help you determine which ideas will—and will not—work within your organization. This toolkit includes five rules for change management, a checklist to help you determine the type of change process needed in your organization, techniques for communicating your ideas to your target audience, a set of questions you can ask to better understand your executives’ goals, and methods for overcoming resistance to change from teams you don’t lead. These tools—together with an awareness of your organization’s core culture—will help you identify which changes you can successfully implement and which you should leave until another day.

MF Seven Keys to Navigating Your Agile Testing Transition
Bob Galen, RGalen Consulting
Monday, April 29, 2013 - 8:30am - 12:00pm

So you’ve “gone agile” and have been relatively successful for a year or so. But how do you know how well you’re really doing? How do you continuously improve your practices? And when things get rocky, how do you handle the challenges without reverting to old habits? You realize that the path to high-performance agile testing isn’t easy or quick. It helps to have a guide. So consider this workshop your guide to ongoing, improved, and sustained high performance. Join seasoned agile testing coach Bob Galen as he shares lessons from his most successful agile testing transitions. You’ll explore actual team case studies (stories of agile testing excellence) covering building team skills, embracing agile requirements, fostering customer interaction, building agile automation, driving business value, and testing at scale. You’ll examine the mistakes, adjustments, and successes—so you’ll learn how to react to real-world contexts. Leave with a better view of your team’s strengths, weaknesses, and where you need to focus to improve.

MG Measurement and Metrics for Test Managers
Rick Craig, Software Quality Engineering
Monday, April 29, 2013 - 8:30am - 12:00pm

To be most effective, test managers must develop and use metrics to help direct the testing effort and make informed recommendations about the software’s release readiness and associated risks. Because one important testing activity is to “measure” the quality of the software, test managers must measure the results of both the development and testing processes. Collecting, analyzing, and using metrics is complicated because many developers and testers are concerned that the metrics will be used against them. Join Rick Craig as he addresses common metrics—measures of product quality, defect removal efficiency, defect density, defect arrival rate, and testing status. Learn the guidelines for developing a test measurement program, rules of thumb for collecting data, and ways to avoid “metrics dysfunction.” Rick identifies several metrics paradigms and discusses the pros and cons of each. Delegates are urged to bring their metrics problems and issues for use as discussion points.  

MH Implementing Crowdsourced Testing
Rajini Padmanaban, QA InfoTech
Monday, April 29, 2013 - 8:30am - 12:00pm

In today’s market, global outreach, quick time to release, and a feature-rich design are the major factors that determine a product’s success. Organizations are constantly on the lookout for innovative testing techniques to match these driving forces. Crowdsourced testing is a paradigm increasing in popularity because it addresses these factors through its scale, flexibility, cost effectiveness, and fast turnaround. Join Rajini Padmanaban as she describes what it takes to implement a crowdsourced testing effort, including its definition, models, relevance to today’s development world, and challenges and mitigation strategies. Rajini shares the facts and myths about crowdsourced testing. She spans a range of theory and practice, including case studies of real-life experiences and exercises to illustrate her message, and explains what it takes to maximize the benefits of a crowdsourced test implementation.

MI How to Break Software: Embedded Edition
Jon Hagar, Grand Software Testing
Monday, April 29, 2013 - 8:30am - 12:00pm

In the tradition of James Whittaker’s book series How to Break … Software, Jon Hagar applies the testing “attack” concept to the domain of embedded software systems. Jon defines the sub-domain of embedded software and examines the issues of product failure caused by defects in that software. Next, Jon shares a set of attacks against embedded software based on common modes of failure that testers can direct against their own software. For specific attacks, Jon explains when and how to conduct the attack, as well as why the attack works to find bugs. In addition to learning these testing skills, practice the attacks on a device—a robot that Jon will bring to the tutorial—containing embedded software. Specific attack methods considered include data issues, computation and control structures, hardware-software interfaces, and communications.

MJ Quantifying the Value of Testing
Lloyd Roden, Lloyd Roden Consultancy
Monday, April 29, 2013 - 8:30am - 12:00pm

“Testing costs too much.” “We don’t get the value we should from the investment we make.” “Testing just delays the project.” Familiar sayings in your organization? Although testing is accepted by most as an integral part of any software development lifecycle, some see it as a hole in which to throw money rather than as an investment in quality. In order to gain credibility and reduce the negative views of our work, we testers and test managers must show senior management a clear return on their investment. Lloyd Roden describes ten measures that demonstrate the value of testing in tangible business terms. Lloyd demonstrates how these highly practical measures can be used to quantify the value of testing and to predict future quality levels. He shows how to report these measurements in ways that are meaningful to management. In this highly practical tutorial, Lloyd demonstrates the use of various measurement utilities that each delegate will receive as a takeaway.

MK Team Leadership: Telling Your Testing Stories
Bob Galen, RGalen Consulting
Monday, April 29, 2013 - 1:00pm - 5:30pm

It used to be that your work and results spoke for themselves. No longer is that the case. Today you need to be a better collaborator, communicator, and facilitator so that you focus your teams on delivering value. Join Bob Galen to explore the power of the story, one of the most effective communication paradigms. You can tell stories that create powerful collaboration. You can tell stories that communicate product requirements and customer needs. You can tell stories that inspire teams to deliver results. And you can tell stories that explain your value and successes to your customers and stakeholders. Explore basic storytelling techniques, specific techniques for framing stories for software testing activities, and test leadership storytelling that energizes and guides your teams. Take time to practice telling your stories—and become a much better storyteller and leader within your testing efforts.

ML Exploratory Testing Explained
James Bach, Satisfice, Inc.
Monday, April 29, 2013 - 1:00pm - 5:30pm

Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value of their work. It is the process of three mutually supportive activities done in parallel: learning, test design, and test execution. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of effort is spent on procedurally scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer can articulate the process. James Bach looks at specific heuristics and techniques of exploratory testing that will help you get the most from this highly productive approach. James focuses on the skills and dynamics of exploratory testing, and how it can be combined with scripted approaches.

MM Testing the Data Warehouse
Geoff Horne, NZTester Magazine
Monday, April 29, 2013 - 1:00pm - 4:30pm

Data warehouses have become a popular mechanism for collecting, organizing, and making information readily available for strategic decision making. The ability to review historical trends and monitor near real-time operational data has become a key competitive advantage for many organizations. Yet the methods for assuring the quality of these valuable assets are quite different from those of transactional systems. Ensuring that the appropriate testing is performed is a major challenge for many enterprises. Geoff Horne has led a number of data warehouse testing projects in both the telecommunications and ERP sectors. Join Geoff as he shares his approaches and experiences, focusing on the key “uniques” of data warehouse testing including methods for assuring data completeness, monitoring data transformations, and measuring quality. He also explores the opportunities for test automation as part of the data warehouse process, describing how it can be harnessed to streamline and minimize overhead.

MN Acceptance Test-driven Development: Mastering Agile Testing
Nate Oster, CodeSquads, LLC
Monday, April 29, 2013 - 1:00pm - 4:30pm

On agile teams, testers often struggle to “keep up” with the pace of development if they continue employing a waterfall-based verification process—finding bugs after development. Nate Oster challenges you to question waterfall assumptions and replace this legacy verification testing with Acceptance Test-driven Development (ATDD). With ATDD, you “test first” by writing executable specifications for a new feature before development begins. Learn to switch from “tests as verification” to “tests as specification” and to guide development with acceptance tests written in the language of your business. Get started by joining a team for a simulation and experience how ATDD helps build in quality instead of trying to test out defects. Then progress to increasingly more realistic scenarios and practice the art of specifying intent with plain-language and table-based formats. These paper-based simulations give you meaningful practice with how ATDD changes the way you think about tests and collaborate as a team. Leave empowered with a kit of exercises to advocate ATDD with your own teams!
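As a flavor of what “tests as specification” can look like in practice (a hypothetical sketch, not drawn from Nate’s course materials), a table-based acceptance test states business intent as input/outcome rows before the feature is built, then executes directly against the code:

```python
# Hypothetical ATDD-style example: a plain-language, table-based
# specification for a bulk discount rule, written before the code.

def bulk_discount(quantity: int) -> float:
    """Implementation under development, guided by the table below."""
    if quantity >= 100:
        return 0.10
    if quantity >= 10:
        return 0.05
    return 0.0

# Specification table in the language of the business:
# (quantity ordered, expected discount)
specification = [
    (1,   0.0),   # single items get no discount
    (10,  0.05),  # small bulk orders get 5%
    (100, 0.10),  # large bulk orders get 10%
]

for quantity, expected in specification:
    assert bulk_discount(quantity) == expected, (quantity, expected)
print("all acceptance checks pass")
```

Until the function satisfies every row, the feature is not “done”—the table serves as both the requirement and the test.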

MO Essential Test Management and Planning
Rick Craig, Software Quality Engineering
Monday, April 29, 2013 - 1:00pm - 4:30pm

The key to successful testing is effective and timely planning. Rick Craig introduces proven test planning methods and techniques, including the Master Test Plan and level-specific test plans for acceptance, system, integration, and unit testing. Rick explains how to customize an IEEE-829-style test plan and test summary report to fit your organization’s needs. Learn how to manage test activities, estimate test efforts, and achieve buy-in. Discover a practical risk analysis technique to prioritize your testing and become more effective with limited resources. Rick offers test measurement and reporting recommendations for monitoring the testing process. Discover new methods and develop renewed energy for taking your organization’s test management to the next level. 

MP Creative Techniques for Discovering Test Ideas
Karen N. Johnson, Software Test Management, Inc.
Monday, April 29, 2013 - 1:00pm - 4:30pm

Feel like your testing’s stuck in a rut? Looking for new ways to discover test ideas? Wonder if your testers have constructive methods for discovering different approaches to testing? In this interactive session, Karen Johnson explains how to use heuristics to find new ideas. After a brief discussion, Karen has you apply and practice a variety of heuristics. Need to step back and consider some of your testing challenges from a fresh perspective? This workshop explores the use of the CIA’s tool—the Phoenix Checklist—a set of intentionally designed context-free questions to help you look at a problem or challenge with fresh eyes. Karen also reviews brainstorming, a fun and useful tool, and variations on brainstorming that you can use with your team. Come join a session designed to explore creative ways to strengthen your approach to testing.

MQ The Craft of Bug Investigation
Jon Bach, eBay, Inc.
Monday, April 29, 2013 - 1:00pm - 4:30pm

At testing conferences, many presentations mention techniques and processes meant to help you find bugs, but few talk about what to do when you find one. If it’s as simple as writing what you saw, how do you know that’s the real problem? What do you do when you find a bug but the developer wants you to provide more information? How do you reproduce those pesky, intermittent bugs that come in from customers? Join Jon Bach in this hands-on tutorial to help you practice investigation and analysis skills like questioning, conjecturing, branching, and backtracking. If you’re telling stories about the bug that got away, this tutorial gives you the opportunity to try some techniques that may trap it so you can earn more credibility, respect, and autonomy from your stakeholders. Collaboration is encouraged during the session, so bring your tool suggestions, tester’s notebook, and scientific mindset.

TA Mobile Applications Testing
Jonathan Kohl, Kohl Concepts, Inc.
Tuesday, April 30, 2013 - 8:30am - 4:30pm

As applications for smartphones and tablets become incredibly popular, organizations face increasing pressure to quickly and successfully deliver testing for these devices. When faced with a mobile testing project, many testers find it tempting to apply the same methods and techniques used for desktop applications. Although some of these concepts transfer directly, testing mobile applications presents its own special challenges. Jonathan Kohl says if you follow the same practices and techniques as you have before, you will miss critical defects. Learn how to effectively test mobile applications, and how to add more structure and organization to generate effective test ideas to exploit the capabilities and weaknesses of mobile devices. Jonathan shares first-hand experiences with testing mobile applications and discusses how to address various challenges. Work on real problems on your own device, and learn firsthand how to be productive while testing mobile applications.

Note: This is a hands-on course. Participants must bring their own mobile device for course exercises.

TB Key Test Design Techniques
Lee Copeland, Software Quality Engineering
Tuesday, April 30, 2013 - 8:30am - 4:30pm

All testers know that we can identify many more test cases than we will ever have time to design and execute. The major problem in testing is choosing a small, “smart” subset from the almost infinite number of possibilities available. Join Lee Copeland to discover how to design test cases using formal black-box techniques, including equivalence class and boundary value testing, decision tables, state-transition diagrams, and all-pairs testing. Explore white-box techniques with their associated coverage metrics. Evaluate more informal approaches, such as random and hunch-based testing, and learn the importance of using exploratory testing to enhance your testing ability. Choose the right test case design approaches for your projects. Use the test results to evaluate the quality of both your products and your test designs.
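One of these black-box techniques can be sketched in a few lines. As a generic illustration (not taken from the course materials), boundary value analysis picks test values just below, at, and just above each edge of a valid range—where off-by-one defects cluster:

```python
def boundary_values(low: int, high: int) -> list[int]:
    """Classic boundary value analysis: probe just below, at, and just
    above each boundary of a valid input range."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

# For an input field that accepts 1..100, six tests cover the boundaries
# instead of attempting all 100 valid (and countless invalid) values.
print(boundary_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```

Equivalence class testing works the same way at a coarser grain: one representative value per class, rather than every member.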

TC Critical Thinking for Software Testers
James Bach, Satisfice, Inc.
Tuesday, April 30, 2013 - 8:30am - 4:30pm

Critical thinking is the kind of thinking that specifically looks for problems and mistakes. Regular people don't do a lot of it. However, if you want to be a great tester, you need to be a great critical thinker, too. Critically thinking testers save projects from dangerous assumptions and ultimately from disasters. The good news is that critical thinking is not just innate intelligence or a talent—it's a learnable and improvable skill you can master. James Bach shares the specific techniques and heuristics of critical thinking and presents realistic testing puzzles that help you practice and increase your thinking skills. Critical thinking begins with just three questions—Huh? Really? and So?—that kick start your brain to analyze specifications, risks, causes, effects, project plans, and anything else that puzzles you. Join James for this interactive, hands-on session and practice your critical thinking skills. Study and analyze product behaviors and experience new ways to identify, isolate, and characterize bugs.

TD Management Issues in Test Automation
Dorothy Graham, Software Test Consultant
Tuesday, April 30, 2013 - 8:30am - 12:00pm

Many organizations never achieve the significant benefits that are promised from automated test execution. Surprisingly often, this is not due to technical factors but to management issues. Dot Graham describes the most important management issues you must address for test automation success, and helps you understand and choose the best approaches for your organization—no matter which automation tools you use or your current state of automation. Dot explains how automation affects staffing, who should be responsible for which automation tasks, how managers can best support automation efforts leading to success, and what return on investment means in automated testing and what you can realistically expect. Dot also reviews the key technical issues that can make or break the automation effort. Come away with an example set of automation objectives and measures, and a draft test automation strategy that you can use to plan or improve your own automation. 

TE Security Testing for Testing Professionals
Jeff Payne, Coveros, Inc.
Tuesday, April 30, 2013 - 8:30am - 12:00pm

Today’s software applications are often security-critical, making security testing an essential part of a software quality program. Unfortunately, most testers have not been taught how to effectively test the security of the software applications they validate. Join Jeff Payne as he shares what you need to know to integrate effective security testing into your everyday software testing activities. Learn how software vulnerabilities are introduced into code and exploited by hackers. Discover how to define and validate security requirements. Explore effective test techniques for assuring that common security features are tested. Learn about the most common security vulnerabilities and how to identify key security risks within applications and use testing to mitigate them. Understand how to security test applications—both web- and GUI-based—during the software development process. Review examples of how common security testing tools work and assist the security testing process. Take home valuable tools and techniques for effectively testing the security of your applications going forward.

TF Rob Sabourin: On Testing
Rob Sabourin, AmiBug.com
Tuesday, April 30, 2013 - 8:30am - 12:00pm

Are you continually testing software the same old way? Do you need fresh ideas? Are your hum-drum tests not finding enough defects? Are your tests too slow for today’s fast-paced lifecycles? Then this workshop will help you spice things up, improve your testing, and get things done. Rob Sabourin outlines more than 150 different ways to test your software to quickly and efficiently expose relevant problems. Each is illustrated with custom artwork and explained with real world examples. Testing is examined from several perspectives—agile and otherwise. What objectives should our testing focus on? How can we design powerful tests? When does it make sense to explore different risks? Can tests be reused, repurposed, or recycled? How does automation fit in? When does checking make sense? Which static techniques are available? What about non-functional testing? “Test evangelist” Rob provides a lively, entertaining, and informative view of software testing.

TG Test Management for Cloud-based Applications
Ruud Teunissen, Polteq Test Services BV
Tuesday, April 30, 2013 - 8:30am - 12:00pm

Because the cloud introduces additional system risks—Internet dependencies, security challenges, performance concerns, and more—you, as a test manager, need to broaden your scope and update your team’s practices and processes. Ruud Teunissen shares a unique approach that directly addresses more than 140 new testing concerns and risks you may encounter in the cloud. Learn how to identify cloud-specific requirements and the risks that can ensue from those requirements. Then, explore the test strategies you'll need to adopt to mitigate those risks. Explore cloud services selection, implementation, and operations, and dive into the wider scope of test management in the cloud. Take back the ammunition you need to convince senior management that test managers should participate during cloud services selection to help avoid risks before implementation and, further, why you should work with IT operations to extend test activities after the system goes live.

TH How to Break Software: Robustness Edition
Dawn Haynes, PerfTestPlus, Inc.
Tuesday, April 30, 2013 - 8:30am - 12:00pm

Have you ever worked on a project where you felt testing was thorough and complete—all of the features were covered and all of the tests passed—yet in the first week in production the software had serious issues and problems? Join Dawn Haynes to learn how to inject robustness testing into your projects to uncover those issues before release. Robustness—an important and often overlooked area of testing—is the degree to which a system operates correctly in the presence of exceptional inputs or stressful environmental conditions. By expanding basic tests and incorporating specific robustness attacks, Dawn shows you how to catch defects that commonly show up first in production. She offers strategies for making robustness testing a project-level concern so those defects get the priority they deserve and are fixed before release. Join Dawn to learn about robustness tests you can add to your suite and execute in just a few minutes—even if your test team is over-tasked and under-resourced.
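As a small illustration of the idea (a generic sketch, not drawn from Dawn’s tutorial), robustness tests deliberately feed the exceptional inputs that happy-path tests skip—empty, negative, huge, and non-numeric values—and check that the software rejects them cleanly instead of crashing or silently accepting garbage:

```python
def parse_age(text: str) -> int:
    """Parse a user-supplied age, rejecting exceptional input explicitly
    rather than crashing or accepting nonsense."""
    stripped = text.strip()
    if not stripped.lstrip("-").isdigit():
        raise ValueError(f"not a number: {text!r}")
    age = int(stripped)
    if not 0 <= age <= 150:
        raise ValueError(f"age out of range: {age}")
    return age

# Typical robustness probes: empty, whitespace, negative, huge, non-numeric.
for bad in ["", "   ", "-1", "10000", "forty-two", "42.5"]:
    try:
        parse_age(bad)
        print(f"accepted {bad!r} - potential robustness bug")
    except ValueError:
        pass  # rejected cleanly, as a robust parser should be

print(parse_age(" 42 "))  # 42
```

Each probe takes seconds to run, which is why a handful of such attacks can be added even to an over-tasked team’s suite.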

TI Exploratory Testing Is Now in Session
Jon Bach, eBay, Inc.
Tuesday, April 30, 2013 - 8:30am - 12:00pm

The nature of exploration, coupled with the ability of testers to rapidly apply their skills and experience, make exploratory testing a widely used test approach—especially when time is short. Unfortunately, exploratory testing often is dismissed by project managers who assume that it is not reproducible, measurable, or accountable. If you have these concerns, you may find a solution in a technique called session-based test management (SBTM), developed by Jon Bach and his brother James to specifically address these issues. In SBTM, testers are assigned areas of a product to explore, and testing is time boxed in “sessions” that have mission statements called “charters” to create a meaningful and countable unit of work. Jon discusses—and you practice—the skills of exploration using the SBTM approach. He demonstrates a freely available, open source tool to help manage your exploration and prepares you to implement SBTM in your test organization.

TJ The Mindset Change for the Agile Tester
Janet Gregory, DragonFire, Inc.
Tuesday, April 30, 2013 - 8:30am - 12:00pm

On traditional projects, testers usually join the project after coding has started, or even later when coding is almost finished. Testers have no role in advising the project team early regarding quality issues but focus only on finding defects. They become accustomed to this style of working and adjust their mental processes accordingly. In agile, testers must collaborate closely with customers and programmers throughout the development lifecycle, where their focus changes from finding defects to preventing them. Janet Gregory shares ways to change the tester’s mindset from “How can I break the software?” to “How can I help deliver excellent software?”—a critical mental shift on agile projects. Another facet of the mindset change is learning how to test early and incrementally. Janet uses interactive exercises and examples to help you understand how effective this mindset change is—and how you can apply it on your agile projects.

TK Production Performance Testing in the Cloud
Dan Bartow, SOASTA, Inc.
Tuesday, April 30, 2013 - 1:00pm - 4:30pm

Testing in production for online applications has evolved into a critical component of successful performance testing strategies. Dan Bartow explains the fundamentals of cloud computing, its application to full-scale performance validation, and the practices and techniques needed to design and execute a successful testing-in-production strategy. Drawing on his experiences, Dan describes the methodology he has used for testing numerous online applications in a production environment with minimal disruption. He explains how to create a performance testing strategy to give your team critical data about how your online application performs and scales. Learn how to create a robust lab-to-production ecosystem that delivers the answers about what will happen when peak traffic hits your site. Take back practical approaches to mitigate the three most common problems—security, test data, and potential live customer impact—that arise when embarking on testing in production.

More Information
Learn more about Dan Bartow.
TL Build Your Mobile Testing Knowledge
Karen N. Johnson, Software Test Management, Inc.
Tuesday, April 30, 2013 - 1:00pm - 4:30pm

Are you overwhelmed by the number of mobile devices you need to test? The device market is large and new devices become available almost weekly. Karen Johnson discusses three key challenges to mobile testing—device selection, user interface, and device and application settings—and leads you through each. Learn how to select which devices to test and how to keep up-to-date in the ever-changing mobile market. Need to learn about user interface testing on mobile? Karen reviews mobile UX concepts and design. Wonder what device settings can impact your mobile app testing? Karen reviews common settings you need to consider. In addition to these mobile testing challenges, Karen guides you on how to conduct a competitive analysis of mobile apps. Learning how to conduct a survey of mobile apps and becoming aware of your competitors’ offerings are important to grow your own mobile knowledge.

Note: Bring your own smartphone to class to enhance your learning.

More Information
Learn more about Karen N. Johnson.
TM High-flying Cloud Testing Techniques
Ruud Teunissen, Polteq Test Services BV
Tuesday, April 30, 2013 - 1:00pm - 4:30pm

The cloud can deliver services over the Internet in three ways—software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). Each of these approaches requires testers to focus on more than classical functional testing. Ruud Teunissen explores the new techniques and skills testers need to master for testing cloud services. Examples include testing for elasticity; testing fallback scenarios to guarantee continuity of business processes; testing for adherence to laws and regulations; and testing apps, web services, and the numerous platforms that need to be supported. Join Ruud and learn how to test these additional cloud requirements to get a grip on technical test issues, explore cloud services operations, and jump-start the broader scope of testing in the cloud. Take back practical approaches for tuning and tweaking your present test techniques to fly high in the cloud.
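One of the cloud-specific techniques mentioned above, testing for elasticity, can be illustrated with a small sketch. This is a hypothetical example, not material from the tutorial: `scaling_policy` stands in for a provider's auto-scaling rule, and against a real cloud service you would instead drive stepped load and query the provider's actual instance count.

```python
# Hypothetical elasticity check: feed a stepped load profile through a
# stand-in scaling policy and verify capacity follows demand up and down.

def scaling_policy(load, per_instance=100):
    """Stand-in auto-scaling rule: enough instances for the load, min 1."""
    needed = -(-load // per_instance)   # ceiling division
    return max(1, needed)

def elasticity_test(load_profile):
    """Record the capacity the policy provisions at each load step."""
    return [scaling_policy(load) for load in load_profile]

# Ramp load up to a peak and back down; capacity should track it.
history = elasticity_test([50, 250, 950, 250, 50])
print(history)  # [1, 3, 10, 3, 1]
```

The assertion to make in a real elasticity test is the same shape: capacity grows under the ramp-up and, just as importantly, shrinks again afterward, since failing to scale in wastes money rather than breaking functionality.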

More Information
Learn more about Ruud Teunissen.
TN Collaboration Techniques: Combining New Approaches with Ancient Wisdom
Rob Sabourin, AmiBug.com
Dorothy Graham, Software Test Consultant
Tuesday, April 30, 2013 - 1:00pm - 4:30pm

In our increasingly agile world, the new buzzword is collaboration—so easy to preach but difficult to do well. Testers are challenged to work directly, effectively, efficiently, and productively with customers, programmers, business analysts, writers, trainers, and pretty much everyone in the business value chain. Many points of collaboration exist: grooming stories with customers, sprint planning with team members, reviewing user interaction with customers, troubleshooting bugs with developers, whiteboarding with peers, and buddy checking. Rob Sabourin and Dot Graham examine what collaboration is, why it is challenging, and how you can do it better. Join Rob and Dot to learn about forgotten but proven techniques, such as risk-based objectives, checklists, entry and exit criteria, diverse roles, cross-checking, and root cause analysis. These techniques can help you work more efficiently, improve your professional relationships, and deliver quality products. Bring your own stories of collaboration—good and bad—and see how forgotten wisdom can help improve today’s practices.

More Information
Learn more about Rob Sabourin.
TO Introducing Keyword-driven Test Automation
Hans Buwalda, LogiGear
Tuesday, April 30, 2013 - 1:00pm - 4:30pm

In both agile and traditional projects, keyword-driven testing has proven to be a powerful way to attain a high level of automation—when it is done correctly. Many testing organizations use keyword testing but aren’t realizing the full benefits of scalability and maintainability that are essential to keep up with the demands of testing today’s software. Hans Buwalda outlines how you can meet what he calls the 5 percent challenges—automating 95 percent of your tests with no more than 5 percent of your total testing effort—using the proven keyword-driven test method he uses. Hans discusses how keywords relate to other automation techniques like scripting and data-driven testing. The information and real-world application Hans presents enables you to attain a very high level of automation with the lowest possible effort.
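The core idea behind keyword-driven testing can be sketched in a few lines: test cases are data (rows of a keyword plus arguments), and a thin interpreter dispatches each row to an action function. The sketch below is a minimal illustration of that pattern, not Hans Buwalda's actual framework; the `KeywordRunner` class and its `enter`/`check` keywords are hypothetical, and real frameworks add reporting, setup and teardown, and integration with UI drivers.

```python
# Minimal sketch of keyword-driven testing: test cases are tables of
# (keyword, arguments) rows; an interpreter maps each keyword to an action.

class KeywordRunner:
    def __init__(self):
        self.state = {}        # simulated application state
        self.failures = []     # (field, expected, actual) mismatches

    # --- action keywords (hypothetical examples) --------------------
    def enter(self, field, value):
        """Simulate typing a value into a named field."""
        self.state[field] = value

    def check(self, field, expected):
        """Verify a field holds the expected value."""
        actual = self.state.get(field)
        if actual != expected:
            self.failures.append((field, expected, actual))

    # --- interpreter -------------------------------------------------
    def run(self, table):
        """Execute each row by dispatching to the matching keyword."""
        for keyword, *args in table:
            getattr(self, keyword)(*args)
        return not self.failures

# A test case as a keyword table, readable by non-programmers:
test_case = [
    ("enter", "first name", "Ada"),
    ("enter", "last name", "Lovelace"),
    ("check", "first name", "Ada"),
]

runner = KeywordRunner()
print(runner.run(test_case))  # True: all checks passed
```

The maintainability benefit follows from the separation: when the application changes, only the keyword implementations change, while the test tables—which non-programmers can write—stay intact.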

More Information
Learn more about Hans Buwalda.
TP Distributed Agile Testing: Yes, You Can
Janet Gregory, DragonFire, Inc.
Tuesday, April 30, 2013 - 1:00pm - 4:30pm

When agile development first gained popularity, agile meant collocated teams, including testers, programmers, analysts, and customers who were expected to perform many functions. As agile methods have spread and expanded, many organizations with globally distributed teams are facing challenges with their agile deployment. Having worked with many such teams, Janet Gregory has observed ways that testers in agile teams can be very productive while delivering a high-quality software product and working well with the rest of the team. In this interactive session, Janet shares her experiences and offers opportunities for all participants to discuss their specific issues and potential solutions. Whether your distributed team is scattered across time zones, has individuals working remotely from home, or is part of an offshore outsourced project, you’ll take away methods and tools to help develop open communication, deal with cultural differences, and share data and information across the miles.

More Information
Learn more about Janet Gregory.
TQ How to Actually DO High-volume Automated Testing
Cem Kaner, Florida Institute of Technology
Carol Oliver, Florida Institute of Technology
Tuesday, April 30, 2013 - 1:00pm - 4:30pm

In high-volume automated testing (HiVAT), the test tool generates the tests, runs them, evaluates the results, and alerts a human to suspicious results that need further investigation. The simplest approach relies on a simple oracle—run the program until it crashes or fails in some other extremely obvious way. More powerful HiVAT approaches are more sensitive to more types of errors. They are particularly useful for testing combinations of many variables and for hunting hard-to-replicate bugs that involve timing or corruption of memory or data. Cem Kaner presents a new strategy for teaching HiVAT. Instead of describing what has been done, Cem is creating open source examples of the techniques applied to real (open source) applications. These examples are written in Ruby, making the code readable and reusable by snapping in code specific to your own application. Join Cem Kaner and Carol Oliver as they describe three HiVAT techniques, their associated code, and how you can customize them.
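The generate-run-evaluate loop with a crash oracle can be sketched compactly. The abstract's own examples are in Ruby; the sketch below uses Python for consistency with the other examples here, and everything in it is hypothetical: `function_under_test` stands in for a real application, and the random string generator stands in for a proper test-input generator.

```python
# Hypothetical sketch of the simplest HiVAT loop with a crash oracle:
# generate many random inputs, run the function under test, and record
# any input that raises an exception for later human investigation.
import random

def function_under_test(text):
    """Stand-in for the real application: crashes on a rare input shape."""
    if text.startswith("!!"):
        raise ValueError("corrupted parser state")
    return text.upper()

def hivat_run(fut, n_tests, seed=42):
    rng = random.Random(seed)          # seeded so failures can be replayed
    suspicious = []
    for _ in range(n_tests):
        text = "".join(rng.choice("!abc") for _ in range(5))
        try:
            fut(text)                  # crash oracle: any exception at all
        except Exception as exc:       # is flagged as suspicious
            suspicious.append((text, exc))
    return suspicious

failures = hivat_run(function_under_test, 10_000)
print(len(failures), "inputs flagged for human review")
```

Seeding the generator is what makes the flagged failures reproducible—the key practical detail when hunting the hard-to-replicate timing and corruption bugs the abstract mentions, since a suspicious run can be replayed exactly.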

More Information
Learn more about Cem Kaner.