
Tutorials

Begin your experience by attending half- or full-day tutorials. Please note that you must register for the tutorial(s) you want to attend as space is limited and many sell out quickly.

MA A Rapid Introduction to Rapid Software Testing
Paul Holland, Testing Thoughts
Monday, September 30, 2013 - 8:30am - 4:30pm

You're under tight time pressure and have barely enough information to proceed with testing. How do you test quickly and inexpensively, yet still produce informative, credible, and accountable results? Rapid Software Testing, adopted by context-driven testers worldwide, offers a field-proven answer to this all-too-common dilemma. In this one-day sampler of the approach, Paul Holland introduces you to the skills and practice of Rapid Software Testing through stories, discussions, and "minds-on" exercises that simulate important aspects of real testing problems. The rapid approach isn't just testing with speed or a sense of urgency; it's mission-focused testing that eliminates unnecessary work, assures that the most important things get done, and constantly asks how testers can help speed up the successful completion of the project. Join Paul to learn how rapid testing focuses on both the mindset and skill set of the individual tester who uses tight loops of exploration and critical thinking skills to help continuously re-optimize testing to match clients' needs and expectations.

More Information
Learn more about Paul Holland.

Large-scale testing projects can severely stress many of the testing practices we have gotten used to over the years. This can result in less-than-optimal outcomes. A number of innovative ideas and concepts have emerged to support industrial-strength testing of large and complex projects. Hans Buwalda shares his experiences and the strategies he has developed and used for testing on large projects. Learn how to design tests specifically for automation and how to successfully incorporate keyword testing. The automation discussion will include virtualization and cloud options, how to deal with the numerous versions and configurations common to large projects, and how to handle the complexity added by mobile devices. Hans also outlines the possibilities and pitfalls of outsourcing test automation. The information presented is based on his nineteen years of worldwide experience with testing and test automation involving large projects with test cases executing continuously for many weeks on multiple machines.

More Information
Learn more about Hans Buwalda.
MC Getting Started with Risk-Based Testing
Dale Perry, Software Quality Engineering
Monday, September 30, 2013 - 8:30am - 4:30pm

Whether you are new to testing or looking for a better way to organize your test practices and processes, the Systematic Test and Evaluation Process (STEP™) offers a flexible approach to help you and your team succeed. Dale Perry describes this risk-based framework—applicable to any development lifecycle model—to help you make critical testing decisions earlier and with more confidence. The STEP™ approach helps you decide how to focus your testing effort, what elements and areas to test, and how to organize test designs and documentation. Learn the fundamentals of test analysis and how to develop an inventory of test objectives to help prioritize your testing efforts. Discover how to translate these objectives into a concrete strategy for designing and developing tests. With a prioritized inventory and focused test architecture, you will be able to create test cases, execute the resulting tests, and accurately report on the quality of your application and the effectiveness of your testing. Take back a proven approach to organize your testing efforts and new ways to add more value to your project and organization.

More Information
Learn more about Dale Perry.
MD Application Performance Testing: A Simplified Universal Approach NEW
Scott Barber, PerfTestPlus, Inc.
Monday, September 30, 2013 - 8:30am - 12:00pm

In response to increasing market demand for high-performance applications, many organizations implement performance testing projects, often at great expense. Sadly, these solutions alone are often insufficient to keep pace with emerging expectations and competitive pressures. With specific examples from recent client implementations, Scott Barber shares the fundamentals of implementing T4APM™—a simple and universal approach that is valuable independently or as an extension of existing performance testing programs. The T4APM™ approach hinges on applying a simple and unobtrusive “Target, Test, Trend, Tune” cycle to tasks in your application lifecycle—from a single unit test through entire system production monitoring. Leveraging T4APM™ on a particular task may require knowledge specific to the task, but learning how to leverage the approach does not. Scott provides everything you need to become the T4APM™ coach and champion, and to help your team keep up with increasing demand for better performance, regardless of your current title or role.

More Information
Learn more about Scott Barber.
ME Leading Change—Even If You’re Not in Charge
Jennifer Bonine, tap|QA, Inc.
Monday, September 30, 2013 - 8:30am - 12:00pm

Has this happened to you? You try to implement a change in your organization and it doesn’t get the support that you thought it would. And, to make matters worse, you can't figure out why. Or, you have a great idea but can’t get the resources required for successful implementation. Jennifer Bonine shares a toolkit of techniques to help you determine which ideas will—and will not—work within your organization. This toolkit includes five rules for change management, a checklist to help you determine the type of change process needed in your organization, techniques for communicating your ideas to your target audience, a set of questions you can ask to better understand your executives’ goals, and methods for overcoming resistance to change from teams you don’t lead. These tools—together with an awareness of your organization’s core culture—will help you identify which changes you can successfully implement and which you should leave until another day.

More Information
Learn more about Jennifer Bonine.
MF Implementing Crowdsourced Testing NEW
Rajini Padmanaban, QA InfoTech
Mukesh Sharma, QA InfoTech
Monday, September 30, 2013 - 8:30am - 12:00pm

In today’s market, global outreach, quick time to release, and a feature-rich design are the major factors that determine a product’s success. Organizations are constantly on the lookout for innovative testing techniques to match these driving forces. Crowdsourced testing is a paradigm increasing in popularity because it addresses these factors through its scale, flexibility, cost effectiveness, and fast turnaround. Join Rajini Padmanaban and Mukesh Sharma as they describe what it takes to implement a crowdsourced testing effort including its definition, models, relevance to today’s development world, and challenges and mitigation strategies. Rajini and Mukesh share the facts and myths about crowdsourced testing. They span a range of theory and practice including case studies of real-life experiences and exercises to illustrate the message, and explain what it takes to maximize the benefits of a crowdsourced test implementation.

More Information
Learn more about Rajini Padmanaban and Mukesh Sharma.
MG Rapid Software Testing: Strategy NEW
James Bach, Satisfice, Inc.
Monday, September 30, 2013 - 8:30am - 12:00pm

A test strategy is the set of ideas that guides your test design. It's what explains why you test this instead of that, and why you test this way instead of that way. Strategic thinking matters because testers must make quick decisions about what needs testing right now and what can be left alone. You must be able to work through major threads without being overwhelmed by tiny details. James Bach describes how test strategy is organized around risk but is not defined before testing begins. Rather, it evolves alongside testing as we learn more about the product. We start with a vague idea of our strategy, organize it quickly, and document as needed in a concise way. In the end, the strategy can be as formal and detailed as you want it to be. In the beginning, though, we start small. If you want to focus on testing and not paperwork, this approach is for you.

More Information
Learn more about James Bach.
MH Management Issues in Test Automation
Dorothy Graham, Software Test Consultant
Monday, September 30, 2013 - 8:30am - 12:00pm

Many organizations never achieve the significant benefits that are promised from automated test execution. Surprisingly often, this is due not to technical factors but to management issues. Dot Graham describes the most important management concerns the test manager must address for test automation success, and helps you understand and choose the best approaches for your organization—no matter which automation tools you use or your current state of automation. Dot explains how automation affects staffing, who should be responsible for which automation tasks, how managers can best support automation efforts leading to success, and what return on investment means in automated testing and what you can realistically expect. Dot also reviews the key technical issues that can make or break the automation effort. Come away with an example set of automation objectives and measures, and a draft test automation strategy that you can use to plan or improve your own automation.

More Information
Learn more about Dorothy Graham.
MI Measurement and Metrics for Test Managers
Rick Craig, Software Quality Engineering
Monday, September 30, 2013 - 8:30am - 12:00pm

To be most effective, test managers must develop and use metrics to help direct the testing effort and make informed recommendations about the software’s release readiness and associated risks. Because one important testing activity is to “measure” the quality of the software, test managers must measure the results of both the development and testing processes. Collecting, analyzing, and using metrics is complicated because many developers and testers are concerned that the metrics will be used against them. Join Rick Craig as he addresses common metrics—measures of product quality, defect removal efficiency, defect density, defect arrival rate, and testing status. Learn the guidelines for developing a test measurement program, rules of thumb for collecting data, and ways to avoid “metrics dysfunction.” Rick identifies several metrics paradigms and discusses the pros and cons of each. Delegates are urged to bring their metrics problems and issues for use as discussion points.
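
Two of the metrics named above reduce to simple arithmetic. As a rough illustration only—these are the common textbook definitions, which may differ from the ones Rick presents:

```python
# Illustrative definitions of two common test metrics.
# Exact definitions vary by organization; these are the textbook forms.

def defect_removal_efficiency(found_before_release: int,
                              found_after_release: int) -> float:
    """DRE: fraction of all known defects caught before release."""
    total = found_before_release + found_after_release
    return found_before_release / total if total else 0.0

def defect_density(defect_count: int, size_kloc: float) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defect_count / size_kloc

# If 90 defects were found in test and 10 escaped to production:
print(defect_removal_efficiency(90, 10))  # 0.9
# 45 defects in a 30 KLOC product:
print(defect_density(45, 30.0))           # 1.5
```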

More Information
Learn more about Rick Craig.
MJ Exploratory Testing Explained
Jon Bach, eBay, Inc.
Monday, September 30, 2013 - 8:30am - 12:00pm

Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value of their work. It is the process of three mutually supportive activities—learning, test design, and test execution—done in parallel. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of effort is spent on procedurally scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer can articulate the process. Jon Bach looks at specific heuristics and techniques of exploratory testing that will help you get the most from this highly productive approach. Jon focuses on the skills and dynamics of exploratory testing, and how it can be combined with scripted approaches.

More Information
Learn more about Jon Bach.
MK Test Estimation for Managers NEW
Julie Gardiner, The Test People
Monday, September 30, 2013 - 1:00pm - 4:30pm

Test estimation is one of the most difficult software development activities to do well. The primary reason is that testing is not an independent activity and is often plagued by upstream destabilizing dependencies. Julie Gardiner describes common problems in test estimation, explains how to overcome them, and reveals six powerful ways to estimate test effort. Some estimation techniques are quick but can be challenged easily; others are more detailed and time consuming to use. The estimation methods Julie discusses include FIA (Finger in the Air), Formula or Percentage, Historical, Consensus of Experts, Work Breakdown Structures, and Estimation Models. Through the use of exercises, you will gain experience using these techniques. Julie looks at how we can approach the “set-in-stone deadlines” often presented to us and effectively communicate estimates for testing to senior management. Spreadsheets and utilities will be given out during this session to help testers, test managers, and development managers improve their estimation practices.
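
Two of the estimation methods listed above are simple enough to sketch in a few lines. The 30% figure below is a placeholder assumption, not a figure from the tutorial:

```python
# Sketch of the "Formula or Percentage" method: test effort as a
# fixed percentage of estimated development effort. The default
# percentage here is an illustrative assumption only.

def estimate_test_effort(dev_effort_days: float,
                         test_pct: float = 0.30) -> float:
    return dev_effort_days * test_pct

# Sketch of a simple "Consensus of Experts" variant: average
# several independent expert estimates.
def consensus_estimate(expert_estimates: list[float]) -> float:
    return sum(expert_estimates) / len(expert_estimates)

print(estimate_test_effort(100))               # ~30 days
print(consensus_estimate([20.0, 30.0, 25.0]))  # 25.0
```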

More Information
Learn more about Julie Gardiner.
ML Testing the Data Warehouse—Big Data, Big Problems NEW
Geoff Horne, NZTester Magazine
Monday, September 30, 2013 - 1:00pm - 4:30pm

Data warehouses are critical systems for collecting, organizing, and making information readily available for strategic decision making. The ability to review historical trends and monitor near real-time operational data is a key competitive advantage for many organizations. Yet the methods for assuring the quality of these valuable assets are quite different from those of transactional systems. Ensuring that appropriate testing is performed is a major challenge for many enterprises. Geoff Horne has led numerous data warehouse testing projects in both the telecommunications and ERP sectors. Join Geoff as he shares his approaches and experiences, focusing on the key “uniques” of data warehouse testing: methods for assuring data completeness, monitoring data transformations, measuring quality, and more. Geoff explores the opportunities for test automation as part of the data warehouse process, describing how you can harness automation tools to streamline the work and minimize overhead.

More Information
Learn more about Geoff Horne.
MM Exploratory Testing Is Now in Session
Jon Bach, eBay, Inc.
Monday, September 30, 2013 - 1:00pm - 4:30pm

The nature of exploration, coupled with the ability of testers to rapidly apply their skills and experience, make exploratory testing a widely used test approach—especially when time is short. Unfortunately, exploratory testing often is dismissed by project managers who assume that it is not reproducible, measurable, or accountable. If you have these concerns, you may find a solution in a technique called session-based test management (SBTM), developed by Jon Bach and his brother James to specifically address these issues. In SBTM, testers are assigned areas of a product to explore, and testing is time boxed in “sessions” that have mission statements called “charters” to create a meaningful and countable unit of work. Jon discusses—and you practice—the skills of exploration using the SBTM approach. He demonstrates a freely available, open source tool to help manage your exploration and prepares you to implement SBTM in your test organization.

More Information
Learn more about Jon Bach.
MN Essential Test Management and Planning
Rick Craig, Software Quality Engineering
Monday, September 30, 2013 - 1:00pm - 4:30pm

The key to successful testing is effective and timely planning. Rick Craig introduces proven test planning methods and techniques, including the Master Test Plan and level-specific test plans for acceptance, system, integration, and unit testing. Rick explains how to customize an IEEE-829-style test plan and test summary report to fit your organization’s needs. Learn how to manage test activities, estimate test efforts, and achieve buy-in. Discover a practical risk analysis technique to prioritize your testing and become more effective with limited resources. Rick offers test measurement and reporting recommendations for monitoring the testing process. Discover new methods and develop renewed energy for taking your organization’s test management to the next level.

More Information
Learn more about Rick Craig.
MO Build Your Mobile Testing Expertise NEW
Karen N. Johnson, Software Test Management, Inc.
Monday, September 30, 2013 - 1:00pm - 4:30pm

Are you overwhelmed by the number of mobile devices you need to test? The device market is large and new devices become available almost weekly. Karen Johnson discusses three key mobile testing challenges—device selection, user interface, and device and application settings—and leads you through each. Learn how to select which devices to test and how to keep up-to-date in the ever-changing mobile market. Need to learn about user interface testing on mobile? Karen reviews mobile UX concepts and design. Wonder what device settings can impact your mobile app testing? Karen reviews common settings you need to consider. In addition to these mobile testing challenges, Karen guides you on how to conduct a competitive analysis of mobile apps. Learning how to conduct a survey of mobile apps and becoming aware of your competitors’ offerings are important to grow your own mobile knowledge.

More Information
Learn more about Karen N. Johnson.
MP Rapid Software Testing: Reporting NEW
James Bach, Satisfice, Inc.
Monday, September 30, 2013 - 1:00pm - 4:30pm

Test reporting is something few testers take time to practice. Nevertheless, it's a fundamental skill—vital for your professional credibility and your own self-management. Many people think management judges testing by bugs found or test cases executed. Actually, testing is judged by the story it tells. If your story sounds good, you win. A test report is the story of your testing. It begins as the story we tell ourselves, each moment we are testing, about what we are doing and why. We use the test story within our own minds, to guide our work. James Bach explores the skill of test reporting and examines some of the many different forms a test report might take. As in other areas of testing, context drives good reporting. Sometimes we make an oral report; occasionally we need to write it down. Join James for an in-depth look at the art of reporting.

More Information
Learn more about James Bach.
MQ How to Break Software: Embedded Edition NEW
Jon Hagar, Grand Software Testing
Monday, September 30, 2013 - 1:00pm - 4:30pm

In the tradition of James Whittaker’s book series How to Break … Software, Jon Hagar applies the testing “attack” concept to the domain of embedded software systems. Jon defines the sub-domain of embedded software and examines the issues of product failure caused by defects in that software. Next, he shares a set of attacks against embedded software based on common modes of failure that testers can direct against their own software. For specific attacks, Jon explains when and how to conduct the attack, as well as why the attack works to find bugs. In addition to learning these testing skills, attendees get to practice the attacks on a device—a robot that Jon will bring to the tutorial—containing embedded software. Specific attack methods considered include data issues, computation and control structures, hardware-software interfaces, and communications.

More Information
Learn more about Jon Hagar.
TA Mobile Applications Testing: From Concept to Practice SOLD OUT NEW
Jonathan Kohl, Kohl Concepts, Inc.
Tuesday, October 1, 2013 - 8:30am - 4:30pm

As applications for smartphones and tablets become incredibly popular, organizations encounter increasing pressure to quickly and successfully deliver testing for these devices. When faced with a mobile testing project, many testers find it tempting to apply the same methods and techniques used for desktop applications. Although some of these concepts transfer directly, testing mobile applications presents its own special challenges. Jonathan Kohl says if you follow the same practices and techniques as you have before, you will miss critical defects. Learn how to effectively test mobile applications, and how to add more structure and organization to generate effective test ideas to exploit the capabilities and weaknesses of mobile devices. Jonathan shares first-hand experiences with testing mobile applications and discusses how to address various challenges. Work on real problems on your own device, and learn firsthand how to be productive while testing mobile applications.

Note: This is a hands-on course. Participants must bring their own mobile device for course exercises.

More Information
Learn more about Jonathan Kohl.
TB Key Test Design Techniques
Lee Copeland, Software Quality Engineering
Tuesday, October 1, 2013 - 8:30am - 4:30pm

All testers know that we can identify many more test cases than we will ever have time to design and execute. The key problem in testing is choosing a small, “smart” subset from the almost infinite number of possibilities available. Join Lee Copeland to discover how to design test cases using formal black-box techniques, including equivalence class and boundary value testing, decision tables, state-transition diagrams, and all-pairs testing. Explore white-box techniques with their associated coverage metrics. Evaluate more informal approaches, such as random and hunch-based testing, and learn the importance of using exploratory testing to enhance your testing ability. Choose the right test case design approaches for your projects. Use the test results to evaluate the quality of both your products and your test designs.
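
One of the black-box techniques named above, boundary value analysis, can be sketched in a few lines. This is one common formulation—test just below, at, and just above each edge of a valid range—and the tutorial may present variants:

```python
# Minimal sketch of boundary value analysis for a numeric input
# accepting values in [low, high]: generate the values just below,
# at, and just above each boundary.

def boundary_values(low: int, high: int) -> list[int]:
    return sorted({low - 1, low, low + 1, high - 1, high, high + 1})

# For an input field accepting 1..100:
print(boundary_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```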

More Information
Learn more about Lee Copeland.
TC Critical Thinking for Software Testers
James Bach, Satisfice, Inc.
Tuesday, October 1, 2013 - 8:30am - 4:30pm

Critical thinking is the kind of thinking that specifically looks for problems and mistakes. Regular people don't do a lot of it. However, if you want to be a great tester, you need to be a great critical thinker. Critically thinking testers save projects from dangerous assumptions and ultimately from disasters. The good news is that critical thinking is not just innate intelligence or a talent—it's a learnable and improvable skill you can master. James Bach shares the specific techniques and heuristics of critical thinking and presents realistic testing puzzles that help you practice and increase your thinking skills. Critical thinking begins with just three questions—Huh? Really? and So?—that kick start your brain to analyze specifications, risks, causes, effects, project plans, and anything else that puzzles you. Join James for this interactive, hands-on session and practice your critical thinking skills. Study and analyze product behaviors and experience new ways to identify, isolate, and characterize bugs.

More Information
Learn more about James Bach.
TD The Craft of Bug Investigation
Jon Bach, eBay, Inc.
Tuesday, October 1, 2013 - 8:30am - 12:00pm

Although many training classes and conference presentations describe processes and techniques meant to help you find bugs, few explain what to do when you find a good one. How do you know what the underlying problem is? What do you do when you find a bug, and the developer wants you to provide more information? How do you reproduce those pesky, intermittent bugs that come in from customer land? In this hands-on class, Jon Bach helps you practice your investigation and analysis skills—questioning, conjecturing, branching, and backtracking. For those of you who have ever had to tell the story about the big bug that got away, Jon offers up new techniques that may trap it next time so you can earn more credibility, respect, and accolades from stakeholders. Because collaboration and participation are encouraged in this class, bring your mental tester toolkit, tester’s notebook, and an open mind.

More Information
Learn more about Jon Bach.
TE Discovering New Test Ideas: Getting that Burst of Creativity NEW
Karen N. Johnson, Software Test Management, Inc.
Tuesday, October 1, 2013 - 8:30am - 12:00pm

Feel your testing’s stuck in a rut? Looking for new ways to discover test ideas? Wondering if your testers have constructive methods to discover different approaches for testing? In this interactive session, Karen Johnson explains how to use heuristics to find new ideas. After a brief discussion, Karen has you apply and practice with a variety of heuristics. Need to step back and consider some of your testing challenges from a fresh perspective? This workshop explores the use of the CIA’s tool, the Phoenix Checklist, a set of intentionally designed context-free questions that can help you look at a problem or challenge from a fresh perspective. Karen reviews the fun and useful tool of brainstorming and variations on brainstorming that you can use with your team. Come join a session designed to explore creative ways to strengthen your approach to testing.

More Information
Learn more about Karen N. Johnson.
TF Alan Page: On Testing NEW
Alan Page, Microsoft
Tuesday, October 1, 2013 - 8:30am - 12:00pm

You name the testing topic, and Alan Page has an opinion on it, hands-on practical experience with it—or both. Spend the morning with Alan as he discusses a variety of topics, trends, and tales of software engineering and software testing. In an interactive format loosely based on discovering new testing ideas—and bringing new life to some of the old ideas—Alan shares experiences and stories from his twenty-year career as a software tester. Topics may include philosophical rants about code coverage and test pass rates; thoughts on the developer/tester relationship and quality ownership; and insights on test leadership and the real future of test. Join Alan for a unique opportunity to participate in intriguing discussions about testing that will expand your testing knowledge, give you the insight you need to grow your own career, and help your organization succeed.

More Information
Learn more about Alan Page.
TG Patterns in Test Automation: Issues and Solutions SOLD OUT NEW
Dorothy Graham, Software Test Consultant
Seretta Gamba, Steria Mummert ISS GmbH
Tuesday, October 1, 2013 - 8:30am - 12:00pm

Testers often encounter problems when automating test execution. The surprising thing is that many testers encounter the very same problems, over and over again. These problems often have known solutions, yet many testers are not aware of them. Recognizing the commonality of these test automation issues and their solutions, Seretta Gamba and Dorothy Graham have organized them into a set of test automation patterns. A pattern is a general, reusable solution to a commonly occurring problem. For many years, patterns have been identified, defined, catalogued, and used in software development, but they are not commonly recognized in test automation. Seretta and Dot help you recognize your test automation problems and show you how to identify appropriate patterns to help solve them. Patterns address issues such as No Previous Automation, High ROI Expectations, and High Test Maintenance Cost.

More Information
Learn more about Dorothy Graham and Seretta Gamba.
TH How to Break Software: Robustness Edition
Dawn Haynes, PerfTestPlus, Inc.
Tuesday, October 1, 2013 - 8:30am - 12:00pm

Have you ever worked on a project where you felt testing was thorough and complete—all of the features were covered and all of the tests passed—yet in the first week in production the software had serious issues and problems? Join Dawn Haynes to learn how to inject robustness testing into your projects to uncover those issues before release. Robustness—an important and often overlooked area of testing—is the degree to which a system operates correctly in the presence of exceptional inputs or stressful environmental conditions. By expanding basic tests and incorporating specific robustness attacks, Dawn shows you how to catch defects that commonly show up first in production. She offers strategies for making robustness testing a project-level concern so those defects get the priority they deserve and are fixed before release. Join Dawn to learn about robustness tests you can add to your suite and execute in just a few minutes—even if your test team is over-tasked and under-resourced.

More Information
Learn more about Dawn Haynes.
TI Exploring Usability Testing NEW
Rob Sabourin, AmiBug.com
Tuesday, October 1, 2013 - 8:30am - 12:00pm

It is not enough to verify that software conforms to requirements by passing established acceptance tests. Successful software products engage, entertain, and support the users' experience. While goals vary from project to project, no matter how robust and reliable your software is, if your users do not embrace it, business can slip from your hands. Rob Sabourin shares how to elicit effective usability requirements with techniques such as storyboarding and task analysis. Together, testers, programmers, and users collaborate to blend the requirement, design, and test cycles into a tight feedback loop. Learn how to select a subset of system functions to test with a small group of users to get high value information at low cost. Learn how usability testers can take advantage of naïve questions from novice users as well as the tunnel vision and bias of domain experts. Rob shares examples of usability testing for a variety of technologies including mobile and web-based products.

More Information
Learn more about Rob Sabourin.
TJ Improve Your Social and In-Person Networking Skills NEW
Johanna Rothman, Rothman Consulting Group, Inc.
Tuesday, October 1, 2013 - 8:30am - 12:00pm

You don’t have to be a social butterfly to succeed with social networking. As a manager, tester, or QA professional, you need to differentiate yourself from the pretenders. If you are a “doer,” it’s time to start building your reputation at work and extending your reach on social networking sites and discussion forums, through online participation, and at conferences like STAR. Whether you are searching for a new job, recruiting a candidate, or looking for new ways to solve problems, you need to know how to network. However, as a professional, you want to network with authenticity by making a “warm” connection—having a reason to connect and something to give. Johanna Rothman helps you recognize and analyze your current business relationships and plan ways to expand and extend your networking. Leave with an action plan and a new, budding network to help you implement that plan.

More Information
Learn more about Johanna Rothman.
TK Production Performance Testing in the Cloud
Dan Bartow, SOASTA, Inc.
Tuesday, October 1, 2013 - 1:00pm - 4:30pm

Testing in production for online applications has evolved into a critical component of successful performance testing strategies. Dan Bartow explains the fundamentals of cloud computing, its application to full-scale performance validation, and the practices and techniques needed to design and execute a successful testing-in-production strategy. Drawing on his experiences, Dan describes the methodology he has used for testing numerous online applications in a production environment with minimal disruption. He explains how to create a performance testing strategy to give your team critical data about how your online application performs and scales. Learn how to create a robust lab-to-production ecosystem that delivers the answers about what will happen when peak traffic hits your site. Take back practical approaches to mitigate the three most common problems—security, test data, and potential live customer impact—that arise when embarking on testing in production.

More Information
Learn more about Dan Bartow.
TL Security Testing for Testing Professionals NEW
Jeff Payne, Coveros, Inc.
Tuesday, October 1, 2013 - 1:00pm - 4:30pm

Today’s software applications are often security-critical, making security testing an essential part of a software quality program. Unfortunately, most testers have not been taught how to effectively test the security of the software applications they validate. Join Jeff Payne as he shares what you need to know to integrate effective security testing into your everyday software testing activities. Learn how software vulnerabilities are introduced into code and exploited by hackers. Discover how to define and validate security requirements. Explore effective test techniques for assuring that common security features are tested. Learn about the most common security vulnerabilities and how to identify key security risks within applications and use testing to mitigate them. Understand how to security test applications—both web- and GUI-based—during the software development process. Review examples of how common security testing tools work and assist the security testing process. Take home valuable tools and techniques for effectively testing the security of your applications going forward.

More Information
Learn more about Jeff Payne.
TM Innovation Thinking: Evolve and Expand Your Capabilities NEW
Jennifer Bonine, tap|QA, Inc.
Tuesday, October 1, 2013 - 1:00pm - 4:30pm

Innovation is a word tossed around frequently in organizations today. The standard cliché is “Do more with less.” People and teams want to be innovative but often struggle with how to define, prioritize, implement, and track their innovation efforts. Jennifer Bonine shares the "Innovation Types" model to give you new tools to evolve and expand your innovation capabilities. Find out if your innovation ideas and efforts match your team and company goals. Learn how to classify your innovation and improvement efforts as core (to the business) or context (essential but non-revenue generating). With this data, you can better decide how much of your effort should be spent on core versus context activities. Take away new tools for classifying innovation and mapping your activities and your team’s priorities to their importance and value. With Jennifer’s guidance you’ll evolve and expand your innovation capabilities on the spot.

More Information
Learn more about Jennifer Bonine.
TN Collaboration Techniques: Forgotten Wisdom and New Approaches SOLD OUT NEW
Rob Sabourin, AmiBug.com
Dorothy Graham, Software Test Consultant
Tuesday, October 1, 2013 - 1:00pm - 4:30pm

In our increasingly agile world, the new buzzword is collaboration—so easy to preach but difficult to do well. Testers are challenged to work directly and productively with customers, programmers, business analysts, writers, trainers, and pretty much everyone in the business value chain. Testers and managers have many touch points of collaboration: grooming stories with customers, sprint planning with team members, reviewing user interaction with customers, troubleshooting bugs with developers, whiteboarding with peers, and buddy checking. Rob Sabourin and Dot Graham describe how collaboration worked on several agile projects, giving critiques of what worked well, where problems could arise, and additional aspects to consider. Join Rob and Dot to look at examples from agile projects and how forgotten but proven “ancient” techniques can be applied to your own collaboration, such as entry and exit criteria, role diversity, risk-based objectives, checklists, cross-checking, and root cause analysis. Bring your own stories of collaboration—good and bad—and see how forgotten wisdom can help improve today’s practices.

More Information
Learn more about Rob Sabourin and Dorothy Graham.
TO Introducing Keyword-Driven Test Automation
Hans Buwalda, LogiGear
Tuesday, October 1, 2013 - 1:00pm - 4:30pm

In both agile and traditional projects, keyword-driven testing has proven to be a powerful way to attain a high level of automation—when it is done correctly. Many testing organizations use keyword-driven testing but aren’t realizing the full benefits of scalability and maintainability that are essential to keep up with the demands of testing today’s software. Hans Buwalda outlines how you can meet what he calls the “5 percent challenges”—automate 95 percent of your tests with no more than 5 percent of your total testing effort—using his proven, keyword-driven test method. Hans also discusses how the keyword approach relates to other automation techniques like scripting and data-driven testing. Use the information and real-world application Hans presents to attain a very high level of automation with the lowest possible effort.

More Information
Learn more about Hans Buwalda.
TP Test Managers: How You Can Really Make a Difference NEW
Julie Gardiner, The Test People
Tuesday, October 1, 2013 - 1:00pm - 4:30pm

When leading a test team or working in an agile team, becoming a trusted advisor to other stakeholders is paramount. This requires three key skills: earning trust, giving advice, and building relationships. Join Julie Gardiner as she explores each of these skills, describing why and how a trusted advisor develops different “mindsets.” Julie shares a framework of “quick-wins” for test managers and team leaders who need to show the value of testing on projects. To help provide timely, relevant information to stakeholders, she shares seven powerful monitoring and predicting techniques. Julie demonstrates three objective measures showing how testing adds value to organizations. To make sure that everyone is on the same page, Julie urges managers to establish a foundation for testing through well-defined policy statements, agreed to and sanctioned by senior management. Receive a set of spreadsheets and utilities to support your activities as a test manager who really makes a difference.

More Information
Learn more about Julie Gardiner.
TQ How to Break Software: Web 101+ Edition NEW
Dawn Haynes, PerfTestPlus, Inc.
Tuesday, October 1, 2013 - 1:00pm - 4:30pm

When testing web applications, you may feel overwhelmed by the technologies of today's web environments. Web testing today requires more than just exercising a system’s functionality. Each system is composed of a customized mix of various layers of technology, each implemented in a different programming language and requiring unique testing strategies. This “stew” often leads to puzzling behavior across browsers; performance problems due to page design and content, server locations, and architecture; and inconsistent operation of navigation controls. Dawn Haynes shares an extensive set of test design ideas, standards, and software attacks. She explains their general applicability, the effort needed to execute them, and the technical skill required for success, so you can determine what’s useful in your situation. Dawn demonstrates a variety of tools to help you improve your web testing of HTML syntax, page layout, download speeds, 508 compliance, readability, and more. From techniques that are easy and quick to implement to the hard technical stuff, Dawn has something for every web tester.

More Information
Learn more about Dawn Haynes.