Monday Tutorials
Monday, October 01, 2012 8:30 AM
MA
Key Test Design Techniques (C) | Full-day | Lee Copeland, Software Quality Engineering
All testers know that we can identify many more test cases than we will ever have time to design and execute. The major problem in testing is choosing a small, “smart” subset from the almost infinite number of possibilities available. Join Lee Copeland to discover how to design test cases using formal black-box techniques, including equivalence class and boundary value testing, decision tables, state-transition diagrams, and all-pairs testing. Explore white-box techniques with their associated coverage metrics. Evaluate more informal approaches, such as random and hunch-based testing, and learn the importance of using exploratory testing to enhance your testing ability. Choose the right test case design approaches for your projects. Use the test results to evaluate the quality of both your products and your test designs.
Learn more about Lee Copeland
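For readers unfamiliar with the first two techniques named above, here is a minimal sketch of equivalence class and boundary value testing. The discount rule, its 100-unit threshold, and the test values are invented for illustration and are not taken from the tutorial materials.

    # Sketch: equivalence classes and boundary values for a hypothetical rule,
    # "orders of 100 units or more get a 10% discount" (invented example).

    def discount_rate(quantity: int) -> float:
        """Return the discount rate for an order quantity."""
        if quantity < 0:
            raise ValueError("quantity cannot be negative")
        return 0.10 if quantity >= 100 else 0.0

    # One representative value per equivalence class, plus values on each side
    # of the class boundaries, where off-by-one defects typically hide.
    test_cases = [
        (-1, ValueError),   # invalid class: negative quantity
        (0, 0.0),           # lower boundary of the "no discount" class
        (50, 0.0),          # representative of the "no discount" class
        (99, 0.0),          # boundary just below the discount threshold
        (100, 0.10),        # boundary at the discount threshold
        (101, 0.10),        # boundary just above the threshold
        (500, 0.10),        # representative of the "discount" class
    ]

    for quantity, expected in test_cases:
        if expected is ValueError:
            try:
                discount_rate(quantity)
                print(f"FAIL: discount_rate({quantity}) should have raised ValueError")
            except ValueError:
                print(f"PASS: discount_rate({quantity}) raised ValueError as expected")
        else:
            actual = discount_rate(quantity)
            verdict = "PASS" if actual == expected else "FAIL"
            print(f"{verdict}: discount_rate({quantity}) = {actual}, expected {expected}")

The value of the technique is that a handful of carefully chosen cases stands in for the effectively infinite input space the abstract describes.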
Monday, October 01, 2012 8:30 AM
MB
A Rapid Introduction to Rapid Software Testing (C) | Full-day | Michael Bolton, DevelopSense, Inc.
You're under tight time pressure and have barely enough information to proceed with testing. How do you test quickly and inexpensively, yet still produce informative, credible, and accountable results? Rapid Software Testing, adopted by context-driven testers worldwide, offers a field-proven answer to this all-too-common dilemma. The “rapid” approach isn't just testing with speed or a sense of urgency; it's mission-focused testing that eliminates unnecessary work, assures that the most important things get done, and constantly asks how testers can help speed up the successful completion of the project. Rapid testing focuses on both the mindset and skill set of the individual tester, who uses tight loops of exploration and critical-thinking skills to continuously re-optimize testing to match clients' needs and expectations. In this one-day sampler of the approach, Michael Bolton introduces you to the skills and practice of Rapid Software Testing through stories, discussions, and "minds-on" exercises that simulate important aspects of real testing problems.
Participants are strongly encouraged to bring a Windows-based laptop computer to the workshop.
Learn more about Michael Bolton
Monday, October 01, 2012 8:30 AM
MC
The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More (New) | Full-day | Hans Buwalda, LogiGear
Hans Buwalda shares his experiences and the strategies he's developed over the years for testing on large projects. When "normal" testing practices are stressed on a larger scale, a multitude of innovative ideas and concepts must emerge to support the industrial-strength practices such projects demand. Learn the significance of keyword automation on big test projects, how to design tests specifically for automation, how to fit automation and scaling into your strategy, and how to plan and manage such projects. Hans also explores how virtualization and cloud options can help you leverage resources, and how to deal with the numerous versions and configurations common to large projects. Then, Hans points out the possibilities and pitfalls of outsourcing test automation. The information presented is based on his seventeen years of worldwide experience with testing and test automation on projects that, in some cases, must execute continuously for many weeks on numerous machines.
Learn more about Hans Buwalda
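For context on the keyword automation mentioned above, the following is a minimal sketch of the keyword-driven style: tests are written as tables of business-level keywords, and a small engine maps each keyword to automation code. The keyword vocabulary, the login test table, and the dispatcher are invented for illustration and do not represent Hans Buwalda's framework or any particular tool.

    # Minimal sketch of keyword-driven testing (invented keywords and data).

    def open_page(url):
        print(f"opening {url}")

    def enter_text(field, value):
        print(f"typing '{value}' into {field}")

    def click(button):
        print(f"clicking {button}")

    def verify_text(expected):
        print(f"verifying page shows '{expected}'")

    # Keyword vocabulary: test designers compose tests from these verbs,
    # while automation engineers maintain the implementations behind them.
    KEYWORDS = {
        "open page": open_page,
        "enter text": enter_text,
        "click": click,
        "verify text": verify_text,
    }

    # A test is just a table of keyword rows, often maintained in a spreadsheet.
    login_test = [
        ("open page", "https://example.com/login"),
        ("enter text", "username", "tester1"),
        ("enter text", "password", "secret"),
        ("click", "Log in"),
        ("verify text", "Welcome, tester1"),
    ]

    def run(test):
        # Dispatch each row to the function registered for its keyword.
        for keyword, *args in test:
            KEYWORDS[keyword](*args)

    run(login_test)

Separating the test tables from the automation code is what lets this style scale to the large projects the tutorial addresses: thousands of keyword-based tests can be maintained without touching the underlying scripts.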
Monday, October 01, 2012 8:30 AM
MD
Fundamentals of Risk-based Testing (C) | Full-day | Dale Perry, Software Quality Engineering
Whether you are new to testing or looking for a better way to organize your test practices and processes, the Systematic Test and Evaluation Process (STEP™) offers a flexible approach to help you and your team succeed. Dale Perry describes this risk-based framework—applicable to any development lifecycle model—to help you make critical testing decisions earlier and with more confidence. The STEP™ approach helps you decide how to focus your testing effort, what elements and areas to test, and how to organize test designs and documentation. Learn the fundamentals of test analysis and how to develop an inventory of test objectives to help prioritize your testing efforts. Discover how to translate these objectives into a concrete strategy for designing and developing tests. With a prioritized inventory and focused test architecture, you will be able to create test cases, execute the resulting tests, and accurately report on the quality of your application and the effectiveness of your testing. Take back a proven approach to organize your testing efforts and new ways to add more value to your project and organization.
Learn more about Dale Perry
Monday, October 01, 2012 8:30 AM
ME
Leading Change—Even If You’re Not in Charge (C) | Morning | Jennifer Bonine, Up Ur Game Learning Solutions
Has this happened to you? You try to implement a change in your organization and it doesn’t get the support that you thought it would. And, to make matters worse, you can't figure out why. Or, you have a great idea but can’t get the resources required for successful implementation. Jennifer Bonine shares a toolkit of techniques to help you determine which ideas will—and will not—work within your organization. This toolkit includes five rules for change management, a checklist to help you determine the type of change process needed in your organization, techniques for communicating your ideas to your target audience, a set of questions you can ask to better understand your executives’ goals, and methods for overcoming resistance to change from teams you don’t lead. These tools—together with an awareness of your organization’s core culture—will help you identify which changes you can successfully implement and which you should leave until another day.
Learn more about Jennifer Bonine
Monday, October 01, 2012 8:30 AM
MF
Seven Keys to Navigating Your Agile Testing Transition (New) | Morning | Bob Galen, RGalen Consulting; Mary Thorn, Deutsche Bank Global Technologies
So you’ve “gone agile” and have been relatively successful for a year or so. But how do you know how well you’re really doing? How do you continuously improve your practices? And when things get rocky, how do you handle the challenges without reverting to old habits? You realize that the path to high-performance agile testing isn’t easy or quick. It also helps to have a guide. So consider this workshop your guide to ongoing, improved, and sustained high performance. Join seasoned agile testing coaches Bob Galen and Mary Thorn as they share lessons from their most successful agile testing transitions. You’ll explore actual team case studies of agile testing excellence, covering building team skills, embracing agile requirements, fostering customer interaction, building agile automation, driving business value, and testing at scale. You’ll examine the mistakes, adjustments, and successes, so you’ll learn how to react to real-world contexts. You’ll leave with a better view of your teams’ strengths and weaknesses and where you need to focus to improve.
Learn more about Bob Galen, Mary Thorn
Monday, October 01, 2012 8:30 AM
MG
Measurement and Metrics for Test Managers (C) | Morning | Rick Craig, Software Quality Engineering
To be most effective, test managers must develop and use metrics to help direct the testing effort and make informed recommendations about the software’s release readiness and associated risks. Because one important testing activity is to “measure” the quality of the software, test managers must measure the results of both the development and testing processes. Collecting, analyzing, and using metrics is complicated because many developers and testers are concerned that the metrics will be used against them. Join Rick Craig as he addresses common metrics—measures of product quality, defect removal efficiency, defect density, defect arrival rate, and testing status. Learn the guidelines for developing a test measurement program, rules of thumb for collecting data, and ways to avoid “metrics dysfunction.” Rick identifies several metrics paradigms and discusses the pros and cons of each. Delegates are urged to bring their metrics problems and issues for use as discussion points.
Learn more about Rick Craig
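As a rough illustration of two of the measures listed above, the sketch below uses the commonly cited formulas for defect removal efficiency and defect density. The figures are invented, and the tutorial's exact definitions may differ.

    # Commonly cited definitions (the tutorial's working definitions may vary):
    #   defect removal efficiency (DRE) = defects found before release
    #                                     / (defects found before release + after release)
    #   defect density = total defects found / product size (e.g., per KLOC)

    defects_found_in_testing = 180       # invented figures for illustration
    defects_found_after_release = 20
    size_kloc = 50                       # thousand lines of code

    dre = defects_found_in_testing / (defects_found_in_testing + defects_found_after_release)
    defect_density = (defects_found_in_testing + defects_found_after_release) / size_kloc

    print(f"Defect removal efficiency: {dre:.0%}")                 # 90%
    print(f"Defect density: {defect_density:.1f} defects/KLOC")    # 4.0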
Monday, October 01, 2012 8:30 AM
MH
Strengthening Your SQL Skills (New) | Morning | Karen N. Johnson, Software Test Management, Inc.
Did you learn SQL sometime in your past—but now cannot remember how to write a left outer join? Or perhaps you never quite mastered inner versus outer joins, much less left versus right. As a tester, you might need to access your applications’ databases directly, without going through the application itself—and developing SQL skills can help. Join Karen Johnson for this SQL refresher and learn to write complex queries, including joins and unions, nested queries, and the syntax for sorting and manipulating data. The tutorial includes a review of queries, subqueries, and joins, plus a brief look at stored procedures. With exercises, sample data models, and scripts, Karen guides your practice time to improve your ability to solve common testing problems using SQL. This tutorial is designed for those at an intermediate or advanced SQL skill level.
Participants are strongly encouraged to bring a laptop with a database environment such as MySQL, SQL Server, or Oracle already installed. If you don’t have one of these environments, bring a laptop with Admin access so you can install a MySQL environment, run scripts to load data, and get practicing!
Learn more about Karen N. Johnson
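For readers wanting a preview of the join material, here is a minimal sketch of an inner join versus a left outer join, run through Python's built-in sqlite3 module. The tables and data are invented and are not the tutorial's sample data models or scripts.

    # Sketch: inner join vs. left outer join with invented sample data.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
        INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace'), (3, 'Edsger');
        INSERT INTO orders VALUES (10, 1, 25.00), (11, 1, 40.00), (12, 2, 15.50);
    """)

    # INNER JOIN: only customers who have at least one order appear.
    inner = con.execute("""
        SELECT c.name, o.total
        FROM customers c
        INNER JOIN orders o ON o.customer_id = c.id
        ORDER BY c.name
    """).fetchall()

    # LEFT OUTER JOIN: every customer appears; those without orders get NULL.
    left = con.execute("""
        SELECT c.name, o.total
        FROM customers c
        LEFT OUTER JOIN orders o ON o.customer_id = c.id
        ORDER BY c.name
    """).fetchall()

    print("inner join:", inner)   # Edsger is missing
    print("left join: ", left)    # Edsger appears with total = None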
Monday, October 01, 2012 8:30 AM
MI
What We Can Learn from the Big Bugs that Got Away (C) | Morning | Ken Johnston, Microsoft
If you have ever shipped a piece of software, you probably have your own “big bug that got away” story. Some missed bugs are bizarre edge cases and perhaps forgivable. Others are enormous errors that have serious financial or business impact. In either case, the important thing isn’t so much that the bug got through testing and out into the wild (unless you are working on rocket or nuclear plant systems) but what you learn from those bugs. In this fun and exciting workshop, Ken Johnston presents his own Microsoft bug stories, such as the elusive “Sasquatch” and “Billing 101” bugs that got away. Sharing other case studies from inside Microsoft and partner companies, Ken explores root cause concepts such as seasonality, serialization, certification blind spots, and QoS for services. Take away models for doing your own root cause analysis and implementing process improvements within your team. Bring your current bug challenges and explore others’ bugs in this highly interactive session—where you learn while commiserating with your peers.
Learn more about Ken Johnston
Monday, October 01, 2012 8:30 AM
MJ
Compact Test Process Improvement (New) | Morning | Martin Pol, Polteq Test Services BV; Ruud Teunissen, Polteq Test Services BV
Formal test improvement models such as TPI NEXT®, TMMi®, and CMMI® require formal assessments, process change working groups, extensive implementation programs—and very often call for organizational changes. Compact Test Process Improvement makes it possible for you to quickly implement changes within normal day-to-day activities and improve the testing process incrementally. It represents a way to select and implement a set of activities that rapidly improves testing’s contribution to project success. In this hands-on session, Martin Pol and Ruud Teunissen use examples and case studies to show you how to apply Compact TPI in practice. Learn how to create a custom improvement plan for your team or organization, and find out how to select those improvements that are a perfect fit for your context. When you return to the office, you’ll be ready to get your team started on a systematic path for test improvement.
TPI NEXT® is a registered trademark of Sogeti USA LLC. TMMi® is the registered trademark of the TMMi Foundation. CMMI® is a registered trademark of the Software Engineering Institute and Carnegie Mellon University.
Learn more about Martin Pol, Ruud Teunissen
Monday, October 01, 2012 1:00 PM
MK
Team Leadership: Telling Your Testing Stories (New) | Afternoon | Bob Galen, RGalen Consulting
It used to be that your work and results spoke for themselves. No longer is that the case. Today you need to be a better collaborator, communicator, and facilitator so that you focus your teams on delivering value. Join Bob Galen to explore the power of the story, one of the most effective communication paradigms. You can tell stories that create powerful collaboration. You can tell stories that communicate product requirements and customer needs. You can tell stories that inspire teams to deliver results. And you can tell stories that explain your value and successes to your customers and stakeholders. Explore basic storytelling techniques, specific framing for software testing activities, and test leadership storytelling that energizes and guides your teams. Take time to practice telling your stories, while becoming a much better storyteller and leader within your testing efforts.
Learn more about Bob Galen
Monday, October 01, 2012 1:00 PM
ML
Exploratory Testing Explained (C) | Afternoon | Jon Bach, eBay
Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value of their work. It is the process of three mutually supportive activities done in parallel: learning, test design, and test execution. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of effort is spent on procedurally scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer can articulate the process. Jon Bach looks at specific heuristics and techniques of exploratory testing that will help you get the most from this highly productive approach. Jon focuses on the skills and dynamics of exploratory testing, and how it can be combined with scripted approaches.
Learn more about Jon Bach
Monday, October 01, 2012 1:00 PM
MN
Usability Testing in a Nutshell (New) | Afternoon | Julie Gardiner, Sage UK
Because our systems are becoming more complex and the market is becoming saturated with competing brands, usability can be a differentiating factor in purchasing decisions. A classic system requirement is “The system must be user friendly,” but what does that mean, and more importantly, how do we test this requirement? Join Julie Gardiner as she describes usability techniques you can employ to demonstrate a user interface’s efficiency and effectiveness. Find out how to predict and test for usability and, most importantly, for user satisfaction before the software’s release. Take back new knowledge of proven usability testing techniques: heuristic evaluation, cognitive walkthroughs, focus groups, user personas, contextual task analysis, usability labs, and satisfaction surveys. Learn how to define usability goals and how to get management to take usability defects seriously. If you want to improve your skills in usability testing, this session is for you.
Delegates are encouraged to bring a laptop to this session.
Learn more about Julie Gardiner
Monday, October 01, 2012 1:00 PM
MO
Test Management for Cloud-based Applications (New) | Afternoon | Ruud Teunissen, Polteq Test Services BV
Because the cloud introduces additional system risks—Internet dependencies, security challenges, performance concerns, and more—you, as a test manager, need to broaden your scope and update your team’s practices and processes. Ruud Teunissen shares a unique approach that directly addresses more than 140 new testing concerns and risks you may encounter in the cloud. Learn how to identify cloud-specific requirements and the risks that can ensue from those requirements. Then, explore the test strategies you'll need to adopt to mitigate those risks. Examine cloud services selection, implementation, and operations, and dive into the wider scope of test management in the cloud. Take back the ammunition you’ll need to convince senior management that test managers should participate in cloud services selection to help avoid risks before implementation and, further, why you should work with IT operations to extend test activities after the system goes live.
Learn more about Ruud Teunissen
Monday, October 01, 2012 1:00 PM
MP
Free and Cheap Test Tools (C) | Afternoon | Randy Rice, Rice Consulting Services, Inc.
Too often, testers have limited time and little or no money to purchase, learn, and implement commercial test tools. However, the most effective testers have accumulated and regularly use their own personal toolkit of free and cheap test tools. Since 2001, Randy Rice has been researching such tools and has compiled a set of tools that he and many others in his consulting practice have found very helpful. Randy shares a plethora of tools that you can employ to add power and efficiency to your test planning, execution, and evaluation. He will present and demonstrate tools for pairwise test design, test management, defect tracking, test data creation, test automation, test evaluation, web-based load testing, and more. Randy shows you how to make the case for incorporating free and open-source tools into organizations that may resist such tools. Learn how you can combine these tools to achieve greater test speed and better test coverage—with little or no out-of-pocket cost.
Learn more about Randy Rice

