Concurrent Sessions

Sessions are offered on Wednesday and Thursday at the conference and do not require pre-selection. Build your own custom learning schedule, or choose to follow one of our tracks by topic area.

W1 Emotional Intelligence in Software Testing
Thomas McCoy, Australian Department of Families, Housing, Community Services and Indigenous Affairs
Wednesday, May 1, 2013 - 11:30am - 12:30pm

As test managers and test professionals, we can have an enormous emotional impact on others. We're constantly dealing with fragile egos, highly charged situations, and pressured people playing a high-stakes game under conditions of massive uncertainty. We're often the bearers of bad news and are sometimes perceived as critics, activating people's primal fear of being judged. Emotional intelligence (EI), the concept popularized by Harvard psychologist and science writer Daniel Goleman, has much to offer test managers and testers. Key EI skills include self-awareness, self-management, social awareness, and relationship management. Explore the concept of EI, assess your own levels of EI, and look at ways in which EI can help. Thomas McCoy discusses how EI can be useful in dealing with anger management, controlling negative thoughts, processing constructive criticism, and dealing with conflict—all within the context of the testing profession. This lively session is grounded in real-life examples, giving you concrete ideas to take back to work.

More Information
Learn more about Thomas McCoy.
W2 The Test Coverage Outline: Your Testing Road Map
Paul Holland, Testing Thoughts
Wednesday, May 1, 2013 - 11:30am - 12:30pm

To assist in risk analysis, prioritization of testing, and test reporting (telling your testing story), you need a thorough Test Coverage Outline (TCO)—a road map of your proposed testing activities. By creating a TCO, you can prepare for testing without having to create a giant pile of detailed test cases. Paul Holland says that a comprehensive TCO helps the test team to get buy-in for the overall test strategy very early in the project and is valuable for identifying risk areas, testability issues, and resource constraints. Paul describes how to create a TCO including the use of heuristic-based checklists to help ensure you don’t overlook important elements in your testing. Learn multiple approaches for critical information gathering, the artifacts used as input for creating a TCO, and how you can use a TCO to maintain testing focus. Take back a new, lightweight tool to help you tell the testing story throughout your project.

More Information
Learn more about Paul Holland.
W3 The Pathologies of Failed Test Automation Projects
Michael Stahl
Wednesday, May 1, 2013 - 11:30am - 12:30pm

Most test automation projects never die—they just become a mess and are redone. Initial solutions that start well and are full of promise often end up as brittle and unmaintainable monsters consuming more effort than they save. Political feuds can flourish as different automation solutions compete for attention and dominance. Tests become inefficient in both execution time and resource usage. Disillusionment ensues, projects are redefined, and the cycle begins again. Surely we can learn how to avoid such trouble on the next project. Michael Stahl has analyzed automation projects and identified recognizable failure patterns—mushrooming, duplication, going for the numbers, and others. Michael describes these patterns, suggests how to detect them early, and shares ways to avoid or mitigate them. Whether your team is just starting on test automation—or is already in full flight—you’ll take back ideas to improve the chances of achieving success in your test automation efforts.

More Information
Learn more about Michael Stahl.
W4 Baking In Quality: The Evolving Role of the Agile Tester
Dena Laterza, Agile Velocity
Wednesday, May 1, 2013 - 11:30am - 12:30pm

While more and more organizations are practicing agile development methodologies, many have not learned how to “bake in quality” throughout the process. As an agile tester, you are an integral part of the development team—working on requirements, design, implementation, writing automated tests, and testing. However, are all team members working together as they should to ensure quality from day one through final delivery? Dena Laterza offers proven tips to help you and your team make the cultural shift to adopt and foster a “quality first” team standard. Gain an understanding of a tester's involvement in test-driven development and behavior-driven development. Take back new ideas on automating tests, working with stakeholders, and becoming a fully informed tester. Learn how to push testing back into development and maximize the value of testers on the team. Take back a plan to get your agile team working together—as a team.

More Information
Learn more about Dena Laterza.
W5 Building an Enterprise Performance and Load Testing Infrastructure
Dave Ogletree, Bridgepoint Education
Wednesday, May 1, 2013 - 11:30am - 12:30pm

Are you frustrated by how long it takes your IT department to provision development and test environments? Have you performed load testing on inadequate hardware only to find performance problems emerge in production? Dave Ogletree leveraged virtualization to solve these problems. He and his team at Bridgepoint Education created and implemented integrated virtualized systems for both developers and testers. Dave describes how his organization built a catalog-based interface for provisioning, created virtual application templates, and established service level agreements to deliver complex and integrated systems. These virtualized systems are used to build testing and performance labs. Dave’s organization has used these labs to build and test software supporting more than 85,000 students as well as a national marketing campaign for the 2012 Summer Olympics. Take away best practices for using virtualization technology to perform large-scale load testing, and learn from Dave’s experience at an enterprise level to improve your own testing capabilities.

More Information
Learn more about Dave Ogletree.
W6 Yin and Yang: Metrics within Agile and Traditional Lifecycles
Shaun Bradshaw, Zenergy Technologies, Inc.
Bob Galen, RGalen Consulting
Wednesday, May 1, 2013 - 11:30am - 12:30pm

Metrics are powerful tools when used to effect positive change in a project or organization. However, the value and benefits of metrics are often dependent on the context. While certain metrics provide information and insight to drive decision making for a traditional development approach, they may not be useful in an agile landscape—and vice versa. QA and agile experts Shaun Bradshaw and Bob Galen delve into the value, pitfalls, pros, and cons of various metrics in agile and waterfall development environments. Hear their experiences as they discuss and explore a variety of project-level, software development, and software testing metrics through the lens of both traditional and agile development contexts. Although Bob and Shaun respect each other’s knowledge and skill, they don’t often agree on metrics. And in this showdown, you’ll see why! Be prepared to learn, be entertained, and be ready to get in on the action as these two metrics titans go head-to-head.

More Information
Learn more about Shaun Bradshaw.
W7 Taming the Beast: Test/QA on Large-scale Projects
Shaun Bradshaw, Zenergy Technologies, Inc.
Wednesday, May 1, 2013 - 1:45pm - 2:45pm

Large, complex projects—those with more than 100 people and lasting more than a year—require special considerations for developing, communicating, and managing the overall QA strategy and test plans. Shaun Bradshaw provides insights he gained from a $70 million financial software implementation project comprising multiple components including a general ledger, business intelligence platform, data warehouse, and data integration hub. Tasked with managing the entire test effort as part of the third-party validation team, Shaun acted as QA architect to create the test strategy and plan for the project. He shares the challenges he and his team had to overcome to help deliver a smooth implementation and installation. Shaun discusses his experiences aligning the QA strategy with the culture of the organization and ensuring key test and QA roles were filled with the right people. Take back new ideas and approaches you can use to tame the testing beast in your large project.

More Information
Learn more about Shaun Bradshaw.
W8 Think Different: Visualization Tools for Testers
Pascal Dufour, codecentric
Wednesday, May 1, 2013 - 1:45pm - 2:45pm

Traditional processes have required testers to create a large amount of documentation in the form of test plans, test cases, and test reports. It’s time to think differently. Creating test artifacts in the “old school” textual style takes too much time away from actual testing. Besides, text is boring and uses only the left side of your brain. Visual images—charts, graphs, and diagrams—engage your right brain for more thinking power. The old saying “A picture is worth a thousand words” is really true! Pascal Dufour shows how you can employ visualizations—mind maps, drawings, dashboards, charts, and other graphics—to improve clarity and guide your team to create lightweight testware artifacts. Find out how visualization helps you more easily and more quickly understand information—enabling and improving team decision making, collaboration, and agility. Join Pascal to see how visual tools, often very basic and simple, can help you think different—and perform better. 

More Information
Learn more about Pascal Dufour.
W9 Four Crucial Tips for Automated Web 2.0 Testing
Jim Holmes, Telerik
Wednesday, May 1, 2013 - 1:45pm - 2:45pm

The vast majority of problems found in web-based functional tests can be traced to a few common issues—dealing with dynamic page content, understanding the differences between explicit and implicit waits, choosing a proper element locator strategy, and understanding how to deal with setup or prerequisite data. Jim Holmes describes the basics of dynamic web page content (AJAX calls and the infamous spinning wheels and buttons) and how to create automated tests that properly deal with the main variants of dynamic content. Learn the importance of choosing proper element locators for your tests and the impacts of the various options. Discover effective approaches for building and using setup data for your tests—saving time and effort. Jim uses Selenium for examples and demos in C# but discusses how these solutions apply to other technologies as well. Take away proven methods for ensuring your functional web tests are more robust, accurate, and maintainable.
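
Jim's demos use Selenium with C#. As a rough illustration of the explicit-wait and locator ideas described above, the following minimal sketch uses Selenium's Java WebDriver bindings; the page URL and element id are hypothetical, and the code is our illustration rather than material from the session.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class DynamicContentExample {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("https://example.com/orders");  // hypothetical page with AJAX-loaded content

            // Explicit wait: block only until this specific element appears,
            // rather than relying on a global implicit wait or a fixed sleep.
            WebDriverWait wait = new WebDriverWait(driver, 10);
            WebElement total = wait.until(
                    ExpectedConditions.visibilityOfElementLocated(By.id("order-total")));

            // Locator strategy: prefer stable ids or data attributes over
            // brittle, layout-dependent XPath such as /html/body/div[3]/span[2].
            System.out.println("Order total: " + total.getText());
        } finally {
            driver.quit();
        }
    }
}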

More Information
Learn more about Jim Holmes.
W10 Exploratory Testing on Agile Projects: Combining SBTM and TBTM
Christin Wiedemann, Professional Quality Assurance, Ltd.
Wednesday, May 1, 2013 - 1:45pm - 2:45pm

Exploratory testing provides both flexibility and speed—characteristics that are vitally important with the quick pace of short agile iterations. With session-based test management (SBTM), exploratory testing is structured and documented in pre-defined sessions. A newer approach, thread-based test management (TBTM), organizes test efforts by threads of activities rather than sessions. So, how do you retain the traceability of SBTM without losing the creativity offered by TBTM? The answer is xBTM—a combination of SBTM and TBTM. After introducing SBTM and TBTM, Christin shows how she uses xBTM on projects to obtain maximum efficiency—only creating test documentation that actually adds value. Using a mock example, Christin describes the xBTM workflow on an agile project, covering all the steps from test planning and performing the tests through reporting. Her focus is on sharing practical examples and providing a range of flexible tools that you can immediately apply on almost any project.

More Information
Learn more about Christin Wiedemann.
W11 Cutting-edge Performance Testing on eCommerce Websites
Ron Woody, GSI Commerce
Wednesday, May 1, 2013 - 1:45pm - 2:45pm

Having problems with your website’s performance? Does it take too much time and effort to determine the cause of a particular page’s poor performance? Would you like to find the root cause of client-side issues in an automated way? If you answered yes to any of these questions, then this session is for you. At GSI Commerce, an eBay company, Ron Woody manages a large team of performance engineers working on nearly 100 eCommerce websites. Ron and his team have developed cutting-edge approaches for automating client- and server-side performance testing. Learn the specific approaches Ron’s team uses today for pre-release performance tests, production performance management, and website optimization. Find out the ways they’ve automated cross-browser performance testing—and analysis—to increase productivity and efficiency. Covering these and additional topics, Ron shares a toolkit of performance testing ideas and approaches your team can use to ensure optimal application performance and a better user experience.

More Information
Learn more about Ron Woody.
W12 Presenting Test Results with Clarity and Confidence
Griffin Jones, Congruent Compliance
Wednesday, May 1, 2013 - 1:45pm - 2:45pm

Test leaders are often asked to present the results of their testing to management—and even to auditors. Can you clearly and confidently explain and summarize your test plans and results? Can you prove that your testing is compliant with internal procedures and regulations? Griffin Jones presents a model for how to prepare and present your test work and demonstrate compliance. He explores how you can appear—and be—congruent, honest, and competent during formal and informal presentations. Griffin describes a process model for developing your presentation, laying out the objective evidence of what you did, ways you can demonstrate control, and how to show your willingness to perform. Then, he reviews the expectations of different stakeholders and identifies common misconceptions and traps to avoid. Don’t allow your anxieties to ruin your next presentation. Learn tips to help you maintain composure during difficult moments and leave with a model to present your testing results with clarity and confidence.

More Information
Learn more about Griffin Jones.
W13 Increase Your Team’s Efficiency with Kanban
Derk-Jan de Grood
Wednesday, May 1, 2013 - 3:00pm - 4:00pm

Test teams must perform a wide variety of tasks from testing new functions and performing regression tests to helping with bug fixes, producing test reports, and working on test improvements. With all these activities, it is a challenge to keep priorities straight, operate most efficiently, and clearly show stakeholders all that the team is working on. Derk-Jan de Grood shares his experiences with Kanban, a proven method for managing workflow, as a visual tool to help teams allocate resources, reduce waste, and make progress visible to all stakeholders. Learn how you can assign testers to tasks that match their expertise and how improving “flow” will reduce time-to-market and testing costs. With the Kanban process helping you manage test activities, your company’s business managers will finally understand what is happening in testing every day, and you’ll know that everyone is working on the right task to maximize value to the project.

More Information
Learn more about Derk-Jan de Grood.
W14 Deadlines Approaching? Budgets Cut? How to Keep Your Sanity
Geoff Horne, NZTester Magazine
Wednesday, May 1, 2013 - 3:00pm - 4:00pm

Testing projects have a habit of dissolving into chaos—and even strife—as deadlines approach and budgets are cut. When testers are asked to do the impossible, risk management and mitigation tools can be the only way to survive. Geoff Horne presents a proven method he uses for identifying and assessing risks and the effects—both positive and negative—of various mitigation approaches. Through the school of hard knocks, Geoff has learned that the most plausible risk mitigation strategy is not always the best and may actually harm the project. Successfully used on different projects across different types of businesses, Geoff’s approach is based on evaluating risks and assessing the impacts across key criteria: resources, productivity, cost, quality, and confidence. Geoff presents risk assessments in a color-coded graphical format that enables an easy, straightforward comparison and prioritization of the mitigation strategies under consideration. Learn to maintain your sanity when you are next asked to do the impossible—one more time.

More Information
Learn more about Geoff Horne.
W15 Test Automation for Packaged Systems: Yes, You Can!
Chris Bushell, ThoughtWorks
Wednesday, May 1, 2013 - 3:00pm - 4:00pm

Today, most businesses are heavily dependent on packaged systems, sometimes called commercial off-the-shelf software, for large parts of their operation. Highly customizable packages such as BMC’s Remedy, Oracle's Maxim, and many others run the show at many of the world’s largest companies. While offering many features and feature options, these packages provide rich software development environments in which “configuration” is actually a highly complex programming exercise. Chris Bushell explores why packaged systems, which are just as vulnerable to defects as custom-developed software, have been missing out on the many benefits of early automated testing. Chris argues that it's time for a change. Drawing from hands-on experience working with customized packaged systems, Chris explains that these packages favor ease of customization over testability and offers information on overcoming their limitations. Take away tips to reduce defects, lower testing costs, and improve time to market with your company’s packaged systems.

More Information
Learn more about Chris Bushell.
W16 Automation Culture: Essential to Agile Success
Geoff Meyer, Dell, Inc.
Wednesday, May 1, 2013 - 3:00pm - 4:00pm

For organizations developing large-scale applications, transitioning to agile is challenging enough. If your organization has not yet adopted an automation culture, brace yourself for a big surprise because automation is essential to agile success. From the safety nets provided by automated unit and acceptance tests to the automation of build, build verification, and deployment processes, the iterative nature of agile demands a culture of automation across your engineering organization. Geoff Meyer shares lessons learned in adopting a test automation culture as the Dell Enterprise Systems Group simultaneously adopted Scrum and agile processes across its entire software product portfolio. Learn to address the practical challenges of establishing an automation culture at the outset by ensuring that your organizational makeover incorporates changes to your hiring, staffing, and training practices. Find out how you can apply automation beyond the Scrum team in areas including continuous integration, scale and stress testing, and performance testing. 

More Information
Learn more about Geoff Meyer.
W17 Performance Testing Web 2.0 Applications—in an Agile World
Mohit Verma, Tufts Health Plan
Wednesday, May 1, 2013 - 3:00pm - 4:00pm

Agile methodologies bring new complexities and challenges to traditional performance engineering practices, especially with Web 2.0 technologies that implement more and more functionality on the client side. Mohit Verma presents a Scrum-based performance testing lifecycle for Web 2.0 applications. Mohit explains when performance engineers need to participate in the project, discusses how important it is for the performance engineer to understand the technical architecture, and explores the importance of testing early to identify design issues. Find out how to create the non-functional requirements that are critical for building accurate and robust performance test scenarios. Learn how to implement practices for continuous collaboration between test engineers and developers to help identify performance bottlenecks early. Learn about the tools available today to help you address the testing and tuning of your Web 2.0 applications. Leave with a new appreciation and new approaches to ensure that your Web 2.0-based applications are ready for prime time from day one.

More Information
Learn more about Mohit Verma.
W18 Reports of the Death of Testing Have Been Greatly Exaggerated
Ruud Teunissen, Polteq Test Services BV
Wednesday, May 1, 2013 - 3:00pm - 4:00pm

Have you heard? It’s all over social media. We are the “last generation of testers.” Testing is dead. No more classical testing—too much inflexible process. Context driven? That is a code phrase for “do whatever.” Agility? Developers do testing, and testers become developers. DevOps? Development and operations join forces—and test is not in the picture. And, companies don’t test anymore—they outsource. Ruud Teunissen believes we must save the indispensable craft of testing. Non-functional tests require special skills; new paradigms like cloud and mobile must be explored and tested; enterprise-to-enterprise integration tests become more vital as systems grow larger and more complex. And who’s going to do that testing? Testing skills are needed to work effectively and efficiently in these new contexts. Learn to save the testing skills within your organization and do what you’ve always done best—save the day by preventing defects from going live.

More Information
Learn more about Ruud Teunissen.
T1 Building Successful Test Teams
Lloyd Roden, Lloyd Roden Consultancy
Thursday, May 2, 2013 - 9:45am - 10:45am

“People are the most important asset of any organization.” Even though we hear that a lot, leaders and managers actually spend very little time focusing on the people side of testing. The skills and makeup of a test team are important and must be managed and cultivated properly. Individuals are very different and will react differently to various situations. Lloyd Roden describes the “tester’s style analysis questionnaire” and four types of testers—the pragmatist, the facilitator, the analyst, and the pioneer. When we recognize and acknowledge individual differences, we can use the individual’s strengths rather than dwell on the weaknesses. Lloyd examines how conflicts arise and how this style analysis questionnaire can help defuse conflicts to bring out the best in teams. Recruiting can be difficult, too. How do we recognize good testers during interviews? Once again, the style analysis can help. Lloyd provides Seven Top Tips for motivating your team to become more productive. Join Lloyd and take back ideas to help you assemble your most effective team.

More Information
Learn more about Lloyd Roden.
T2 There’s No Room for Emotions in Testing—Not!
Michael Bolton, DevelopSense
Thursday, May 2, 2013 - 9:45am - 10:45am

Software testing is a highly technical, logical, rational task. There's no place for squishy emotional stuff here—not among professional testers. Or is there? Because of commitment, risk, schedule, and money, emotions often do run high in software development and testing. Our ideas about quality and bugs are rooted in our desires, which in turn are rooted in our feelings. People don't decide things based on the numbers; they decide based on how they feel about the numbers. It is easy to become frustrated, confused, or bored; angry, impatient, or overwhelmed. However, if we choose to be aware of our emotions and are open to them, feelings can be a powerful source of information for testers, alerting us to problems in the product and in our approaches to our work. You may laugh, you may cry...and you may be surprised as Michael Bolton discusses the important role that emotions play in excellent testing.

More Information
Learn more about Michael Bolton.
T3 Unleash Service Virtualization: Reduce Testing Delays
Allan Wagner, IBM Software—Rational
Thursday, May 2, 2013 - 9:45am - 10:45am

The ability to rapidly release new product features is vital to the success of today’s businesses. To accelerate development, teams are adopting agile practices and leveraging service-oriented architectures to integrate legacy applications with other systems. At the same time, testing these composite applications can take longer and cost more. Al Wagner explains the whys and hows of service virtualization and explores ways testers can employ this technology to simulate parts of complex systems and begin testing earlier. Join Al to gain insights on which services to virtualize in order to maximize your ROI. Discover how testing at the API layer can isolate defects for faster remediation and avoid late stage integration issues. Stop waiting until the complete application is available in a test environment to begin your work. Leave with an understanding of how virtual components can make incomplete or unavailable application functionality available for testing.

More Information
Learn more about Allan Wagner.
T4 Android Mobile Testing: Right before Your Eyes
Jeff "Cheezy" Morgan
David Shah
Thursday, May 2, 2013 - 9:45am - 10:45am

Few topics are hotter than mobile software development. Every company seems to be rushing to release its own mobile applications. When it comes time to build that software, they quickly learn that developing and testing for mobile is hard. Many developers claim that it is difficult or impossible to test drive the application while it’s in development. Because traditional testing tools can’t automate the tests in the emulator or on the device, testers are usually left with a manual testing approach. Join Cheezy Morgan and David Shah as they reveal the secret of quickly delivering a fully-tested, high-quality Android application. Following an acceptance test-driven development (ATDD) approach right before your eyes, Cheezy will write automated tests prior to development. While David is test driving the Android code to make Cheezy’s tests pass, Cheezy will perform exploratory testing on David’s unfinished work. This fast-paced, hands-on session demonstrates the use of close collaboration and test automation to deliver high-quality mobile applications.

More Information
Learn more about Jeff "Cheezy" Morgan.
T5 Load and Performance Testing in the Cloud: Myth vs. Reality
Steve Weisfeldt
Thursday, May 2, 2013 - 9:45am - 10:45am

Is the cloud just another overhyped IT buzzword or a transformational technology wave? Steve Weisfeldt helps you get past all the noise and identify how you can leverage the cloud’s flexibility and scalability to save time and money on load and performance testing. Steve describes ways to generate user loads that are more geographically accurate and easily scaled to large user loads. He explores the myth that “generating load from the cloud” is the only valid testing approach and discusses when it is—and is not—important to test from the cloud for web and mobile apps. Learn specific tactics including load generator provisioning approaches for generating load from the cloud and identify the advantages of a “hybrid” approach for combining testing from within the LAN and the cloud. Take back a new approach to performance testing that enhances your “inside the firewall” load generation techniques.

More Information
Learn more about Steve Weisfeldt.
T6 Testing—After You’ve Finished Testing
Jon Bach, eBay, Inc.
Thursday, May 2, 2013 - 9:45am - 10:45am

Stakeholders always want to release when they think we’ve finished testing. They believe we have discovered “all of the important problems” and “verified all of the fixes”—and now it’s time to reap the rewards. However, as testers we still can assist in improving software by learning about problems after code has rolled live—especially if it’s a website. Jon Bach explores why and how at eBay they have a post-ship site quality mindset in which testers continue to learn from live A/B testing, operational issues, customer sentiment analysis, discussion forums, and customer call patterns—just to name a few. Jon explains what eBay’s Live Site Quality team learns every day about what they just released to production. Take away new ideas on what you can do to test and improve value—even after you’ve shipped.

More Information
Learn more about Jon Bach.
T7 Crowdsourcing: An Innovative Approach to Testing
Ralph Decker, Alliance Global Services
Thursday, May 2, 2013 - 11:15am - 12:15pm

In a perfect world, you would hire and develop a large number of the most qualified testers to work on your projects. However, when that’s impossible, crowdsourcing may be the answer. Crowdsourcing provides a mechanism for finding and using large numbers of qualified individuals to work on the task at hand. Already used across various disciplines—design, development, testing, and R&D—crowdsourcing combines cloud economics with the effectiveness and efficiency of the crowd. More important than the touted economics of crowdsourcing is its ability to marshal an abundance of talent and resources for the world of testing, providing talented people and additional resources for completing complex testing tasks quickly. Learn how Ralph Decker took an impossible performance testing challenge to the crowd and created a test solution—without licensing, infrastructure, bandwidth, or personnel costs—that provided a performance test bed of hundreds of thousands of concurrent users.

More Information
Learn more about Ralph Decker.
T8 Better Unit Tests with ApprovalTests: An Open Source Library
Woody Zuill, Hunter Industries
Thursday, May 2, 2013 - 11:15am - 12:15pm

When a unit test fails, we want clear, expressive, rich feedback so we can quickly understand the nature of the failure and get a good idea of how to fix it. Unit testing frameworks are fantastic at running tests and alerting us to any failure. Unfortunately, sometimes (or is that often?) the details of the failure are difficult to evaluate. Isn’t there some way to make the specifics jump off the screen so we don’t have to dig through all the details? The ApprovalTests library does just that. Woody Zuill demonstrates how to include ApprovalTests in your current unit test framework to easily get these benefits. It works for everything from simple strings to arrays, GUIs, and complex objects. Plus, it’s free and available for C#, Java, PHP, and Ruby. To go beyond the limits of traditional assertion tests, learn how to easily enhance and build on your current unit testing skills by adding ApprovalTests.
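
As a rough idea of what this looks like in practice, here is a minimal JUnit 4 sketch using the Java version of the library; the receipt text is a made-up example, not something from Woody's session.

import org.approvaltests.Approvals;
import org.junit.Test;

public class ReceiptFormatterTest {

    // Instead of asserting dozens of fields one by one, write the whole
    // result out and compare it against an approved text file kept in
    // source control. On a mismatch, ApprovalTests saves a "received" file
    // and can open a diff tool, so the failure is easy to read and, when
    // the change is intended, easy to approve.
    @Test
    public void formatsAReceipt() {
        String receipt = String.join("\n",
                "Widget        3 x  4.99",
                "Subtotal          14.97",
                "Tax (8%)           1.20",
                "Total             16.17");
        Approvals.verify(receipt);
    }
}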

More Information
Learn more about Woody Zuill.
T9 Flintstones or Jetsons? Jump Start Your Virtual Test Lab
David Silk, Verisign, Inc.
Thursday, May 2, 2013 - 11:15am - 12:15pm

The power of virtualization has made it easy and inexpensive to create multiple environments for testing. How you implement your virtualization strategy can boost not only the savings on physical gear and availability of test environments but also your testing productivity. Sharing his experience working through the evolution of Verisign’s virtual test lab, David Silk examines how a well-implemented virtual lab can push your testing productivity to new levels. Learn about the key practices to get a virtual test lab working like an advanced Jetsons-style machine while avoiding the Flintstones’ dinosaur approach. See how Verisign’s approach focuses on the whole environment—not just one virtual machine at a time. Learn where to start and how to build a virtual test lab that leverages the technology, ensures repeatability, and saves test engineers time and effort. Don’t be a Flintstone!

More Information
Learn more about David Silk.
T10 Mobile Testing Methodologies: Trends, Successes, and Pitfalls
Eran Kinsbruner, Perfecto Mobile
Thursday, May 2, 2013 - 11:15am - 12:15pm

In today's dynamic mobile marketplace—where new handsets and mobile operating systems are released every day—your ability to deal with the changes that impact your mobile product is vital. The mobile application lifecycle today must be short; deliver great quality; cover a myriad of handsets with different sizes, layouts, and enhanced capabilities; and, of course, support as many operating systems as possible. This lifecycle requires a new methodology and approach. Eran Kinsbruner describes the mobile project challenges and provides real-life examples of ways to overcome them. Take back the main mobile market trends and forecasts together with the key automation tools available for your use today. Learn the differences between the various mobile cloud and automation tools to help you select the right tool for your project. See how you can ramp up a successful mobile project, avoid the common pitfalls, and shorten the time to market—all while delivering a top-notch quality product.

More Information
Learn more about Eran Kinsbruner.
T11 A Year of Testing in the Cloud: Lessons Learned
Jim Trentadue
Thursday, May 2, 2013 - 11:15am - 12:15pm

Jim Trentadue describes how his organization first used the cloud for its non-production needs including development, testing, training, and production support. Jim begins by describing the components of a cloud environment and how it differs from a traditional physical server structure. To prove the cloud concept, he used a risk-based model for determining which servers would be migrated. The result was a win for the organization from a time-to-market and cost savings perspective. Jim shares his do’s and don’ts for moving to the cloud. Do’s include identifying all costs associated with the new cloud infrastructure, implementing a risk-based approach to cloud migration, defining a governance model, and defining service level agreements with your cloud vendor. Jim warns against creating an open-ended environment without a charge-back model to allocate costs and against failing to continuously monitor the overall environment.

More Information
Learn more about Jim Trentadue.
T12 Testing with an Accent: Internationalization Testing
Paul Carvalho
Thursday, May 2, 2013 - 11:15am - 12:15pm

Finding time to test the basic functionality, performance, and security of a system is difficult enough, so how do you find time to add internationalization (i18n) and localization (l10n) testing? Today’s world is very small, and you may already have international users in your target market. Can you really afford to ignore those who can’t enter their name correctly with the default US-ASCII character set? Will it still be a quality product to them? Paul Carvalho shares how you can—with a little creative thinking and design—incorporate i18n and l10n testing into your regular routine. Great testing requires the right mindset, applied insight, preparation, and dedication. Learn how to identify the system elements that pose juicy risks; go beyond looking at the UI, using simple tools and tricks you can try right away; and discuss ways to integrate i18n into your functional testing in a fun way with little overhead. Impress your co-workers and delight your customers!
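
To make the idea concrete, here is a minimal JUnit 4 sketch of the kind of character-set check the session alludes to; the name and the storage path are hypothetical, and the example is ours rather than Paul's.

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNotEquals;

import java.nio.charset.StandardCharsets;
import org.junit.Test;

public class InternationalNameTest {

    private static final String NAME = "Zoë Müller-Łukasiewicz";

    // Forcing the name through US-ASCII silently replaces the accented
    // characters, exactly the kind of data loss international users hit.
    // A real test would drive the application's own input, storage, and
    // display path; this only illustrates the check.
    @Test
    public void asciiStorageCorruptsNonAsciiNames() {
        String stored = new String(NAME.getBytes(StandardCharsets.US_ASCII),
                StandardCharsets.US_ASCII);
        assertNotEquals(NAME, stored);
    }

    @Test
    public void utf8StoragePreservesNonAsciiNames() {
        String stored = new String(NAME.getBytes(StandardCharsets.UTF_8),
                StandardCharsets.UTF_8);
        assertEquals(NAME, stored);
    }
}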

More Information
Learn more about Paul Carvalho.
T13 Strength in Numbers: Using Web Analytics to Drive Test Requirements
Lindiwe Vinson, Organic, Inc.
Thursday, May 2, 2013 - 1:30pm - 2:30pm

Once a client’s website is built, you’d think it would be time for a well-deserved break. However, almost immediately, questions come up—can we capture a larger audience? close more orders? increase our sales? And so, it’s time to redesign the site—and the test strategy and plans—based on real-world data. Lindiwe Vinson sees web analytics as a tool for guiding your test planning and test case design efforts. By learning the most common user flows through the site—as well as the most commonly used browsers, operating systems, mobile devices, and plug-ins—testers can design better tests and set the right test priorities. Learn the vocabulary of web analytics—like bounce rate, time-on-site, conversion, and more—and the web analytics tools that will transform your testing. Learn how web analytics plays a major role in the creation of Organic Inc.’s test plans for all client engagements and the tools they use every day to make their clients’ websites shine.

More Information
Learn more about Lindiwe Vinson.
T14 White-box Testing: When Quality Really Matters
Jamie Mitchell, Jamie Mitchell Consulting, Inc.
Thursday, May 2, 2013 - 1:30pm - 2:30pm

Jamie Mitchell explores perhaps the most underused test technique in our arsenal—white-box testing. Also known as structural testing, white-box testing requires some programming expertise, access to the code, and an analysis tool. If you only employ black-box testing, you could easily ship a system having tested only 50 percent or less of the code base. Not good! Although you might believe that the developers have performed sufficient unit and integration testing, how do you know that they have achieved the level of coverage your project requires? Jamie describes the levels of code coverage that the business and your customers may need—from statement coverage to modified condition/decision coverage. Leading you through examples of pseudocode, Jamie explains when you should strive to achieve different code coverage target levels. Even if you have no personal programming experience, understanding structural testing will make you a better tester. So, join Jamie for some code-diving!
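
To make the coverage levels concrete, here is a small illustrative Java example of our own (not from Jamie's material). A single call with price = 150 and isMember = true executes every statement, yet exercises only half of the decision outcomes, and MC/DC would demand still more tests for the compound condition.

public class DiscountCalculator {

    // One test call, discountedPrice(150.0, true), reaches 100 percent
    // statement coverage of this method. But the decision
    // "price > 100 && isMember" is only ever evaluated to true, so
    // decision/branch coverage is 50 percent, and modified
    // condition/decision coverage (MC/DC) additionally requires tests
    // showing that each condition independently flips the outcome,
    // e.g., (150.0, false) and (50.0, true).
    public static double discountedPrice(double price, boolean isMember) {
        double discount = 0.0;
        if (price > 100 && isMember) {
            discount = 0.10;
        }
        return price * (1 - discount);
    }

    public static void main(String[] args) {
        System.out.println(discountedPrice(150.0, true));   // discount applied
        System.out.println(discountedPrice(150.0, false));  // no discount: not a member
        System.out.println(discountedPrice(50.0, true));    // no discount: price too low
    }
}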

More Information
Learn more about Jamie Mitchell.
T15 Innovations in Test Automation: It’s Not All about Regression
John Fodeh, Cognizant
Thursday, May 2, 2013 - 1:30pm - 2:30pm

Although classic test automation, which usually focuses on regression testing, has its place in testing, there is much more you can do to improve testing productivity and its value to the project and your organization. Through experience-based examples, video clips, and demonstrations, John Fodeh shares one company’s innovation journey to improve its test automation practice. John illustrates how they learned to apply automated “test monkeys” that explore the software in new ways each time a test is executed. Then, he describes how the test team uses weighted probability tables to increase each test’s “intelligence” factor. Find out how they implemented model-based testing to improve automation effectiveness and how this practice led to the even more valuable behavior-driven testing approach they employ today. With these and other alternative approaches you, too, can get more mileage from your automation efforts. Join John to get inspired and start your own journey of innovation with new ideas that enhance your test automation strategy.
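
The weighted-table idea is simple to sketch. The following Java fragment is our own illustration, not John's implementation: each step draws the next action from a weighted probability table, so every run wanders a different path while the weights steer it toward the behaviors you want exercised most often. The actions are placeholders for real UI or API calls.

import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class WeightedTestMonkey {

    private final List<Runnable> actions = new ArrayList<>();
    private final List<Integer> weights = new ArrayList<>();
    private final Random random;

    public WeightedTestMonkey(long seed) {
        // A fixed seed makes a failing run reproducible.
        this.random = new Random(seed);
    }

    public void addAction(int weight, Runnable action) {
        weights.add(weight);
        actions.add(action);
    }

    public void run(int steps) {
        int total = weights.stream().mapToInt(Integer::intValue).sum();
        for (int i = 0; i < steps; i++) {
            // Pick an action with probability proportional to its weight.
            int pick = random.nextInt(total);
            for (int j = 0; j < actions.size(); j++) {
                pick -= weights.get(j);
                if (pick < 0) {
                    actions.get(j).run();
                    break;
                }
            }
        }
    }

    public static void main(String[] args) {
        WeightedTestMonkey monkey = new WeightedTestMonkey(42L);
        monkey.addAction(5, () -> System.out.println("open a record"));   // frequent action
        monkey.addAction(3, () -> System.out.println("edit a field"));    // less frequent
        monkey.addAction(1, () -> System.out.println("delete a record")); // rare, high impact
        monkey.run(20);
    }
}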

More Information
Learn more about John Fodeh.
T16 Introducing Mobile Testing to Your Organization
Eric Montgomery, Progressive Insurance
Thursday, May 2, 2013 - 1:30pm - 2:30pm

Mobile is an integral part of our daily lives, and if it’s not already part of your business model, it soon will be. When that happens, will you be ready to tackle the demands of testing web and native mobile apps? From the perspective of a test lead, Eric Montgomery describes the challenges Progressive Insurance, a company with a strong web presence, recently faced—learning new technologies, transforming the approach of testers from PC-based to mobile-based, and working with testing tools in a market that has yet to see a definitive leader emerge. Learn from Eric's experiences and return to your job with ideas on training web testers to be mobile testers. Take back proven techniques for testing mobile devices, ways of choosing devices for test, methods of sharing information, developing a sense of community among testers, choosing tools from the available market, and keeping up with rapid technology changes.

More Information
Learn more about Eric Montgomery.
T17 Better Security Testing: Using the Cloud and Continuous Delivery
Gene Gotimer, Coveros, Inc.
Thursday, May 2, 2013 - 1:30pm - 2:30pm

Even though many organizations claim that security is a priority, that claim doesn’t always translate into supporting security initiatives in software development or test. Security code reviews often are overlooked or avoided, and when development schedules fall behind, security testing may be dropped to help the team “catch up.” Everyone wants more secure development; they just don’t want to spend time or money to get it. Gene Gotimer describes his experiences with implementing a continuous delivery process in the cloud and how he integrated security testing into that process. Gene discusses how to take advantage of the automated provisioning and automated deploys already being implemented to give more opportunities along the way for security testing without schedule disruption. Learn how you can incrementally mature a practice to build security into the process—without a large-scale, time-consuming, or costly effort.

More Information
Learn more about Gene Gotimer.
T18 Things Could Get Worse: Ideas About Regression Testing
Michael Bolton, DevelopSense
Thursday, May 2, 2013 - 1:30pm - 2:30pm

More Information Coming Soon!

More Information
Learn more about Michael Bolton.
T19 Maybe We Don’t Have to Test It
Eric Jacobson, Turner Broadcasting
Thursday, May 2, 2013 - 3:00pm - 4:00pm

Testers are taught they are responsible for all testing. Some even say “It’s not tested until I run the product myself.” Eric Jacobson believes this old school way of thinking can hurt a tester’s reputation and—even worse—may threaten the team’s success. Learning to recognize opportunities where you may not have to test can eliminate bottlenecks and make you everyone’s favorite tester. Eric shares eight patterns from his personal experiences where not testing was the best approach. Examples include patches for critical production problems that can’t get worse, features that are too technical for the tester, cosmetic bug fixes with substantial test setup, and more. Challenge your natural testing assumptions. Become more comfortable with approaches that don’t require testing. Eliminate waste in your testing process by asking, “Does this need to be tested? By me?” Take back ideas to manage not testing including using lightweight documentation for justification. You may find that not testing may actually be a means to better testing.

More Information
Learn more about Eric Jacobson.
T20 Data Masking: Testing with Near-real Data
Martin Kralj, Ekobit
Thursday, May 2, 2013 - 3:00pm - 4:00pm

Organizations worldwide collect data about customers, users, products, and services. Striving to get the most out of collected data, they use it to fuel many day-to-day processes including software testing, development, and personnel training. The majority of this collected data is sensitive and falls under specific government regulations or industry standards that define policies for privacy and generally limit or prohibit using the data for these secondary purposes. Data masking solves this problem. It replaces sensitive information with data that looks real and is structurally similar to the actual information but is useless to anyone trying to obtain the real data. Learn about the data masking process; the pros and cons of static and dynamic masking architectures; and the basic techniques—subsetting, randomization, generalization, shuffling, and others—used to set up data masking. Discover how to start data masking and learn about common challenges on data masking projects.
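
As a rough illustration of two of the basic techniques named above, here is a minimal Java sketch of our own (not from Martin's session); the field names and values are hypothetical. Shuffling keeps realistic surnames in the data set but detaches them from their owners, while randomization preserves an account number's format but not its value.

import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Random;

public class DataMaskingSketch {

    private static final Random RANDOM = new Random();

    // Shuffling: real surnames remain in the column, but no longer line up
    // with the rest of each customer's record.
    static List<String> shuffleColumn(List<String> surnames) {
        Collections.shuffle(surnames, RANDOM);
        return surnames;
    }

    // Randomization: keep the structure of an account number (length and
    // digits-only positions) while replacing the actual value.
    static String randomizeDigits(String accountNumber) {
        StringBuilder masked = new StringBuilder();
        for (char c : accountNumber.toCharArray()) {
            masked.append(Character.isDigit(c) ? (char) ('0' + RANDOM.nextInt(10)) : c);
        }
        return masked.toString();
    }

    public static void main(String[] args) {
        System.out.println(shuffleColumn(Arrays.asList("Smith", "Jones", "Garcia")));
        System.out.println(randomizeDigits("1234-5678-9012"));
    }
}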

More Information
Learn more about Martin Kralj.
T21 Setting Automation Expectations: Lessons from Failure and Success
Laura Salazar, Softtek
Thursday, May 2, 2013 - 3:00pm - 4:00pm

Test automation is undeniably a key strategy for any test manager—and for good reason. Test automation promises faster regression testing, higher productivity, better quality, and cost reduction. However, many organizations fail to achieve these hoped-for benefits, instead facing late deliveries, misuse of expensive tools, a frustrated testing team, and lack of confidence from their managers. Today, automation is required for any agile, cloud, or mobile testing project, but simply having the right skill sets and tools may not be enough to make your project successful. Laura Salazar reviews how unrealistic expectations can lead to disappointing results: setting non-achievable goals like full test coverage, battling against unhelpful tools, and wrongfully thinking that no processes or standards are required. Laura presents real-life examples on how to set the right expectations as you identify your testing needs and automation strategies. Learn how to define your automation strategy on both waterfall and agile projects—so that you meet your expectations.

More Information
Learn more about Laura Salazar.
T22 Mobile Testing Tools 101
David Dang, Zenergy Technologies, Inc.
Thursday, May 2, 2013 - 3:00pm - 4:00pm

The burgeoning use of mobile devices has created enormous opportunities for organizations to leverage mobile to increase sales, advertise products, and collaborate with internal and external resources. However, as usage grows, so does the need to test on these devices. This is not an easy task considering the number of devices, device operating systems, and operating system versions. To manage the number of variations, organizations rely on mobile testing tools to support their testing efforts. David Dang shares his experiences analyzing numerous mobile testing tool platforms for a prominent shopping network. Learn how identifying the "right" mobile testing tool depends on multiple factors such as supported devices, level of testing, resources, and required integration with other tools. Take back to share with your team a review of common tools on the market and the pros and cons of each.

More Information
Learn more about David Dang.
T23 HTML5 Security Testing at Spotify
Alexander Andelkovic, Spotify
Thursday, May 2, 2013 - 3:00pm - 4:00pm

HTML5 is one of the hottest technologies around right now because HTML5 apps are beautiful, engaging, and can perform important and entertaining functions. With the wide range of devices and platforms to support, the promise of multi-platform support is appealing. But HTML5 apps present their own range of security issues. So, what do you do about security? How do you test HTML5 applications to ensure their security? Alexander Andelkovic works at Spotify, where the streaming music player's desktop client applications are all HTML5-based. Alexander explains how manual testers can get the most out of security testing HTML5 apps, including the app manifest. He covers these common security testing issues and more: cross-site scripting (script inclusion), privacy-related issues, data leakage, and permissions. Discover how, by being proactive, you can avoid having to search for security issues late in a development project.

More Information
Learn more about Alexander Andelkovic.
T24 New Testing Standards Are on the Horizon: What Will Be Their Impact?
Claire Lohr, Lohr Systems
Thursday, May 2, 2013 - 3:00pm - 4:00pm

The history of testing standards has not always been auspicious. Testing standards documents have been expensive to obtain, limited in scope, inflexible in expectations, and inconsistent. However, they contain important lessons learned from experienced practitioners—if a tester is willing to overcome the obstacles to get to the useful information. A set of new international standards is coming. These new standards are tailorable, consistent, and comprehensive in scope. In addition, they will be freely available (some are already). Claire Lohr provides a complete roadmap to all of the available—or soon-to-be-available—testing-related standards. Learn where to go for testing process guidelines, complete definitions of all test design techniques, full examples of test documentation (for both agile and traditional projects), and free international standards documents. Take away a “start-up guide” for how different types of projects can use the new standards along with valuable tips and practical lessons you can get from these standards.

More Information
Learn more about Claire Lohr.