
Concurrent Sessions

Sessions are offered on Wednesday and Thursday at the conference and do not require pre-selection. Build your own custom learning schedule, or follow one of our tracks by topic area.

W1 When Testers Feel Left Out in the Cold
Hans Buwalda, LogiGear
Wednesday, May 6, 2015 - 11:30am - 12:30pm

When you're responsible for testing, it's almost a given that you will find yourself in a situation in which you feel alone and out in the cold. Management's commitment to testing may be lacking, your colleagues on the project may be ignoring you, your team members may lack motivation, or the automated testing you had planned may be more complicated and difficult than you anticipated. You feel you can't test enough, and you will be blamed for post-release quality problems. Hans Buwalda shares a number of chilly situations and offers suggestions for overcoming them, based on his experience with large projects worldwide. Specifically, Hans focuses on management commitment, politics, project dependencies, managing expectations, motivating team members, testing and automation difficulties, and dealing with overwhelming numbers of day-to-day problems. Take away more than forty-five tips and approaches to use when temperatures drop on you.

More Information
Learn more about Hans Buwalda.
W2 Common System and Software Testing Pitfalls
Donald Firesmith, Software Engineering Institute
Wednesday, May 6, 2015 - 11:30am - 12:30pm

In spite of many great testing “how-to” books, people involved with system and software testing—testers, requirements engineers, system/software architects, system and software engineers, technical leaders, managers, and customers—continue to make many different types of testing-related mistakes. Think of these commonly occurring human errors as a system of software testing pitfalls. And when projects fall into these pitfalls, testing is less effective at uncovering defects, people are less productive when testing, and project morale is damaged. Donald Firesmith has collected more than 150 of these testing anti-patterns, organized them into twenty categories, and documented each with name, description, potential applicability, characteristic symptoms, potential negative consequences, potential causes, recommendations for avoidance and mitigation, and related pitfalls. Donald introduces this repository of testing pitfalls, explains its many uses, and provides directions for accessing additional information including his associated “how-not-to test” book and website that documents pitfalls and identifies pitfall categories.

More Information
Learn more about Donald Firesmith.
W3 An Automation Framework for Everyone
Chris Loder, Halogen Software
Wednesday, May 6, 2015 - 11:30am - 12:30pm

Chris Loder shares how his team at Halogen Software has implemented Selenium in a framework that everyone in his company's R&D group can use. With an ever-increasing amount of manual regression testing, the team needed an easy-to-use automation framework. Chris presents an example of how the framework they developed at Halogen Software is used and, while doing so, shows parts of the supporting code that automation developers will find interesting. Written in Java, the framework uses Selenium in some pretty cool ways. Chris starts off with flexible run configurations and how they are built. Then the tests meet the code. Are you a fan of design patterns? They are in the framework and are shown and discussed. Need conditional waits in your automation? See how Chris and his team implement them with great success. Take home some great ideas for your own automation framework.
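Conditional waits like the ones Chris mentions replace brittle fixed sleeps with polling for a readiness condition. As a rough illustration of the idea only (plain Python with a hypothetical `wait_until` helper, not the Java/Selenium code from the talk):

```python
import time

def wait_until(condition, timeout=10.0, poll_interval=0.25):
    """Poll `condition` until it returns a truthy value or `timeout` expires.

    Mirrors the idea behind Selenium's explicit waits: instead of a fixed
    sleep, keep checking a predicate so the test proceeds as soon as the
    application is actually ready.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(poll_interval)
    raise TimeoutError(f"condition not met within {timeout} seconds")

# Example: wait for a flag that another part of the system flips.
state = {"ready": False}
state["ready"] = True  # stand-in for an asynchronous page update
assert wait_until(lambda: state["ready"], timeout=1.0) is True
```

In a real Selenium framework the predicate would check element presence, visibility, or text rather than a dictionary flag.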

More Information
Learn more about Chris Loder.
W4 The New Agile Testing Quadrants: Bringing Skilled Testers and Developers Together
James Bach, Satisfice, Inc.
Michael Bolton, DevelopSense
Wednesday, May 6, 2015 - 11:30am - 12:30pm

You want to integrate skilled testing and development work. But how do you accomplish this without developers accidentally subverting the testing process or testers becoming an obstruction? Efficient, deep testing requires “critical distance” from the development process, commitment and planning to build a testable product, dedication to uncovering the truth, responsiveness among team members, and often a skill set that developers alone—or testers alone—do not ordinarily possess. James Bach and Michael Bolton present a model that is a redesign of the famous Agile Testing Quadrants, which distinguished business-facing vs. technical-facing tests and tests that support vs. critique the product. Their new model frames these dynamics and helps teams think through the nature of development and testing roles and how they might blend, conflict, or support each other on an agile project. James and Michael include a brief discussion of the original Agile Testing Quadrants model, which the presenters believe has created much confusion about the role of testing in agile.

More Information
Learn more about James Bach and Michael Bolton.
W5 Unleashing the Creative Mind of the Test Engineer
Audrey Marak, AmerisourceBergen Corporation
Wednesday, May 6, 2015 - 11:30am - 12:30pm

Do we each have a natural capacity for creativity? Can creativity be learned or enhanced? How do we ignite inventiveness? To be competitive in today’s world, it’s important to creatively respond to unanticipated challenges, make new connections, and adapt and continually improve. The good news is that our brains are built for creative problem solving, and it’s easy to discover and enhance our natural inventiveness. Just as scientists adopt scientific methods to design experiments and unravel the mysteries of the world, we need a complementary set of tools and techniques―creative thinking―when we want to invent rather than discover. We each have creative genius waiting to be unlocked! Join Audrey Marak to explore a set of methods and environmental factors you can use to enhance your imagination and increase your ability to create innovative ideas. It’s time to make creative thinking a core part of our development and to reinforce these lessons throughout our lives.

More Information
Learn more about Audrey Marak.
W6 Deliberate Discovery
Dan North, Dan North & Associates
Wednesday, May 6, 2015 - 11:30am - 12:30pm

Modern software delivery involves the decomposition of a problem into packets of business and technical analysis, design, architecture, programming, testing, integration, deployment, documentation, and training. No matter how well-intentioned our iterative or sequential approach is to these activities, our success rate of software delivery is still far below what it should be. Advances in these disciplines haven’t reduced the unpleasant surprises that occur uncomfortably late in projects. Dan North thinks it's because we are focusing on the wrong things, which means that any software delivery is merely a happy accident. He explains why ignorance is the greatest enemy to success, and presents some strategies and techniques for deliberately reducing ignorance, increasing learning, and moving toward a more deterministic and lower risk software delivery. We don’t like hearing bad news and will happily delude ourselves into thinking that things are better than they are. So this session probably isn’t for you—except that it is.

More Information
Learn more about Dan North.
W7 The Changing Face of Test Management in an Agile World
Tom Roden, Neuri Consulting
Ben Williams, Neuri Consulting
Wednesday, May 6, 2015 - 1:45pm - 2:45pm

Test management doesn't exist in the world of agile, or rather test managers don't—or do they? Agile methods such as Scrum have many traditional test management activities built in. With practices like self-organizing teams, role blurring, and skill diversification, the face of test management is changing. But is that a bad thing? Tom Roden and Ben Williams explore the key tenets of test management in an agile context, the likely dispersal of traditional responsibilities, and the profound effect on teams and managers. Hear their firsthand experiences, some new and radical ideas, and research from test management practitioners worldwide in organizations transforming to agile methods. As a test manager and leader, learn how to prepare yourself to adapt and thrive in a changing landscape. As an agile tester or team member, challenge yourself to answer questions about the maturity of your team’s testing capability.

More Information
Learn more about Tom Roden and Ben Williams.
W8 Harness the Power of Checklists
Kirk Lee, Infusionsoft
Wednesday, May 6, 2015 - 1:45pm - 2:45pm

As testers, we can feel overwhelmed by the sheer volume of things that require our attention. We are pressured to meet the demands of a fast-paced development environment while grappling with the extreme complexities inherent in today’s software. How can we remember everything while prioritizing our work in a way that allows us to test thoroughly and with confidence? Kirk Lee shares how the proper use of checklists provides a lightweight yet powerful solution. Kirk explains how checklists can prevent forgetfulness, assist in assessing what is really important, and most importantly, free our minds from the mundane so we can reach the deeper levels of thought required to find the nastiest bugs. Take back foundational checklists and learn how to adapt them to your specific circumstance through defect and root-cause analysis. Kirk shares checklists focusing on quality attributes, test types, and heuristics, as well as functional, security, performance, automation, and mobile testing.

More Information
Learn more about Kirk Lee.
W9 Leveraging Open Source Automation: A Selenium WebDriver Example
David Dang, Zenergy Technologies
Wednesday, May 6, 2015 - 1:45pm - 2:45pm

As online activities create more revenue, organizations are turning to Selenium to test their web applications and to reduce costs. Since Selenium is open source, there is no licensing fee. However, as with purchased tools, the same automation challenges remain, and users do not have formal support and maintenance. Proper strategic planning and use of advanced automation concepts are musts to ensure successful Selenium automation efforts. Sharing his experience designing and implementing advanced automation frameworks using Selenium WebDriver, David Dang describes the factors necessary to ensure open source automation is right for your project. David helps you understand the real effort required to implement WebDriver in a way that will scale and minimize script development. Additionally, he dives into must-haves in your Selenium framework design; the resource and timeline considerations necessary to implement WebDriver; and the long-term, continual improvement enhancements all automation engineers should consider in their Selenium automation implementations.

More Information
Learn more about David Dang.
W10 Risk-Based Testing for Agile Projects
Erik van Veenendaal, Improve IT Services BV
Wednesday, May 6, 2015 - 1:45pm - 2:45pm

Many projects implicitly use some kind of risk-based approach for prioritizing testing activities. However, critical testing decisions should be based on a product risk assessment process using key business drivers as its foundation. For agile projects, this assessment should be both thorough and lightweight. Erik van Veenendaal discusses PRISMA (PRoduct RISk MAnagement), a highly practical method for performing systematic product risk assessments. Learn how to employ PRISMA techniques in agile projects using Risk Poker. Carry out risk identification and analysis, see how to use the outcome to select the best test approach, and learn how to transform the result into an agile one-page sprint test plan. Erik shares practical experiences and results achieved by employing product risk assessments. Learn how to optimize your test effort by including product risk assessment in your agile testing practices.
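The core arithmetic of a product risk assessment like the one described above is ranking risks by likelihood times impact. A minimal sketch of that idea, with illustrative 1-5 scales and invented feature names (not material from the session):

```python
def prioritize(risk_items):
    """Rank product risks by likelihood x impact (both on a 1-5 scale),
    the basic arithmetic behind a PRISMA-style assessment. The highest
    scores get the deepest test coverage; the lowest may warrant only a
    smoke check."""
    return sorted(risk_items,
                  key=lambda r: r["likelihood"] * r["impact"],
                  reverse=True)

# Hypothetical backlog items scored during a Risk Poker session:
backlog = [
    {"feature": "payment processing",   "likelihood": 4, "impact": 5},  # 20
    {"feature": "profile avatar upload", "likelihood": 3, "impact": 1},  # 3
    {"feature": "session handling",      "likelihood": 2, "impact": 4},  # 8
]
ranked = prioritize(backlog)
assert ranked[0]["feature"] == "payment processing"
```

The resulting ranking can then feed directly into a one-page sprint test plan: test approach and depth per item, ordered by score.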

More Information
Learn more about Erik van Veenendaal.
W11 Measuring Quality: Testing Metrics and Trends in Practice
Liana Gevorgyan, Infostretch Corporation
Wednesday, May 6, 2015 - 1:45pm - 2:45pm

In today's fast-paced IT world, companies follow “best” testing trends and practices with the assumption that, by applying these methodologies, their product quality will improve. But that does not always happen. Why? Liana Gevorgyan questions and defines, in the language of metrics, exactly what is expected to change or improve, and how to implement these improvements. While your project is in progress, choosing the right metrics and looking at their trends help you understand what must change to improve your methodology. Metrics—customer satisfaction, critical/blocking issues ratio with trends for each iteration, gap analysis results and improvement metrics, automation scripts, and test case coverage—and their priority are defined by assigning a weight to each based on current project size, process model, technology, time, and goal. With a long list of metrics and measurement techniques, learn to drill down to what really makes sense in your organization. Develop a model that meets your needs and evaluates changes more effectively.
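One common way to combine weighted metrics like those described above is a weighted average of normalized scores. The sketch below is purely illustrative (metric names and weights are invented, not from the session):

```python
def quality_score(metrics, weights):
    """Aggregate normalized metrics (each scaled to 0-1) into a single
    weighted score. Weights encode each metric's priority for this
    project; they need not sum to 1."""
    total_weight = sum(weights.values())
    return sum(metrics[name] * weight
               for name, weight in weights.items()) / total_weight

# Hypothetical iteration snapshot:
metrics = {
    "customer_satisfaction": 0.80,
    "blocking_issue_ratio": 0.95,   # 1.0 means no blocking issues
    "test_case_coverage": 0.60,
}
weights = {
    "customer_satisfaction": 5,
    "blocking_issue_ratio": 3,
    "test_case_coverage": 2,
}
score = quality_score(metrics, weights)  # ~0.805 on this data
assert 0.0 <= score <= 1.0
```

Tracking this aggregate per iteration gives the trend line; drilling into the individual terms shows which metric moved it.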

More Information
Learn more about Liana Gevorgyan.
W12 Eliminate Regression Testing through Continuous Deployment
Matthew Heusser, Excelon Development
Wednesday, May 6, 2015 - 1:45pm - 2:45pm

Most traditional teams do testing at least twice—once during development as new features are created and again during release candidate testing right before release. As a system grows, regression testing takes more and more time, making tight releases impossible—or at least risky—and adding to the burden of maintaining automated tests. Matt Heusser suggests that adopting continuous integration (with its continuous testing) and continuous delivery (with its associated production monitoring) can eliminate the need for classic regression testing. In addition to advanced strategies like configuration flags and incremental roll-out, Matt describes the change in risks as teams deliver more often, the origins of long regression cycles, and small steps that can have a big impact on software team performance. Leave with examples, stories, things to consider, a possible roadmap—and the information you need to know if the roadmap is worth pursuing.
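A configuration flag with incremental roll-out, as mentioned above, is often implemented by hashing each user into a stable bucket. A sketch of that common pattern (flag name and user IDs are hypothetical, not code from the talk):

```python
import hashlib

def is_enabled(flag, user_id, rollout_percent):
    """Deterministically place a user in or out of a flag's rollout.

    Hashing flag+user gives each user a stable bucket from 0-99, so
    raising rollout_percent only ever adds users; nobody flips back
    and forth between requests or releases.
    """
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < rollout_percent

# Roll a new checkout flow out to 10% of users first...
early = [u for u in range(1000) if is_enabled("new-checkout", u, 10)]
# ...then widen to 50% once production monitoring looks healthy.
wider = [u for u in range(1000) if is_enabled("new-checkout", u, 50)]
assert set(early) <= set(wider)  # early adopters stay enabled
```

Because exposure is controlled at runtime, a bad change can be rolled back by lowering the percentage instead of redeploying, which is part of what makes the release-candidate regression phase less necessary.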

More Information
Learn more about Matthew Heusser.
W13 Speak Like a Test Manager
Mike Sowers, Software Quality Engineering
Wednesday, May 6, 2015 - 3:00pm - 4:00pm

Ever feel like your manager, development manager, product manager, product owner, or ____ (you fill in the blank) is not listening to you or your team? Are you struggling to make an impact with your messages? Are you “pushing a wet rope uphill” in championing product quality? Are you talking, but no one is listening? Mike Sowers shares practical examples of how to more effectively speak like a test manager and offers concrete advice based on his experiences in the technology, financial, transportation, and professional services sectors. Mike discusses communication and relationship styles that work—and some that have failed—and shares key principles (e.g., seeking to understand), approaches (e.g., using facts), and attributes (e.g., being proactive) to help you grow and prosper as a test manager. Leave with practical ideas to boost your communications skills and influence to become a trusted advisor to your team and your management.

More Information
Learn more about Mike Sowers.
W14 Static Testing: We Know It Works, So Why Don’t We Use It?
Meenakshi Muthukumaran, Tata Consultancy Services
Wednesday, May 6, 2015 - 3:00pm - 4:00pm

We know that static testing is very effective in catching defects early in software development. Serious bugs, like race conditions which can occur in concurrent software, can't be reliably detected by dynamic testing. Such defects can cause a business major damage when they pop up in production. Despite its effectiveness in early defect detection and ease of use, static testing is not very popular among developers and testers. Meena Muthukumaran discusses reasons why static testing is not commonly used or not used optimally: lack of awareness, lack of time, and myths about cost and effort requirements. Meena explains ways to perform effective static testing—identifying your needs, shortlisting the tools based on your needs, creating awareness and a culture for proactively eliminating defects early in the lifecycle, and encouraging effective usage of static testing. She offers various implementation solutions to suit different development methodologies and ways to measure the benefits realized with static testing.

More Information
Learn more about Meenakshi Muthukumaran.
W15 Reduce Third-Party Tool Dependencies in Your Test Framework
Chris Mauck, Neustar, Inc.
Wednesday, May 6, 2015 - 3:00pm - 4:00pm

Have you found yourself forced to use outdated test tools because the cost to migrate was prohibitive? Have you abandoned or rewritten existing tests because it was easier (and cheaper) than migrating? With technology ever changing, most businesses struggle to keep up with producing high-quality products for the lowest price possible. And it is usually testers who suffer the most, as they are forced to use tools that are outdated, or no longer supported, because the company cannot afford the migration cost. Chris Mauck offers a new way to design your automation tests to reduce the third-party tool dependencies in your current test framework and significantly shorten the time required to migrate those tests in the future. Using real coding examples Chris explains the approach, design, and implementation. Learn a different way to structure your tests and how you can implement better coding practices across your team.

More Information
Learn more about Chris Mauck.
W16 Testers and Testing: A Product Owner’s Perspective
Scott Barber, PerfTestPlus, Inc.
Wednesday, May 6, 2015 - 3:00pm - 4:00pm

Testers frequently feel that they and their contributions to delivering software are undervalued. These feelings may stem from patterns of important defects being de-prioritized, receiving lower salaries than their peers who code, being assigned seemingly pointless tasks, or being expected to “test comprehensively” with insufficient time and resources (that tend to shrink as the target release date approaches). If you’ve experienced these feelings, you’ve probably wondered “What does senior management value if not the information testers provide?!?” If so, here are some answers. After fifteen years of working primarily in and around testers and testing, Scott Barber had the opportunity to serve as a product owner for a family of products. Join Scott as he shares lessons he learned, responsibilities he was given, ways his own thinking about software testing and testers evolved, and the somewhat surprising expectations he came to have of testers and testing for his products—after he became “senior management.”

More Information
Learn more about Scott Barber.
W17 Metrics Program Implementation: Pitfalls and Successes
Kris Kosyk, SoftServe
Wednesday, May 6, 2015 - 3:00pm - 4:00pm

When we talk about product quality, test team efficiency, and productivity, we always talk numbers. However, very few companies implement metrics programs in a way that supports solid decision making. Many have tried and failed, leaving a negative impression of metrics. Kris Kosyk explains what a metric like Defect Removal Efficiency tells us and how it is affected by Test Coverage and Defect Backlog Change Rate. Moving up a level, Kris explains how to use operational testing metrics to understand the development lifecycle process. Though it’s a common belief that a successful metrics program depends on the metrics selected, that is really only half the battle. The other half is a well-designed implementation of the metrics program and effective ongoing governance. Kris addresses these issues and other related questions, and shares a case study on her successes and mistakes while implementing a company-wide test metrics program for more than 200 projects.
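Defect Removal Efficiency, one of the metrics mentioned above, is commonly computed as the share of all known defects caught before release. A sketch under that common definition (exact counting rules, such as severity filters or time windows, vary by organization):

```python
def defect_removal_efficiency(found_before_release, found_after_release):
    """DRE: percentage of all known defects that were removed before
    release. 100% means nothing known escaped to production."""
    total = found_before_release + found_after_release
    if total == 0:
        return 100.0  # no defects found anywhere
    return 100.0 * found_before_release / total

# 180 defects caught in testing, 20 escaped to production:
assert defect_removal_efficiency(180, 20) == 90.0
```

Note the dependency Kris highlights: DRE can only count defects that were found, so weak test coverage inflates it, which is why it should be read alongside coverage and backlog-trend metrics rather than in isolation.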

More Information
Learn more about Kris Kosyk.
W18 Testing Blockbuster Games: Lessons for All Testers
Tulay Tetiker McNally, BioWare Electronic Arts
Alex Lucas, BioWare Electronic Arts
Wednesday, May 6, 2015 - 3:00pm - 4:00pm

We can all learn valuable lessons from game development where, in addition to functional performance, overall experiential quality—user experience (UX)—is of critical importance. Blockbuster game development presents particular challenges with regard to scale, rapid iteration, and fuzzy requirements. Learn from Tulay McNally and Alex Lucas how BioWare QA participates in development from concept through release, employs key methodologies like session-based and agile testing, and provides a path for Video Game Testing as a career. Additionally, discover how Tulay and Alex take quality engineering beyond test automation by eliminating broken builds, enhancing tester capacity and accuracy, employing machine learning, and developing industry-leading telemetry and data visualization solutions. Learn how to meet these challenges with an embedded model―one that partners QA with developers―and an aggressive QA technology roadmap. Take back new ideas and approaches for meeting consumer and customer demand for higher interactivity and deeper levels of engagement.

More Information
Learn more about Tulay Tetiker McNally and Alex Lucas.
T1 Stop Maintaining Multiple Test Environments
Joel Tosi, DevJam
Thursday, May 7, 2015 - 9:45am - 10:45am

Today, most of us struggle with non-production environments. The test data is wrong or inconsistent, the dependencies are mismanaged, or “they just aren't quite like production.” Instead of striving for simpler environments, most organizations add test environments―pre-prod, UAT, stage, QAB, and so on―and end up spending more and more time troubleshooting and maintaining environments rather than building and learning. It does not have to be this way. Joel Tosi shares his experience working with many large organizations on paths that start with DevOps and continuous delivery yet ultimately lead to the need to simplify test environments. Using simple examples, Joel explains how teams should stop pushing applications through environments and instead pull them through tests. Leave with a fresh perspective on how you can simplify your testing strategies and ultimately stop creating and maintaining separate test environments.

More Information
Learn more about Joel Tosi.
T2 Mindmaps: Lightweight Documentation for Testing
Florin Ursu, DMEautomotive
Thursday, May 7, 2015 - 9:45am - 10:45am

Quality starts with requirements. In small to mid-size companies, it is not uncommon for the communication chain to be broken. Florin Ursu shares ways to avoid miscommunication through a streamlined process in which requirements are communicated to both developers and testers simultaneously; then developers write code while testers document what will be tested. Florin explores what mindmaps are; what they can be used for, both in general and applied to software development; and then dives deeper into how mindmaps can be used for testing. He describes how his teams use mindmaps to brainstorm, organize testing scenarios, prioritize work, review test scenarios, and present results to stakeholders, highlighting what was tested and (just as important) what was not tested, issues found, and risks. Using example mindmaps, Florin highlights important details captured in day-to-day work, including tips regarding format, communication style, and how to “sell” the idea of mindmaps to your stakeholders.

More Information
Learn more about Florin Ursu.
T3 Verify Complex Product Migrations with Automation
Marquis Waller
Jeff Sikkink
Thursday, May 7, 2015 - 9:45am - 10:45am

In the world of agile, automation is king. When faced with testing multiple versions of software, either while migrating or supporting multiple versions in the field, many teams give up, convinced that automation cannot be achieved. Marquis Waller and Jeff Sikkink provide insights into how using tools—Jenkins, VMware API, Selenium, and others—can allow you to create a rich set of migration tests. They discuss the challenges they face maintaining migration testing for a large enterprise workflow product that runs on three different operating systems (AIX, Linux, Windows). Marquis and Jeff share how they overcame un-automatable software to create a system that tests more than thirty different migration scenarios and runs thousands of automated Selenium test cases after each software build. Providing error reports, logging for defect correction, and significant time savings, this system allows the team to focus on new software development.

More Information
Learn more about Marquis Waller and Jeff Sikkink.
T4 Mobile App Testing: The Good, the Bad, and the Ugly
Jon Hagar, Independent Consultant
Thursday, May 7, 2015 - 9:45am - 10:45am

Mobile app testing has lots of good practices, some not so useful (bad) concepts, and some really ugly, don’t-ever-do ones. In the tradition of James Whittaker’s How to Break Software books, Jon Hagar applies the testing “attack” concept to mobile app software. Jon starts by defining the big problems and challenges of testing mobile app software and examines the patterns of product failures that you must attack. He then shares a set of good, bad, and ugly test techniques, which testers and developers can direct against their software to find important bugs quickly. Looking at native, web-based, and hybrid apps, Jon explains the pros and cons of each technique with examples to further your understanding. Finally, he gives you takeaway information on tools, automation, and test attacks you can begin using immediately. Go beyond basic functionality verification and learn how to attack your mobile apps with the best techniques while avoiding the ugly ones.

More Information
Learn more about Jon Hagar.
T5 Release Automation: Better Quality, Faster Deployment, Amazing ROI
Bryan Linder, tap|QA
Thursday, May 7, 2015 - 9:45am - 10:45am

A great deal of confusion surrounds the concepts of release automation, continuous integration, continuous delivery, and continuous deployment. Even some industry experts are confused about the differences. How these concepts work progressively to achieve high quality software delivery is generating a lot of discussion and controversy. Bryan Linder defines the methodology, processes, and tools associated with release automation, as well as the differences between its maturity levels. Understand the benefits of more frequent, smaller releases, and the exponential risk generated by large, infrequent releases. Hear highlights of industry case studies that demonstrate the substantial speed, quality, and ROI gains of improving your release automation process. Acquire the insight and motivation needed to take the next step—from wherever your organization is now—toward full release automation. Takeaways include a glossary of terms, a continuous integration tools comparison chart, and a release automation maturity chart.

More Information
Learn more about Bryan Linder.
T6 Improve Your Test Process from the Bottom Up
Gitte Ottosen, Capgemini-Sogeti Denmark
Thursday, May 7, 2015 - 9:45am - 10:45am

Test process improvement can be done in many ways. In a top-down approach, a central organization does all the planning, and implementation happens when everything is ready. In a bottom-up approach, improvements are developed and implemented in individual projects and then spread throughout the organization. Gitte Ottosen shares her experiences implementing test process improvements in both small projects and large organizations, with a primary focus on the bottom-up approach, which gives the testing community a high degree of ownership and commitment, both important factors when implementing any process change. You need the overall test community to buy in to the thoughts and methodology behind the process, and you need them to support the implementation. Having a clear goal and knowing the current state of the process are necessary: you need to know where you are and where you need to go in order to draw the route.

More Information
Learn more about Gitte Ottosen.
T7 Avoid Testing Mistakes or Really Bad Things Can Happen
Bart Knaack, Professional Testing
Thursday, May 7, 2015 - 11:15am - 12:15pm

In our work we assess the quality of software to give well-grounded advice on the “go live” decision. We test software to prevent bad things from happening to users once the software is deployed. However, in some cases, the mere act of testing breaches safety barriers and can put companies on the spot, causing embarrassment, damage, or even death. The worst test ever to go bad—the Chernobyl meltdown, which by some estimates cost approximately 200,000 lives―was caused by a stress test executed in production. By analyzing a number of real-life testing “accidents” of this category, Bart Knaack helps us understand how to prevent them. The accidents Bart describes have resulted in front-page news, millions in damage, or embarrassment at the C-level. Bart goes through the examples, challenging the audience to discover solutions to prevent testing accidents from happening to you. He hopes you will take home these lessons learned and apply them in your world.

More Information
Learn more about Bart Knaack.
T8 Predict Defects with Data Mining and Machine Learning
Stephen Frein, Comcast
Thursday, May 7, 2015 - 11:15am - 12:15pm

Quality assurance professionals have an arsenal of tried-and-true techniques for assessing and improving quality. Many of these revolve around the concept of risk. When quality professionals focus on risk, they generally focus on areas where defects would be the most damaging, rather than areas in which defects are most likely to be found. In recent years, the maturation of big data mining and predictive analysis tools has made it practical to predict where defects in an application are likely to reside. Stephen Frein describes his recent experiments with data mining and machine learning tools that can predict where defects are likely to appear. Learn how word clouds can point out the user stories most likely to harbor defects. Explore ways to identify and characterize your most defect-prone configuration items. Learn how modern analysis tools can reveal statistical patterns that are beyond the reach of human intuition and insight, and how these patterns can alert us to where defects may appear.
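As a toy illustration of the text-mining idea above, one can score new user stories by how much their wording resembles stories that produced defects in the past. This naive word-level log-odds sketch is invented for illustration, not the tools or models from the session:

```python
from collections import Counter
import math
import re

def train(stories):
    """Count how often each word appears in defect-laden vs. clean
    user stories from past iterations."""
    buggy, clean = Counter(), Counter()
    for text, had_defect in stories:
        words = re.findall(r"[a-z]+", text.lower())
        (buggy if had_defect else clean).update(words)
    return buggy, clean

def defect_score(text, buggy, clean):
    """Sum per-word log-odds of appearing in buggy stories, with
    add-one smoothing. Higher score = looks more like past buggy work."""
    total_b = sum(buggy.values()) + 1
    total_c = sum(clean.values()) + 1
    score = 0.0
    for w in re.findall(r"[a-z]+", text.lower()):
        score += math.log((buggy[w] + 1) / total_b)
        score -= math.log((clean[w] + 1) / total_c)
    return score

# Hypothetical history: billing work tended to produce defects.
history = [
    ("migrate legacy billing export", True),
    ("refactor billing tax rules", True),
    ("update help page copy", False),
    ("change button label text", False),
]
buggy, clean = train(history)
assert defect_score("billing rules change", buggy, clean) > \
       defect_score("help page label", buggy, clean)
```

Real setups would use richer features (components, churn, authorship) and proper classifiers, but the principle is the same: let historical defect data rank where to look first.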

More Information
Learn more about Stephen Frein.
T9 Automate Legacy-System Testing: Easy, Reliable, and Extendible
Emanuil Slavov, Komfo, Inc.
Thursday, May 7, 2015 - 11:15am - 12:15pm

Everyone loves working on a greenfield project. You’re starting fresh and nothing holds you back. Unfortunately, for most testers, this is a rare occurrence. Chances are you will work on legacy applications. Because these often have no automated tests, developers are afraid to make bold changes. More testers than developers can be assigned to these projects. Changing one line of code may require multiple days of manual testing. Eventually work grinds to a halt. Sound familiar? Emanuil Slavov explains how to deal with this sticky situation without losing your mind. Start small and work outside in. Learn how to combine the best practices of automated acceptance tests, unit tests, static code analysis, continuous integration, and architecture for testability. Discover how to make your automated tests more reliable, easy to support, and a breeze to extend. Emanuil’s presentation is inspired by his real-life experience—working on legacy projects for more than five years.

More Information
Learn more about Emanuil Slavov.
T10 Designing a Robust Test Strategy for Mobile Apps
Parimala Hariprasad, Amadeus Software Labs India Pvt. Ltd
Thursday, May 7, 2015 - 11:15am - 12:15pm

Every day thousands of mobile apps are built, and many are released with poor quality. Dozens of new mobile devices become available every day. Immense pressure mounts on organizations to test mobile apps with shorter go-to-market cycles. Mobile app testing becomes overwhelming due to multiple platforms, varying OS versions, device manufacturers, screen resolutions, and more. Parimala Hariprasad presents an approach to designing test strategies for mobile apps. She addresses such questions as: What devices to test? How to select them? Can we use simulators/emulators? How to handle fragmentation challenges? Which platforms are good enough? Parimala shares her experience, and highlights how analytics and user reviews can facilitate the creation of a good test strategy that evolves over time and balances tradeoffs between cost, quality, and time-to-market in the constantly changing mobile market. Key takeaways include learning about fragmentation, the shotgun approach, mobile personas, and using analytics to fine-tune the test strategy.

More Information
Learn more about Parimala Hariprasad.
T11 Continuous Testing in the Cloud
Chris Broesamle, Sauce Labs
Thursday, May 7, 2015 - 11:15am - 12:15pm

Are you looking to fulfill the promise of continuous delivery (CD), a process that accelerates the release of software through automation and the practice of continuous integration (CI)? Chris Broesamle can help with that. Explore how to create a full CD solution entirely in the cloud using GitHub, Selenium, Sauce Labs, and a Travis CI server. Chris shows you how you can take advantage of these open source and hosted development resources to increase the velocity of your releases and improve the application quality your users demand. Learn how you can securely execute Selenium tests in parallel and at scale on a grid of virtual machines with the ability to configure and test against browser, OS, platform, and device combinations, dramatically reducing the time it takes to run critical integration and acceptance tests on your web applications. Finally, realize the dream of continuous delivery through continuous testing in the cloud.

More Information
Learn more about Chris Broesamle.
T12 Continuous Test Improvement in a Rapidly Changing World
Martin Pol, Polteq Testing Services BV
Thursday, May 7, 2015 - 11:15am - 12:15pm

Classical test process improvement models no longer fit in organizations adopting the newest development approaches. Instead, a more flexible approach is required today. Solutions like SOA, virtualization, web technology, cloud computing, mobile, and the application of social media have dramatically changed the IT landscape. In addition, we are innovating the way we develop, test, and manage. Many organizations are moving toward a combination of agile/scrum, context-driven testing, continuous integration and delivery, DevOps, and TestOps. Effective test automation has become a prerequisite for success. All of this requires a different way of improving testing, an adaptable way that responds to innovations in both technology and development. Martin shares a roadmap that enables you to translate the triggers and objectives for test improvement into actions that can be implemented immediately. Learn how to achieve continuous test improvement in any situation, and take away a practical set of guidelines to enable a quick start.

More Information
Learn more about Martin Pol.
T13 What Do Defects Really Cost? Much More Than You Think
Wayne Ariola, Parasoft
Thursday, May 7, 2015 - 1:30pm - 2:30pm

As software increasingly becomes the face of the business, defects can lead to embarrassment, financial loss, and even business failure. Nevertheless, in response to today's demand for speed and “continuous everything,” the software delivery conveyer belt keeps moving faster and faster. It's foolhardy to expect that speeding up an already-troubled implementation process will achieve the desired results. Wayne Ariola shares why and how to evolve from automated to continuous testing and discusses the methods to help you do so. Explore how to establish quality gates that continuously measure software vs. business expectations, allowing you to confidently and automatically promote software from one phase of the SDLC to the next. Learn strategies—how to promote collaborative risk reduction, collapse remediation cycle time, and establish a feedback loop for defect prevention—to remove SDLC constraints without compromising quality.

More Information
Learn more about Wayne Ariola.
T14 Survival Guide: Taming the Data Quality Beast
Shauna Ayers, Availity
Thursday, May 7, 2015 - 1:30pm - 2:30pm

As companies scramble to adjust to the demands of an increasingly data-driven world, testers are told “go test data quality” without any guidance as to what that entails or how to go about it. The fact that the data is often a living, flowing ecosystem, rather than just a single object, requires the use of different strategies to gain meaningful insights. Shauna Ayers and Catherine Cruz Agosto guide you through the challenges of data quality and apply a structured approach to analyze, measure, test, and monitor living data sets, and gauge the business impact of data quality issues. Shauna and Catherine define data quality, describe the five goals of data quality management, provide the four pillars of data quality assurance, and show how data flow, scale, and properties interact to build the data quality landscape. Learn how to tame the data quality beast, determine what and how to test, overcome technical obstacles—and emerge with a usable plan of attack.

More Information
Learn more about Shauna Ayers and Catherine Cruz Agosto.
T15 Implement an Enterprise Performance Test Process
Ryan Riehle, InCycle Software
Thursday, May 7, 2015 - 1:30pm - 2:30pm

Suddenly, application performance is important to your business, and you have been given the budget to improve it. You’re in a hurry because customers are complaining or because you expect jumps in transaction volume and your application needs to scale quickly. Do you know where to start? Join Ryan Riehle as he shares his experiences developing enterprise performance testing programs. Ryan covers the key techniques and heuristics that lead to an effective performance improvement effort. He discusses patterns teams use to effectively collaborate to achieve performance requirements, how to configure and organize test environments, considerations for application deployment and release cycles, appropriate metrics to use and how to report them, and strategies and techniques for data movement that support reproducible test results. But measuring alone does not solve the performance problem. So Ryan discusses how teams can act on testing results to improve and verify the impact of application and infrastructure changes.

More Information
Learn more about Ryan Riehle.
T16 Testing with a Rooted Mobile Device
Max Saperstone, Coveros
Thursday, May 7, 2015 - 1:30pm - 2:30pm

Traditional applications are tested through the GUI and through all exposed APIs. However, typical mobile app testing is only done through the front-end GUI. In addition, performance and security details are not readily available from the mobile device. Max Saperstone demonstrates some benefits of testing a native mobile application on a rooted device—one with privileged access control. Although Max does not describe how to root a device, he shares how to access back-end processes and test at this detailed level. He discusses the technical controls made available through a rooted device—together with its auditing, logging, and monitoring—and describes the gathering of additional metrics. Max demonstrates tools for penetration testing, sniffing, and network hacking; shares how to access application data directly; and shows how data security is implemented for the application. Learn how to use the admin rights associated with a rooted device to examine device performance and to simulate interrupts and system faults.

More Information
Learn more about Max Saperstone.
T17 Security Testing: What Testers Can Do
Declan O'Riordan, Test and Verification Solutions
Thursday, May 7, 2015 - 1:30pm - 2:30pm

Thousands of times each day, network perimeter security defenses fail to recognize new and obfuscated attacks. Rather than attempting to build security firewalls, Declan O’Riordan asserts that project teams must design, code, and test security into applications―and that requires skills that are in short supply. As testers, we need to recognize which security tests we can perform and which require delegation to experts. Let’s stop our passive acceptance of designs that are weak on security and instead conduct analysis of the security features before we plan the system testing. As a tester, examine how the developers are coding, and verify their use of secure coding guidelines. Work through a security testing example and identify its authentication, access control, and session management functionality. Acquire the skill to distinguish tests you can handle—e.g., incomplete validation of credentials and unprotected functionality—from the tests you need to delegate to experts—e.g., brute-force login and predictable session tokens.

More Information
Learn more about Declan O'Riordan.
T18 Testing as a Service (TaaS): A Solution to Hard Testing Problems
Scott Tilley, Florida Institute of Technology
Thursday, May 7, 2015 - 1:30pm - 2:30pm

Some problems in software testing seem timeless. Other challenges—including SOA and cloud computing—arise due to the introduction of new technologies. Scott Tilley has led a three-year project at the Florida Institute of Technology to identify hard problems in software testing as voiced by leading practitioners in the field. The problems were identified through a series of workshops, interviews, and surveys. Some of the problems—education and training, lack of good tools, and unrealistic schedules—are timeless; others, such as agility and system security, are emerging as increasingly important. Are your software testing pain points more common than you think? Can TaaS help with your specific problems? Learn the answers to these questions and more. Return to the office knowing that you are not alone, and help is available.

More Information
Learn more about Scott Tilley.
T19 Create Products That Customers Love: A Testing Perspective
Thursday, May 7, 2015 - 3:00pm - 4:00pm

Have you ever stood in line at midnight to buy the latest release of a product? Have you worked on a product that created such delight in customers that they camped out overnight to be the first to buy it? Though this level of customer devotion is rare, it is possible to create everyday products that your customers will love. In the past, designers and developers have received the lion’s share of the credit, but the role of quality teams is just as important in creating this level of success. From being the defender of the customer experience, to working directly with customers, to providing feedback to designers, testers make significant contributions. Stephen Hares describes actionable items—working closely with customers, treating product requirements as a quality deliverable, and modeling test strategies to be customer-centric. Learn to be more actively and effectively involved in the development of—and champions for—products that customers love.

More Information
Learn more about Steve Hares.
T20 Virtualization to Improve Speed and Increase Quality
Thursday, May 7, 2015 - 3:00pm - 4:00pm

Many development and test organizations must work within the confines of compressed release cycles, various agile methodologies, and cloud and mobile environments for their business applications. So, how can test organizations keep up with the pace of development and increase the quality of their applications under test? Clint Sprauve describes how service virtualization and network virtualization can help your team improve speed and increase quality. Learn how to use service virtualization to simulate third-party or internal web services to remove wait times and reduce the need for high-cost infrastructures required for testing. Take back techniques for incorporating network virtualization into the testing environment to simulate real-world network conditions. Learn from Clint how the combination of service and network virtualization allows teams to implement a robust and consistent continuous testing strategy to reduce defects in production applications.

More Information
Learn more about Clint Sprauve and Todd DeCapua.
T21 Performance Testing in the Agile Lifecycle
Lee Barnes, Utopia
Thursday, May 7, 2015 - 3:00pm - 4:00pm

Traditional large-scale end-of-cycle performance tests served enterprises well in the waterfall era. However, as organizations transition to agile development models, many find their tried-and-true approach to performance testing—and their performance testing resources—becoming somewhat irrelevant. The strict requirements and lengthy durations just don’t fit in the context of an agile cycle. Additionally, investigating system performance at the end of the development effort misses out on the early-stage feedback offered by an agile approach. And it’s more important than ever that today’s agile-built systems perform. So how can agile organizations ensure optimum performance of their business critical systems? Lee Barnes discusses why agile teams need to change their thinking about performance from a narrow focus on testing to a broader focus on analysis—from a people, process, and technology perspective. Take back techniques for shifting your performance testing/analysis earlier in the development cycle and extracting performance data that is immediately actionable.

More Information
Learn more about Lee Barnes.
T22 How to Deliver Winning Mobile Apps
Joe Larizza, Royal Bank of Canada
Eran Kinsbruner, Perfecto Mobile
Thursday, May 7, 2015 - 3:00pm - 4:00pm

Do you find yourself confused about the definition of mobile testing? Do you understand the challenges of mobile testing and where to start? Is this your first mobile testing project? Joe Larizza and Eran Kinsbruner describe the techniques of mobile testing and the steps necessary to help testing teams transform to face these new challenges. Learn about test automation, testing tools, new methodologies—DevOps, DevTest, Shift Left and Right—and how to build a strategic mobile test road map to increase your market awareness and avoid common pitfalls affecting mobile testing teams. Discover from Joe and Eran how successful teams decide test coverage in this fast-paced IT world. Catch up with the latest industry trends, and learn shortcuts to successfully meet your future mobile testing needs. Finally, take these ideas and tailor them to fit your organization or project to lead your team into the mobile world.

More Information
Learn more about Joe Larizza and Eran Kinsbruner.
T23 Improve Security through Continuous Testing
Thursday, May 7, 2015 - 3:00pm - 4:00pm

Many companies develop strong software development practices that include ongoing testing throughout the development lifecycle. But they fail to account for the testing of security-related issues. This leads to security controls being tacked onto an application just before it goes to production. With security controls implemented in this manner, more security vulnerabilities are uncovered but there is less time to correct them. As more applications move to cloud-based architectures, this will become an even greater problem as some of the protection enjoyed by applications hosted on-site no longer exists. Jeremy Faircloth discusses a better approach—ensuring that testing throughout the development lifecycle includes the appropriate focus on security controls. Jeremy illustrates this through the establishment of security-related use cases, static code analysis, dynamic analysis, fuzzing, availability testing, and other techniques. Save yourself from last-minute security issues by proactively testing the security of your application!

More Information
Learn more about Jeremy Faircloth.
T24 Web and Mobile App Accessibility Testing
Nancy Kastl, SPR Consulting
Thursday, May 7, 2015 - 3:00pm - 4:00pm

If a website or mobile app is not accessible to all potential visitors, is it truly a quality product? Services, products, information, and entertainment on the web and mobile devices can be made available to millions of consumers with vision, hearing, or motor control difficulties by complying with accessibility standards. Assistive technologies enable access by converting the text and images of mobile screens and web pages into a computerized voice. But these technologies cannot interpret pages that are not built and tested for compliance to accessibility standards and programming guidelines. Join Nancy Kastl to learn about Section 508 and WCAG standards, Mobile Web Best Practices, and Apple and Android Developer Accessibility Guidelines. Learn how to test for accessibility on mobile devices and desktops using screen readers and open source tools. Become an advocate of accessible mobile apps and websites throughout the project lifecycle and add accessibility testing to your testing capabilities.

More Information
Learn more about Nancy Kastl.