
Concurrent Sessions

Sessions are offered on Wednesday and Thursday at the conference and do not require pre-selection. Build your own custom learning schedule, or choose to follow one of our tracks by topic area.

W1 Building Quality In: Adopting the Tester’s Mindset
Stephen Vance, Stellar Advances
Wednesday, October 15, 2014 - 11:30am - 12:30pm

When trying to improve the development process, agile and lean transformations often start by focusing on engineering. Product management and development get a lot of attention; however, the tester is not one of the defined Scrum roles. Despite the attention given to automated tests in agile, many transformations seem lost in knowing how to engage testers—and testers struggle to find their place. But the tester’s mindset―applying an investigatory and explorative perspective for empirical and hypothesis-driven improvement―is essential to agile transformation at all organizational levels, from the individual team to the board room. Stephen Vance shows how applying the tester’s mindset at the beginning of development more effectively supports efforts to build quality in—rather than detecting problems after they occur. Learn how to flip your thinking by applying the tester’s mindset to drive change, incorporating ideas from software craftsmanship, systems thinking, lean manufacturing, lean startup, Net Promoter System, and more.

W2 Testing Lessons Learned from Sesame Street
Rob Sabourin, AmiBug.com
Wednesday, October 15, 2014 - 11:30am - 12:30pm

Rob Sabourin has discovered testing lessons in The Simpsons, the Looney Tunes gang, great detectives, Dr. Seuss, and other unlikely places, but this year he journeys to Sesame Street. Sesame Street teaches basic life skills in a safe, entertaining, memorable style. Rob uses them to solve stubborn technical, management, and people-related testing problems. Oscar the Grouch guides us through failure mode analysis. Ernie and Bert help us tackle problems from different perspectives. Big Bird and Mr. Snuffleupagus teach about persistence, rhetoric, and bug advocacy. The Count misdirects management with fallacious metrics. And Kermit demonstrates that it is not easy being a tester, but we can make a difference by getting the right things done well. Sesame Street songs teach testing lessons, too. Rob performs a powerful affinity analysis singing "One of these things …". Enjoy testing lessons brought to you by Rob and your friends at Sesame Street.

W3 Why Automation Fails—in Theory and Practice
Jim Trentadue, Ranorex
Wednesday, October 15, 2014 - 11:30am - 12:30pm

Testers face common challenges in automation. Unfortunately, these challenges often lead to subsequent failures. Jim Trentadue explains a variety of automation perceptions and myths―the perception that a significant increase in time and people is needed to implement automation; the myth that, once automation is achieved, testers will not be needed; the myth that scripted automation will serve all the testing needs for an application; the perception that developers and testers can add automation to a project without additional time, resources, or training; and the belief that anyone can implement automation. The testing organization must ramp up quickly on the test automation process and the prep-work analysis that needs to be done, including when to start, how to structure the tests, and what system to start with. Learn how to respond to these common challenges: develop a solid business case for increased automation adoption, engage manual testers in the testing organization, stay technology agnostic, and stabilize test scripts regardless of application changes.

W4 A Tester’s Guide to Collaborating with Product Owners
Bob Galen, Velocity Partners
Wednesday, October 15, 2014 - 11:30am - 12:30pm

The role of the Product Owner in Scrum is only vaguely defined—owning the Product Backlog and representing the “customer.” In many organizations, Product Owners go it alone, trying their best to represent business needs to their teams. What’s often missing is a collaborative connection between the team’s testers and the Product Owner—a connection in which testers help to define and refine requirements, broaden the testing landscape and align it to customer needs, provide a conduit for collaboration between the customer and the team, assure that the team is building the right thing, and help demonstrate complete features. This relationship is central to the team and facilitates transparency to help gain feedback from the entire organization. Join seasoned agile coach Bob Galen as he shares techniques for doing just this. Return with new ideas and techniques for helping your Product Owner and team deliver better received and higher value products—not just by testing but by fostering collaboration.

W5 Growing into Leadership
Peter Walen, Gordon Food Service
Wednesday, October 15, 2014 - 11:30am - 12:30pm

Pete Walen is not going to tell you how to be a good test manager. Instead, Pete shares ideas on becoming a true leader. While some managers certainly are leaders, testers of all varieties and experience levels can become leaders. Developing technical leadership skills, regardless of job title, involves overcoming our own uncertainties, self-doubts, and perceptions. Learning to foster relationships while perfecting our craft is a challenge for everyone, particularly when others look to us to be an expert—even when we don’t feel like one. Pete presents choices, options, and paths available to software professionals, including opportunities for self-education, networking, and other professional and technical development. He describes how he learned to apply these lessons in day-to-day work situations, building skills for himself and his co-workers. In this interactive discussion, Pete shares his mistakes and successes, what he learned from each, and what opportunities there are for you to grow as a leader in your own right.

W6 Testing Compliance with Accessibility Guidelines
Anish Krishnan, Hexaware Technologies, Ltd
Wednesday, October 15, 2014 - 11:30am - 12:30pm

Currently, 2.4 billion people use the Internet, and about 10 percent of the world’s population has some form of disability. This means millions of potential users will have difficulty accessing the Internet. Thus, accessibility testing should not be ignored. Anish Krishnan discusses the importance of accessibility testing, reasons for considering accessibility issues while designing, and international Web accessibility laws. He shares effective techniques for carrying out accessibility testing, the potential scope of this testing, myths surrounding accessibility testing, and a set of automated tools to support this testing. Join Anish to learn about the Section 508 standards and how to test for web accessibility using screen readers and open source tools. Experience screen reader technology on both an accessible and non-accessible site. Learn how your test team can be advocates of accessible websites throughout the project lifecycle and add accessibility testing to your testing capabilities.
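By way of illustration, one of the simplest automated accessibility checks (flagging images that have no alternative text, part of what Section 508 and WCAG require) can be sketched with nothing but the Python standard library. The rule and names below are illustrative and not drawn from any specific tool in the session:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flags <img> tags with no alt attribute at all. Note that an empty
    alt="" is legitimate for purely decorative images, so it is not flagged."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(dict(attrs).get("src", "<no src>"))

def check_alt_text(html):
    """Return the src of every image a screen reader cannot describe."""
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.violations

page = '<img src="logo.png" alt="Company logo"><img src="chart.png">'
print(check_alt_text(page))  # ['chart.png']
```

A real audit would combine dozens of such rules with manual screen-reader testing; this sketch only shows how mechanical the first pass can be.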

W7 The Role of Testing: Quality Police or Quality Communicator?
Mike Duskis, 10-4 Systems
Wednesday, October 15, 2014 - 1:45pm - 2:45pm

An underwear advertisement in 1985 featured the dedicated and thorough Inspector 12 saying, “They don't say Hanes until I say they say Hanes.” Historically, software testers have been called on to perform a similar role―preventing defective products from reaching customers. However, software development is not underwear manufacturing. The specifications are less clear and the acceptance criteria more complex. Why then do organizations continue to place acceptance decisions in the hands of testers? Because they lack the information required to make a sound decision. Mike Duskis presents one way out of this mess―empower the organization to make acceptance decisions with confidence. This requires a move away from producing binary pass/fail test results toward gathering, organizing, and providing the information which the business needs to assess product risk and quality. Learn strategies and techniques you can use to stop playing the inspector role and begin to position yourself as a provider of critical information.

W8 Virtualization: Improve Speed and Increase Quality
Clint Sprauve
Wednesday, October 15, 2014 - 1:45pm - 2:45pm

Many development and test organizations must work within the confines of compressed release cycles, various agile methodologies, and cloud and mobile environments for their business applications. So, how can test organizations keep up with the pace of development and increase the quality of their applications under test? Clint Sprauve describes how service virtualization and network virtualization can help your team improve speed and increase quality. Learn how to use service virtualization to simulate third-party or internal web services to remove wait times and reduce the need for high cost infrastructures required for testing. Take back techniques for incorporating network virtualization into the testing environment to simulate real-world network conditions. Learn from Clint how the combination of service and network virtualization allows teams to implement a robust and consistent continuous testing strategy to reduce defects in production applications.

W9 Functional Testing with Domain-Specific Languages
Tariq King, Ultimate Software
Wednesday, October 15, 2014 - 1:45pm - 2:45pm

Developing high-quality software requires effective communication among various project stakeholders. Business analysts must elicit customer needs and capture them as requirements, which developers then transform into working software. Software test engineers collaborate with business analysts, domain experts, developers, and other testers to validate whether the software meets the customer’s expectations. Misunderstandings between different stakeholders can introduce defects into software, reducing its overall quality and threatening the project’s success. Domain-specific languages (DSLs) are special purpose languages created to describe tasks in a particular field. DSLs provide stakeholders with a common vocabulary for describing application elements and behaviors. Tariq King describes how DSLs can be leveraged during functional testing to help identify potential issues early and reduce misunderstanding. Tariq demonstrates how a well-designed, testing DSL allows non-technical stakeholders to read and write automated tests, better engaging them in software testing activities. Learn how DSL-based testing tools can be used to improve test case management, regression testing, and test maintenance.
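As a rough illustration of the idea (not Tariq's actual DSL), a few lines of Python can give stakeholders an intent-level vocabulary for tests. The Account domain and the Verify fluent wrapper here are hypothetical:

```python
class Account:
    """Toy system under test; a stand-in for your real domain objects."""
    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

class Verify:
    """Tiny fluent assertion DSL so checks read like domain language."""
    def __init__(self, subject):
        self.subject = subject

    def has_balance(self, expected):
        assert self.subject.balance == expected, (
            f"expected balance {expected}, got {self.subject.balance}")
        return self  # returning self keeps multi-step checks chainable

# Reads close to the vocabulary a business analyst would use:
account = Account(balance=100)
account.deposit(50)
Verify(account).has_balance(150)
print("all DSL checks passed")
```

The payoff is that a non-programmer can read, and often write, the last three lines, while failures still surface as ordinary assertion errors.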

W10 Agile Development and Testing in a Regulated Environment
John Pasko, Karl Storz Imaging
Wednesday, October 15, 2014 - 1:45pm - 2:45pm

One of the Agile Principles states that working software is the primary measurement of success―generally measured by the level of customer satisfaction. So, how do you measure “customer satisfaction” when it is based on successful surgical outcomes? Join John Pasko as he takes you through a case study of the design, development, testing, and release of a complex system—integrating embedded software with hardware—for a surgical product which met stringent FDA standards and regulations. Cross-functional teams composed of Product Owners, software engineers, and QA engineers used agile, TDD, continuous integration, and automated and manual acceptance testing to create iterative in-house releases to our Product Owner—the internal customer. When all requirements and standards were satisfied, we released the completed product for use in medical facilities. Learn how we satisfied regulatory requirements and provided an audit trail by using a formal tool to map requirements, handle change requests, and monitor defects and their corresponding fixes.

W11 Adventures of a Social Tester
Martin Nilsson, House of Test
Wednesday, October 15, 2014 - 1:45pm - 2:45pm

If we know that good co-worker relationships can positively impact our success, why don’t we take a systematic approach to relationship building? Martin Nilsson shares how building personal relationships has helped develop his personal competency. Even though Martin’s technical skills are high, his greatest successes as a tester have come from his ability to build relationships. He shares how a focused effort at building rapport resulted in greater cooperation. When he was mistaken for a test lead during a project, Martin learned that having coffee with someone can trump an email. Today, to understand the systems with which he works, he uses tools that map team members and their interactions. Martin shares how he applied those tools as a project test manager to understand the situation of a group of test leads and managers and remedy the problems that were keeping them from working together effectively.

W12 Test Improvement in Our Rapidly Changing World
Martin Pol, Polteq Testing Services BV
Wednesday, October 15, 2014 - 1:45pm - 2:45pm

In organizations adopting the newest development approaches, classical test process improvement models no longer fit. A more flexible approach is required today. Solutions like SOA, virtualization, web technology, cloud computing, mobile, and the application of social media have changed the IT landscape. In addition, we are innovating the way we develop, test, and manage. Many organizations are moving toward a combination of agile/Scrum, context-driven testing, continuous integration and delivery, DevOps, and TestOps. Effective test automation has become a prerequisite for success. And all of these require a different way of improving testing, an adaptable way that responds to innovations in both technology and development. Martin Pol shares a roadmap that enables you to translate the triggers and objectives for test improvement into actions that can be implemented immediately. Learn how to achieve continuous test improvement in any situation, and take away a practical set of guidelines to enable a quick start.

W13 The Test Manager’s Role in Agile: Balancing the Old and the New
Mary Thorn, ChannelAdvisor
Wednesday, October 15, 2014 - 3:00pm - 4:00pm

What do test managers do? In traditional organizations, they assign people to projects, oversee the testers’ progress, provide feedback, and perhaps offer coaching to people who want it. Test managers are the go-to people when you don't know how to do something—not because they know, but because they know who does know. How does that change with a transition to agile? Do we still need test managers? As one who has successfully made the transition from traditional to agile test manager, Mary Thorn shares keys to the transition. Explore why establishing a mission, vision, and strategy for your agile test team is vital. Learn why cross-organizational transparency, communication, and bridge-building become your prime responsibilities. Review models for building great agile test teams—teams who successfully balance old and new techniques to provide customer value. In the end, Mary inspires you to reach a higher level of test leadership.

W14 Testing the New Disney World Website
Les Honniball, Walt Disney Parks and Resorts Technology
Wednesday, October 15, 2014 - 3:00pm - 4:00pm

At Walt Disney Parks and Resorts Technology, we provide the applications and infrastructure our online guests use to plan, book, explore, and enjoy their stay at our parks and resorts. With millions of page views per day and a multi-billion dollar ecommerce booking engine, we face a unique set of challenges. Join Les Honniball for insights into how his team works with Product Owners and development teams to design tests, both manual and automated, to meet these challenges. Les explains the testing processes that support a global set of brands on one web platform, including successful QA strategies, analytics, and user experience design―all while working within an agile development process. Discover how Les and his team of QA engineers work with various development teams in Orlando, FL; Glendale, CA; and Argentina to support many areas of the Walt Disney Parks and Resorts Technology business.

W15 End-to-End Test Automation with Open Source Technologies
Ramandeep Singh, QA InfoTech
Wednesday, October 15, 2014 - 3:00pm - 4:00pm

As organizations continue to adopt agile methodologies, testers are getting involved earlier in product testing. They need tools that empower them to manage varied test automation needs for web services, web APIs, and web and mobile applications. Open source solutions are available in abundance. However, most of these solutions are independent and not integrated, significantly increasing the tester’s work around test automation development. Ongoing test automation suite evolution and building a robust regression test suite have become cumbersome. Join Ramandeep Singh as he shares the idea of a comprehensive end-to-end automation framework to minimize the efforts spent in using existing test automation solutions across all aspects of the application. Take back techniques to create effective automated tests that are robust and reusable across multiple forms of the same application. Learn to help functional testers effectively use test automation, simplified through a comprehensive framework, to efficiently build automated test cases.

W16 Your Team’s Not Agile If You’re Not Doing Agile Testing
Jeanne Schmidt, Rural Sourcing, Inc.
Wednesday, October 15, 2014 - 3:00pm - 4:00pm

Many organizations adopt agile software development processes, yet they do not adopt agile testing processes. Then they fall into the trap of having development sprints that are just a set of mini-waterfall cycles. Some software developers still feel they can work more quickly if they let QA test after code is completed. Jeanne Schmidt identifies simple ways to get your team to adopt agile testing methods. Embracing agile testing requires you to change processes, responsibilities, and team organization. Jeanne details specifically how agile testers can add value by participating both at the beginning of each iteration and at the end of each sprint. She describes different ways you can pair your team members and different techniques for teaching developers the value of testing. Finally, Jeanne offers solutions for managing resistance to change and leading all team members to take responsibility for the product quality.

W17 Speak Like a Test Manager
Mike Sowers, Software Quality Engineering
Wednesday, October 15, 2014 - 3:00pm - 4:00pm

Ever feel like your manager, development manager, product manager, product owner, or (you fill in the blank) is not listening to you or your team? Are you struggling to make an impact with your messages? Are you “pushing a wet rope uphill” in championing product quality? Are you talking, but no one is listening? Mike Sowers shares practical examples of how to more effectively speak like a test manager and offers concrete advice based on his experiences in the technology, financial, transportation, and professional services sectors. Mike discusses communication and relationship styles that work—and some that have failed—and shares key principles (e.g., seeking to understand), approaches (e.g., using facts), and attributes (e.g., being proactive) to help you grow and prosper as a test manager. Leave with practical ideas to boost your communications skills and influence to become a trusted advisor to your team and your management.

W18 Implementing Outsourced Testing Services with a Third Party
Shelley Rueger, Moxie Software
Wednesday, October 15, 2014 - 3:00pm - 4:00pm

Outsourced test services are all the rage today. But are they really faster, better, and cheaper? Shelley Rueger shares how you can improve the efficiency and effectiveness of your test process using a third-party test service. She provides guidance on how to determine if your product is a good candidate for testing services, how to select the right vendor, and how to avoid common pitfalls. Shelley discusses her team's experience as they made the transition from in-house testing to using external testing services. She addresses questions including: When should you outsource testing? When should you not? What questions should you ask a test services company before engaging them? What issues should you keep an eye out for during the transition? Leave with an actionable plan for implementing successful third-party testing within your organization—or come away with the knowledge that it will not be right for you.

T1 “Rainmaking” for Test Managers
Julie Gardiner
Thursday, October 16, 2014 - 9:45am - 10:45am

The dictionary defines a rainmaker as “an executive (or lawyer) in the unsentimental world of business with an exceptional ability to attract clients, use political connections, increase profits, etc.” Simply put, a rainmaker is someone who gets things done. Is this relevant to testing? Absolutely! It is too easy to get stuck in the status quo and to avoid trying something new because everything works well as it is. But what we do can always be made better. Julie Gardiner focuses on two key areas―becoming a Trusted Advisor, and adapting rainmaking principles to the testing role. Join Julie as she discusses rainmaking topics: the power of relationships (what is your market?), credibility (what is it, and how do we get it?), and the Platinum Rule (why should we always follow it?). If you are looking for ways to enhance your ability to make testing—and your company—work even better, then this session is for you!

T2 Release the Monkeys: Testing Using the Netflix Simian Army
Gareth Bowles, Netflix
Thursday, October 16, 2014 - 9:45am - 10:45am

The cloud is all about redundancy and fault tolerance. Since no single component can guarantee 100 percent uptime, we have to design architectures where individual components can fail without affecting the availability of the entire system. But just designing a fault-tolerant architecture is not enough. We have to constantly test our ability to actually survive these “once in a blue moon” failures. And the best way is to test in an environment that matches production as closely as possible or, ideally, actually in production. This is the philosophy behind Netflix's Simian Army, a group of tools that randomly induces failures into individual components to make sure that the overall system can survive. Gareth Bowles introduces the main members of the Simian Army―Chaos Monkey, Latency Monkey, and Conformity Monkey. Gareth provides practical examples of how to use them in your test process—and, if you're brave enough, in production.
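A minimal sketch of the underlying idea (random failure injection plus a fallback path that keeps the system alive) might look like the following. The decorator and function names are invented for illustration and are not the Simian Army's actual API:

```python
import random

def chaos(failure_rate, rng):
    """Decorator that randomly injects a failure, in the spirit of
    Chaos Monkey. (Illustrative only; not the Simian Army's real API.)"""
    def wrap(fn):
        def inner(*args, **kwargs):
            if rng.random() < failure_rate:
                raise ConnectionError("chaos: injected failure")
            return fn(*args, **kwargs)
        return inner
    return wrap

@chaos(failure_rate=0.5, rng=random.Random(42))  # seeded for a repeatable demo
def fetch_recommendations():
    return ["title-1", "title-2"]

def resilient_fetch(retries=5):
    """The system under test should degrade gracefully, not die."""
    for _ in range(retries):
        try:
            return fetch_recommendations()
        except ConnectionError:
            continue
    return ["popular-fallback-title"]  # canned fallback instead of an outage

print(resilient_fetch())
```

The test passes whether or not the chaos fires, which is the point: the system's observable behavior should survive the injected failure.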

T3 A Path through the Jungle: Validating a Test Automation System for the FDA
Chris Crapo, Boston Scientific Neuromodulation
David Nelson, Boston Scientific Neuromodulation
Thursday, October 16, 2014 - 9:45am - 10:45am

Test automation is difficult to get right. Working under FDA regulation presents its own challenges. Combining the two is a scary proposition because the FDA requires—and will scrutinize—the validation of any test automation used. Despite this, working in a regulated environment only magnifies the value of test automation. Aware that automation is a driver of quality and consistency, the FDA welcomes automated tests as part of an audit submission. The key to success is demonstrating quality in a way that the FDA recognizes. Chris Crapo and David Nelson lay out the road map to validation of a test automation system and highlight the critical thinking, planning, and types of maintenance that form the core of any successful validation strategy. By understanding the focal points of validation, you can set your project up for regulatory success while maintaining a lean, focused execution that drives results, not paperwork.

T4 Top Ten Attacks to Break Mobile Apps
Jon Hagar, Grand Software Testing
Thursday, October 16, 2014 - 9:45am - 10:45am

To aid development in the mobile and smartphone app world, testers must do more than simply test against requirements; they should include attack-based testing to find common errors. In the tradition of James Whittaker’s How to Break Software books, Jon Hagar applies the testing “attack” concept to mobile app software, defines the domain of mobile app software, and examines common industry patterns of product failures. Jon then shares a set of ten software test attacks, based on the most common modes of failure in native, web-based, and hybrid apps. Developers and testers can use these attacks against their own software to find errors more efficiently. Jon describes why each attack works with its pros and cons. He provides information on how attacks can be used to cover many different quality attributes beyond testing only functionality.

T5 Using DevOps to Improve Software Quality in the Cloud
Jeff Payne, Coveros, Inc.
Thursday, October 16, 2014 - 9:45am - 10:45am

DevOps is gaining popularity as a way to quickly and successfully deploy new software. With all the emphasis on deployment, software quality can sometimes be overlooked. In order to understand how DevOps and software testing mesh, Jeff Payne demonstrates a fully implemented continuous integration/continuous delivery (CI/CD) stack. After describing the internals of how CI/CD works, Jeff identifies the touch points in the stack that are important for testing organizations. With the now accelerated ability to deliver software, the testing groups need to know how this technology works and what to do with it because swarms of manual testers will not be able to keep up. Jeff demonstrates where and how to use automated testing, how to collect and make sense of the massive amount of test results that can be generated from CI/CD, and how to usefully apply manual testing.

T6 Testers, Use Metrics Wisely or Don’t Use Them at All
Deborah Kennedy, Aditi Technologies
Thursday, October 16, 2014 - 9:45am - 10:45am

For thousands of years, human language has provided us with beautiful and complex ways of sharing important ideas. At the same time, language can derail attempts to communicate even the most basic pieces of critical information. We testers are the heralds of vast amounts of data, and it is our responsibility to use that data wisely—or not at all. Whether you are the information presenter whose voice is not being heard or the information receiver who needs ways to spot errors in the message, a review of how metrics can be skewed—through ignorance, bias, or malice—provides us with the ability to think beyond content to the ethics of presentation. Using scientific research, case studies, and an interactive “try it yourself” experience, Deborah Kennedy explores both sides of metrics—the good and the bad. Take away key insights to present your message without built-in barriers and arm yourself against disreputable attempts to sway you with unwisely presented data.

T7 Leading Internationally Distributed Test Teams
Dennis Pikora, Symantec
Thursday, October 16, 2014 - 11:15am - 12:15pm

Are you employing your offshore test team to its best advantage—gaining the cost savings and test coverage you expected? Unless correct management methodologies are in place, you will lose rather than gain both time and money with internationally distributed testers. If you are thinking you can go offshore with minimal effort, think again. Distributed test leadership and management issues apply when working with third-party firms, a subsidiary, or even your own employees. Don’t let unrealistic expectations impact your career or your company’s goals. Learn methodologies such as site mirroring, managing Scrum of Scrums meetings, and the value of physical presence. Become aware of labor laws and cultural differences. Ensure the best selection of employees at offshore sites. If you want to successfully manage your distributed international teams and avoid the pitfalls that plague many firms, join Dennis Pikora as he discusses the methodologies that enabled the efficiency of his worldwide teams.

T8 Career and Organizational Development Within a Software Testing Environment
Nate Shapiro, Blizzard Entertainment
Thursday, October 16, 2014 - 11:15am - 12:15pm

Being a software tester has its own unique set of challenges. To help testers overcome these challenges, it is vital to set up a system where employees have access to a range of development opportunities, including on-the-job mentorship, coaching, classroom training, and a defined career path. Nate Shapiro outlines how the quality assurance department at Blizzard Entertainment is investing in its employees by implementing a program to help create and sustain long-term careers in software testing. He describes some specific successes and challenges Blizzard faced as it worked to create a career path, including gamification of reward programs and offering certifications.

T9 Automation Abstractions: Page Objects and Beyond
Alan Richardson, Compendium Developments
Thursday, October 16, 2014 - 11:15am - 12:15pm

When you start writing automation for your projects, you quickly realize that you need to organize and design the code. You will write far more than “test” code; you also will write abstraction code because you want to make tests easier to read and maintain. But how do you design all this code? How do you organize and structure it? Should you use a domain-specific language? Should you go keyword driven or use Gherkin? Should you use page objects with POJO or Factories? Do you create DOM level abstractions? Where do domain models fit in? Alan Richardson provides an overview of options available to you when modeling abstraction layers. Based on his experience with many approaches on real-world commercial projects, Alan helps you understand how to think about the modeling of abstraction layers. Illustrated with a number of code examples, Alan shows you a variety of approaches and discusses the pros and cons associated with each.
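As one concrete point of reference, a minimal page object might look like the sketch below. The FakeDriver stands in for a real WebDriver so the example runs anywhere, and all names are illustrative rather than taken from Alan's examples:

```python
class FakeDriver:
    """Stand-in for a real WebDriver so this sketch runs without a browser."""
    def __init__(self):
        self.fields = {}

    def type(self, element_id, text):
        self.fields[element_id] = text

    def find(self, element_id):
        return self.fields.get(element_id, "")

class LoginPage:
    """Page object: tests call intent-level methods and never touch locators.
    (A real implementation would click a submit element and hand back the
    next page object; the credential check below just simulates the app.)"""
    USERNAME, PASSWORD, MESSAGE = "user-field", "pass-field", "msg-area"

    def __init__(self, driver):
        self.driver = driver

    def login(self, username, password):
        self.driver.type(self.USERNAME, username)
        self.driver.type(self.PASSWORD, password)
        ok = (username, password) == ("admin", "secret")
        self.driver.type(self.MESSAGE,
                         "Welcome, admin" if ok else "Invalid credentials")
        return self

    def message(self):
        return self.driver.find(self.MESSAGE)

print(LoginPage(FakeDriver()).login("admin", "secret").message())
```

The value of the abstraction shows up in maintenance: when a locator changes, only the page object changes, not every test that logs in.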

T10 Bridging the Gap in Mobile App Quality
Costa Avradopoulos, Capgemini Consulting
Thursday, October 16, 2014 - 11:15am - 12:15pm

Today, an alarming 65 percent of mobile apps—more than 1.3 million—have a 1-star rating or less. Why? The majority of development organizations have neither the right processes nor access to the devices required to properly test mobile applications. If not addressed, these deficiencies will have a major impact on the quality of the apps the organization develops. In addition, users are intolerant of problems and quick to switch to competing apps. Costa Avradopoulos explores how to address the unique challenges of mobile testing, starting with adopting the right test strategy. Costa describes the top challenge test leaders face today―how to design a proper test lab, given thousands of unique mobile devices. Costa shares insight into choosing the right devices to optimize test coverage and reduce risks. He also shows you how to leverage existing tools and evaluate automation options to keep your team current with the faster pace of mobility.

More Information
Learn more about Costa Avradopoulos.
T11 Checking Performance along Your Build Pipeline
Andreas Grabner, Compuware
Thursday, October 16, 2014 - 11:15am - 12:15pm

Do you consider the performance impact when adding a new JavaScript file, a single AJAX call, or a new database query to your app? Negligible, you say? I disagree―and so should you. Andreas Grabner demonstrates the severe impact small changes can have on performance and scalability. Many small changes will have an even bigger impact, so it is important to catch them early. If you are working with a delivery pipeline, make sure to look into performance, scalability, and architectural metrics such as the number of resources on your page, size of resources, number of requests hitting your web servers, database statements executed, and log messages created. Monitoring these allows you to add a new quality gate to your delivery pipeline and prevents major problems. Andi shares a handful of metrics to teach to your developers, testers, and operations folks, and explains why they are important to performance.
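A quality gate of the kind described can be sketched as a simple comparison of a build's measured metrics against agreed limits. The metric names and limits below are illustrative, not from the session.

```python
# Illustrative limits for the architectural metrics named in the abstract.
LIMITS = {
    "page_resources": 80,      # max resources on the page
    "total_bytes": 2_000_000,  # max page weight in bytes
    "server_requests": 25,     # max requests hitting the web servers
    "db_statements": 50,       # max database statements per transaction
}


def gate(measured, limits=LIMITS):
    """Return the list of metrics that exceed their limit."""
    return [name for name, limit in limits.items()
            if measured.get(name, 0) > limit]


build = {"page_resources": 95, "total_bytes": 1_500_000,
         "server_requests": 12, "db_statements": 61}
violations = gate(build)
# Two metrics regressed, so this build should not be promoted.
```

Run as a pipeline step, a non-empty `violations` list fails the build, which is how a small change (one extra query per request) gets caught before it compounds.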

More Information
Learn more about Andreas Grabner.
T12 Metrics That Matter
Pablo Garcia, Redmind
Thursday, October 16, 2014 - 11:15am - 12:15pm

Imagine you’re a test manager starting a new assignment. On the first day of work, you’re presented with a list of metrics you are to report. Soon, you realize that most of the metrics are not really connected to what should be measured. Or, consider the situation where you’re told that there is no value collecting metrics because “we’re agile.” In either situation, what would your next step be? Join Pablo Garcia as he shares his experience with the dangers of poor metrics. Believing that some metrics can have value in helping testing be effective and efficient, Pablo shares his favorite metrics, including a couple of crazy ones―requirements coverage, defect detection percentage, faults in production, and cost per bug. Each is discussed, evaluating what it really measures, when to use it, and how to present it to send the correct message. Take back a toolbox of testing metrics that will make your testing role easier.
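One of the metrics named, defect detection percentage (DDP), is conventionally the ratio of defects found by testing to all defects eventually found, including those that escaped to production. A quick sketch, with made-up numbers:

```python
def ddp(found_in_test, found_in_production):
    """Percentage of total known defects caught before release."""
    total = found_in_test + found_in_production
    return 100.0 * found_in_test / total if total else 0.0


# 90 defects caught in test, 10 escaped to production -> DDP of 90%.
assert ddp(90, 10) == 90.0
```

Note what it really measures: production defects take time to surface, so DDP for a release can only be computed retrospectively, which matters when deciding how to present it.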

More Information
Learn more about Pablo Garcia.
T13 The Unfortunate Triumph of Process over Purpose
James Christie, Claro Testing
Thursday, October 16, 2014 - 1:30pm - 2:30pm

As a test manager, James Christie experienced two divergent views of a single project. The official version claimed that planning and documentation were excellent, with problems discovered during test execution being managed effectively. In fact, the project had no useful plans, so testers improvised test execution. Creating standardized documentation took priority over preparing for the specific problems testers would actually face during testing. The required documentation standards didn't assist testing; they actually hindered it, distracting testers from relevant, detailed preparation. It was a triumph of process over purpose. James shows that this is a problem that testing shares with other complex disciplines. Devotion to processes and standards inhibits creativity and innovation. They provide a comfort blanket and a smokescreen of “professionalism” where following the ritual becomes more important than accomplishing the goals. Unless we address this issue, organizations will question whether testers really add value. Testers must respond by challenging unhelpful processes and the culture that encourages them. Purpose must come before process!

More Information
Learn more about James Christie.
T14 Speed Up Testing with Monitoring Tools
Jim Hirschauer, AppDynamics
Thursday, October 16, 2014 - 1:30pm - 2:30pm

The software development lifecycle is a pretty complex process in many organizations. However, by using monitoring tools and methodologies, you can accelerate testing and release higher quality code―the cornerstone of rapid software delivery. These tools provide immediate feedback with actionable information so you can address problems as they are detected instead of waiting until the end of a testing cycle. Earlier detection, combined with tests that are a better representation of production workloads, are key to releasing better code, faster. Jim Hirschauer shows how to use monitoring software to make a major impact during development, test, and production. He describes typical use cases for server monitoring, log monitoring, and application performance monitoring. Learn about open source testing tools including Siege, Multi-Mechanize, and Bees with Machine Guns. Understand how to use each of these tools and more in development, test, and production as well as creating a feedback loop that drives continuous improvement.
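The "immediate feedback" idea can be sketched as a watcher that flags a problem the moment a rolling average of response times drifts past a baseline, rather than waiting for the end of the test cycle. The window size and tolerance below are illustrative choices, not from the session.

```python
from collections import deque


def watch(samples, baseline_ms, window=5, tolerance=1.5):
    """Return the sample index at which the rolling mean first exceeds
    tolerance x baseline, or None if the run stays healthy."""
    recent = deque(maxlen=window)
    for i, ms in enumerate(samples):
        recent.append(ms)
        if len(recent) == window and sum(recent) / window > tolerance * baseline_ms:
            return i  # actionable: alert now, mid-run
    return None


# Latency degrades partway through the run; the watcher fires early.
samples = [100, 110, 105, 100, 300, 320, 310, 330, 315]
alert_at = watch(samples, baseline_ms=100)
```

A real APM tool does this continuously against production baselines; the point is the feedback loop, where a regression is surfaced while the test that caused it is still running.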

More Information
Learn more about Jim Hirschauer.
T15 Making Your Test Automation Transparent
Subodh Parulekar, AFour Technologies, Inc.
Thursday, October 16, 2014 - 1:30pm - 2:30pm

Business analysts, developers, and testers are sometimes not on the same page when it comes to test automation. When there is no transparency in test cases, execution, coverage, and data, review of automation by all stakeholders is difficult. Making automation scripts easily readable and writable allows stakeholders to better participate. Subodh Parulekar describes how his team dealt with these issues. Learn how they leveraged behavior-driven development (BDD) concepts and put a wrapper around their existing automation framework to make it more user-friendly with the easy-to-understand Given-When-Then format. Subodh discusses how his team implemented the new approach in four months to automate 700+ test cases. Now, test reports contain the actual Gherkin test step that passed or failed, allowing any stakeholder to evaluate the outcome. Learn how stakeholders can rerun a failed test case from the reporting dashboard to determine if the failure is related to a synchronization, environmental, functional, or test data problem.
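A toy version of such a Given-When-Then wrapper: step functions are registered against readable phrases, a plain-text scenario drives the underlying automation, and the report names the exact step that passed or failed. The step phrases and banking example are invented for illustration.

```python
import re

STEPS = {}


def step(pattern):
    """Register a step implementation against a readable phrase."""
    def register(fn):
        STEPS[pattern] = fn
        return fn
    return register


@step(r"a logged-in user")
def given_user(ctx):
    ctx["user"] = "alice"


@step(r"they deposit (\d+)")
def when_deposit(ctx, amount):
    ctx["balance"] = ctx.get("balance", 0) + int(amount)


@step(r"the balance is (\d+)")
def then_balance(ctx, expected):
    assert ctx["balance"] == int(expected)


def run(scenario):
    """Execute each Given/When/Then/And line, recording per-step outcomes."""
    ctx, results = {}, []
    for line in scenario:
        text = re.sub(r"^(Given|When|Then|And)\s+", "", line)
        for pattern, fn in STEPS.items():
            m = re.fullmatch(pattern, text)
            if m:
                try:
                    fn(ctx, *m.groups())
                    results.append((line, "passed"))
                except AssertionError:
                    results.append((line, "failed"))
                break
    return results


report = run(["Given a logged-in user",
              "When they deposit 50",
              "And they deposit 25",
              "Then the balance is 75"])
```

Because the report keeps the original Gherkin line next to each outcome, a business analyst can read a failure without opening the automation code, which is the transparency the abstract is after.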

More Information
Learn more about Subodh Parulekar.
T16 Ensuring the Performance of Mobile Apps—on Every Device and Network
Thursday, October 16, 2014 - 1:30pm - 2:30pm

Applications today are accessed over myriad network configurations—wired, wireless, and mobile networks. Deployed applications may deliver different content and functionality depending on whether the user is accessing it via a browser, smartphone, or tablet. Steve Weisfeldt explains how these combinations significantly impact the performance of applications, creating a previously unseen set of testing challenges. A crucial part of the load testing process is being able to emulate network constraints, change connection speeds, and control parameters such as packet loss and network latency to test in the most realistic scenarios. Learn the ramifications of these new technologies and constraints on testing for mobile application performance. Join Steve in discussing approaches and considerations to ensure that high application performance is delivered to all end-users—all the time—regardless of device or network.
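A back-of-envelope model of the network constraints described: estimate a response time for the same payload under different bandwidth, latency, and packet-loss profiles. The formula and profile numbers are illustrative simplifications, not a real network emulator.

```python
def est_response_ms(size_kb, bandwidth_kbps, latency_ms, loss_rate=0.0):
    """Transfer time plus round-trip latency, crudely inflated by
    retransmission overhead from packet loss."""
    transfer_ms = size_kb * 8 / bandwidth_kbps * 1000
    retransmit_factor = 1 / (1 - loss_rate)
    return (transfer_ms + 2 * latency_ms) * retransmit_factor


profiles = {
    "wired": est_response_ms(500, 50_000, 5),
    "3g":    est_response_ms(500, 1_000, 150, loss_rate=0.02),
}
# The same 500 KB response behaves very differently per network profile.
```

Even this crude model shows why a load test run only over the office LAN says little about mobile users, and why emulating connection speed, latency, and loss belongs in the test design.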

More Information
Learn more about Steve Weisfeldt.
T17 Build Your Custom Performance Testing Framework
Prashant Suri, Rackspace
Thursday, October 16, 2014 - 1:30pm - 2:30pm

Performance testing requires knowledge of systems architecture, techniques to simulate the load equivalent of sometimes millions of transactions per day, and tools to monitor/report runtime statistics. With the evolution from desktop to web and now the cloud, performance testing involves an unparalleled combination of different workloads and technologies. There is no one tool available—either commercial or open source—that meets all performance testing needs. Some tools act as load generators; others only monitor system resources; and many only operate for specific applications or environments. Prashant Suri shares the essential components you need for a comprehensive performance test framework and explores why each component is required for a holistic test. Learn how to develop your custom framework―starting with parsing test scripts in a predefined format, iterating over test data, employing distributed load generators, and integrating test monitors into the framework. Discover how building your own framework gives you flexibility to tackle multiple performance problems—and save thousands of dollars along the way.
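The framework components listed can be compressed into a sketch: parse a test script in a predefined format, iterate over test data, fan work out to concurrent "load generators," and collect runtime statistics for the monitor. The script format and the timed transaction are invented for illustration; real generators would be distributed across machines rather than threads.

```python
import concurrent.futures
import statistics
import time

# Hypothetical predefined script format.
SCRIPT = "name=checkout; users=4; iterations=3"


def parse(script):
    """Parse the key=value; script format into a config dict."""
    return dict(pair.strip().split("=") for pair in script.split(";"))


def transaction(row):
    time.sleep(0.001)  # stand-in for one timed request
    return row * 2


def generator(rows):
    """One load generator: run the transaction per data row, timing each."""
    timings = []
    for row in rows:
        start = time.perf_counter()
        transaction(row)
        timings.append(time.perf_counter() - start)
    return timings


cfg = parse(SCRIPT)
data = list(range(int(cfg["iterations"])))
users = int(cfg["users"])
with concurrent.futures.ThreadPoolExecutor(users) as pool:
    all_timings = [t for timings in pool.map(generator, [data] * users)
                   for t in timings]

# Runtime statistics for the monitor/report component.
summary = {"transactions": len(all_timings),
           "p50_ms": statistics.median(all_timings) * 1000}
```

Each piece maps to a component the abstract names: `parse` is the script parser, `data` the test-data iterator, the executor the load-generator pool, and `summary` the monitoring/reporting hook, and each can be swapped independently, which is the flexibility argument for building your own.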

More Information
Learn more about Prashant Suri.
T18 Testing Application Security: The Hacker Psyche Exposed
Mike Benkovich, Imagine Technologies, Inc.
Thursday, October 16, 2014 - 1:30pm - 2:30pm

Computer hacking isn’t a new thing, but the threat is real and growing even today. The attacker always holds the advantage; the defender faces the dilemma. How do you keep your secrets safe and your data protected? In today’s ever-changing technology landscape, the fundamentals of producing secure code and systems are more important than ever. Exploring the psyche of hackers, Mike Benkovich exposes how they think, reveals common areas where they find weakness, and identifies novel ways to test your defenses against their threats. From injection attacks and cross-site scripting to security misconfiguration and broken session management, Mike examines the top exploits, shows you how they work, explores ways to test for them, and then shares what you can do to help your team build more secure software in the future. Join Mike and help your company avoid being at the center of the next media frenzy over lost or compromised data.
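To illustrate the injection class of exploit, here is a sketch of testing for SQL injection: feed a classic payload to two query builders and check whether the payload escapes into the statement's logic. The vulnerable and safe builders are simplified illustrations, not production code.

```python
def vulnerable_query(username):
    # String concatenation lets input rewrite the statement.
    return "SELECT * FROM users WHERE name = '" + username + "'"


def parameterized_query(username):
    # Placeholders keep data out of the statement text.
    return ("SELECT * FROM users WHERE name = ?", (username,))


PAYLOAD = "' OR '1'='1"


def looks_injectable(sql):
    """Crude check: the payload's OR clause escaped into the SQL itself."""
    return "OR '1'='1'" in sql


assert looks_injectable(vulnerable_query(PAYLOAD))
sql, params = parameterized_query(PAYLOAD)
assert not looks_injectable(sql) and params == (PAYLOAD,)
```

The same test shape, hostile input in, check whether it changed behavior rather than being treated as data, carries over to cross-site scripting and the other exploit classes the session covers.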

More Information
Learn more about Mike Benkovich.
T19 Before You Test Your System, Test Your Assumptions
Aaron Sanders, Agile Coach
Thursday, October 16, 2014 - 3:00pm - 4:00pm

Do you find yourself discussing with your peers what you think the system you’re building should do? Do you argue over what the users want? Do discussions wind up in a heated debate? This result indicates that no shared understanding exists about the system. With a lack of shared understanding, it’s easy to fall into the trap of making assumptions about system functionality, who the users will be, and how to build the system. These assumptions introduce errors into the requirements and design—long before a single line of code is written. Creating a shared understanding among stakeholders, users, and teams reduces the chances of not building the right thing—as well as not building the thing right. Aaron Sanders describes the techniques of experimental design, story mapping, user research, prototyping, and user acceptance testing that he’s used to help teams build a shared understanding. Learn to test your assumptions as rigorously as you test the system itself.

More Information
Learn more about Aaron Sanders.
T20 User Acceptance Testing in the Testing Center of Excellence
Deepika Mamnani, Capgemini
Thursday, October 16, 2014 - 3:00pm - 4:00pm

Centralization of testing services into a testing center of excellence (TCoE) for system testing is common in IT shops today. To make this transformation mature, the next logical step is to incorporate the user acceptance testing (UAT) function into the TCoE. This poses unique challenges for the TCoE and mandates the testing team develop a combination of business process knowledge coupled with technology and test process expertise. Deepika Mamnani shares her experiences in implementing a UAT TCoE and best practices—from inception to planning to execution. Learn techniques to create business-oriented testable requirements, strategies to size and structure the team, and the role of automation. Review testing metrics needed to measure the success of the UAT function. Hear a real-world transformation journey and the quantitative business benefits achieved by an organization incorporating UAT as a centralized function within the TCoE. Take back strategies to incorporate UAT as a part of your TCoE.

More Information
Learn more about Deepika Mamnani.
T21 The Doctor Is In: Diagnosing Test Automation Diseases
Seretta Gamba, Steria Mummert ISS GmbH
Thursday, October 16, 2014 - 3:00pm - 4:00pm

When doing test automation, you sometimes notice that things are not working out as expected, but it’s not clear why. You are so caught up in the day-to-day work that you don't see the bigger picture. It’s like when you get sick―you know something is wrong, but you don’t know what. That’s the time to visit a doctor. Doctors diagnose diseases mainly by asking questions. First, they get a general idea of what’s wrong; then the questions become more and more specific; and in the end, they identify the disease and prescribe the appropriate cure. This method also works well for test automation. By first asking general questions, and then more and more specific ones, you can identify the disease (the issue) and then it’s relatively simple to select the most appropriate remedy. Seretta Gamba demonstrates this method with examples of common automation diseases and suggests the appropriate patterns to cure them.

More Information
Learn more about Seretta Gamba.
T22 Five Ways to Improve Your Mobile Testing
Thursday, October 16, 2014 - 3:00pm - 4:00pm

Few technology shifts have impacted the way we do business as much as mobile. The new and exciting functionality delivered by mobile apps, the pace at which they are being developed, and their emergence as the “face of the business” requires that organizations deliver unprecedented quality in these software systems. Join Dennis Schultz to learn how leading enterprises are approaching their mobile application testing challenges and how they have integrated mobile into their existing processes. Dennis describes the importance of testing on real devices, the value of using emulators to supplement your testing strategy, how to optimize your testing with real devices using SaaS remote device services, how to automate your repetitive tests to speed time to market and improve quality, and how to support a collaborative work environment and efficient test process for mobile development.

More Information
Learn more about Dennis Schultz.
T23 Modeling System Performance with Production Data
Thursday, October 16, 2014 - 3:00pm - 4:00pm

When creating system performance models, the primary challenges are where and how to start. Whatever the performance characteristics being estimated or modeled, we need a solid approach that addresses both business and system needs. All too often performance tests inadvertently mix load and stress scenarios with little regard for how this will confound recommendations and business decisions. If you are a test manager, a business process owner, or you simply want to better understand performance testing, you will be interested in William Hurley’s case studies. Will presents two real-world examples that demonstrate the impact on business decisions, and show how to use production data and statistical modeling to improve both the analysis and business decisions. The first study is a cloud-based performance testing engagement. The second is a back-end re-hosting study where acceptance criteria were based on achieving “equal or faster” performance. Take away new insights and approaches to improve performance-based decisions for your organization.
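The load-versus-stress separation can be sketched with production data: take per-minute transaction counts and derive distinct targets from percentiles, so the everyday workload and the rare peak are tested as separate scenarios rather than mixed in one run. The data and percentile choices are illustrative, not from Will's case studies.

```python
import statistics

# A (made-up) day's sample of per-minute transaction counts, including one spike.
per_minute_tx = [120, 130, 125, 140, 135, 150, 145, 500, 138, 128]

qs = statistics.quantiles(per_minute_tx, n=100)
load_target = qs[49]    # ~median: the everyday workload to model
stress_target = qs[94]  # ~95th percentile: the rare peak to test separately
# Keeping these as distinct scenarios avoids confounding load and stress
# results in a single test run.
```

A load test calibrated to `load_target` answers "is normal service acceptable?"; a stress test at `stress_target` and beyond answers "where does it break?", and conflating the two is exactly the confounding the abstract warns about.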

More Information
Learn more about William Hurley.
T24 Testing API Security: A Wizard’s Guide
Ole Lensmar, SmartBear Software
Thursday, October 16, 2014 - 3:00pm - 4:00pm

As we've seen in recurring events in the past year, web services APIs are a primary target for security attacks—and the consequences can be catastrophic for both API providers and end users. Stolen passwords, leaked credit card numbers, and revealed private messages and photos are just some of the headaches awaiting those who have been compromised. Ole Lensmar puts on his hacker-cloak to show how attackers break systems via web service APIs with fuzzing, session spoofing, injection attacks, cross-site scripting, and other methods. Learn how these attacks actually work on an API and how we can test an API to make sure it isn't vulnerable—without compromising the API at the same time. Find out what roles various security-related standards play and how they affect testing. You can’t afford not to.
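The fuzzing approach mentioned can be sketched as throwing malformed and malicious payloads at an endpoint and checking that it fails safely, rejecting the input rather than crashing or echoing attacker data back. The handler here is a local stand-in for a real web service endpoint; its validation rules are invented for illustration.

```python
FUZZ_PAYLOADS = [
    "<script>alert(1)</script>",   # cross-site scripting probe
    "' OR '1'='1",                 # SQL injection probe
    "A" * 10_000,                  # oversized input
    "\x00\xff\xfe",                # binary junk
]


def handle_request(name):
    """Hypothetical endpoint: validates input, never echoes raw data."""
    if len(name) > 100 or not name.isalnum():
        return 400, "invalid input"
    return 200, f"hello {name}"


def fuzz(handler, payloads):
    """Return payloads that were NOT safely rejected."""
    escaped = []
    for p in payloads:
        try:
            status, body = handler(p)
        except Exception:
            escaped.append(p)      # crash on hostile input = finding
            continue
        if status == 200 and p in body:
            escaped.append(p)      # reflected raw input = finding
    return escaped


findings = fuzz(handle_request, FUZZ_PAYLOADS)
# An empty findings list means every hostile payload was rejected cleanly.
```

Because the check is "did the API misbehave?" rather than "did the exploit succeed?", this style of test probes for vulnerability without actually compromising the API under test.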

More Information
Learn more about Ole Lensmar.