Wednesday Concurrent Sessions
Wednesday, October 03, 2012 11:30 AM
W1
Test Management
SoLoMo Is Shaking Up the Testing Landscape: How to Adapt, Keep Up, and Thrive
Matt Johnston, uTest
The collision of social, local, and mobile media (a.k.a. SoLoMo) is impacting and disrupting software development and testing organizations worldwide. With so much sensitive and critical data flowing to and from SoLoMo technologies, there is immense pressure to ensure that apps are reliable, scalable, and secure across a multitude of environments—handsets, tablets, operating systems, browsers, carriers, languages, and locations. Using real-world success stories from Google, Microsoft, and others, Matt Johnston identifies how SoLoMo has transformed the software industry and reveals the secrets to overcoming the challenges that SoLoMo technologies present today. For example, companies that communicate with customers and partners via Facebook or Twitter must protect sensitive data. GPS apps present location-testing challenges. Many apps are being used outside the sterile confines of the testing lab, under “in-the-wild” conditions; thus, when apps and users are distributed around the globe, a portion of your testers should be, too.
Wednesday, October 03, 2012 11:30 AM
W2
Test Techniques
Testing a Business Intelligence/Data Warehouse Project
Karen N. Johnson, Software Test Management, Inc.
When an organization builds a data warehouse, critical business decisions are made on the basis of the data. But how do you know the data is accurate? What should you test, and how? Karen Johnson discusses how to test in the highly technical areas of data extraction, transformation, and loading. Stored procedures, triggers, and custom ETL (extract, transform, load) transactions often must be tested before the reports or dashboards from a business intelligence (BI) project can be tested. The volume of data is frequently so large that testing “all the data” is simply not possible, so choosing an appropriate test data set is often one of the most strategic decisions in BI testing. Karen shares stories about past BI projects and ideas on how to test data warehouse and business intelligence projects. Learn the techniques for ensuring quality data in your vital databases.
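To make the kind of check Karen describes concrete, here is a minimal sketch (not taken from her session) of one common ETL verification: reconciling a row count and a column total between a source table and its warehouse target. The table and column names are invented for illustration.

    import sqlite3

    def reconcile(conn, source_table, target_table, amount_col):
        """Compare row count and column total between source and target tables."""
        results = []
        for table in (source_table, target_table):
            row = conn.execute(
                f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}"
            ).fetchone()
            results.append(row)
        assert results[0] == results[1], f"source/target mismatch: {results}"
        return results

    if __name__ == "__main__":
        # Tiny in-memory example standing in for a real staging area and warehouse.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE staging_orders (id INTEGER, amount REAL)")
        conn.execute("CREATE TABLE dw_orders (id INTEGER, amount REAL)")
        rows = [(1, 10.0), (2, 25.5), (3, 7.25)]
        conn.executemany("INSERT INTO staging_orders VALUES (?, ?)", rows)
        conn.executemany("INSERT INTO dw_orders VALUES (?, ?)", rows)  # simulates the ETL load
        print(reconcile(conn, "staging_orders", "dw_orders", "amount"))
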
Wednesday, October 03, 2012 11:30 AM
W3
Test Automation
Building a 21st Century Test Automation Framework
Clark Malmgren, TVWorks
Customers in today’s Web 2.0 world expect rapid releases of feature-rich applications that just work. Keeping up with such a paradigm change requires test organizations to focus heavily on test automation. Also required is the ability to deal with a test environment that has multiple teams simultaneously working on different areas of a product. Of increasing importance are parallelism, pipelining, and using machine intelligence to sort through the noise and find real defects in products. Clark Malmgren shares how Comcast has increased throughput of their tests by running them in parallel across multiple set-top boxes and pipelining tests to minimize downtime during a test run. Learn how to maximize your testers’ time by harnessing the power of machine intelligence to identify which test failures are real and which are the result of other issues—and allow your team to focus its limited time on what really matters.
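As a rough illustration of the parallelism mentioned above (a sketch only, not Comcast's framework), the snippet below splits a test queue across several devices and runs the slices concurrently; the device names and the stub run_test are hypothetical.

    from concurrent.futures import ThreadPoolExecutor

    DEVICES = ["settop-01", "settop-02", "settop-03"]        # hypothetical device pool
    TESTS = [f"test_case_{i:03d}" for i in range(12)]

    def run_test(device, test):
        # A real harness would drive the device under test here.
        return {"device": device, "test": test, "passed": True}

    def run_on_device(device, tests):
        """Run one device's share of the queue sequentially on that device."""
        return [run_test(device, test) for test in tests]

    def run_suite(tests, devices):
        """Split the test queue across devices and run the slices in parallel."""
        slices = {d: tests[i::len(devices)] for i, d in enumerate(devices)}
        with ThreadPoolExecutor(max_workers=len(devices)) as pool:
            futures = [pool.submit(run_on_device, d, s) for d, s in slices.items()]
            return [result for f in futures for result in f.result()]

    if __name__ == "__main__":
        for result in run_suite(TESTS, DEVICES):
            print(result)
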
Wednesday, October 03, 2012 11:30 AM
W4
Agile Testing
Making the Most of Test Automation on an Agile Project
Alexander Andelkovic, Spotify
In today’s competitive marketplace, the ability to rapidly release new product features is vital. As we move from traditional release cycles of months and years to cycles of days and weeks, test automation approaches need rethinking. Alexander Andelkovic describes the challenges of implementing and integrating rapid test automation on an agile project. Traditional test automation tries to maintain an ever-growing regression test suite and struggles to implement automated tests of new functionality. Manual testers often lack the necessary skills to implement automated tests in a short-cycle development environment. Alexander describes a process to save time by having manual testers implement their own tests daily using a simple, model-based test automation framework that requires only basic modeling and scripting skills. Automated tests can be implemented earlier, providing valuable feedback to the project. Alexander shares a glimpse into the future of test automation with a description of a fully automated test solution.
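To give a feel for what a simple, model-based framework of this kind could look like, here is a toy sketch (invented for this writeup, not Spotify's framework): the application is described as states and transitions, and a short seeded random walk over the model yields a test sequence.

    import random

    # Hypothetical model of a small music app: state -> {action: next_state}
    MODEL = {
        "logged_out": {"log_in": "home"},
        "home":       {"open_playlist": "playlist", "log_out": "logged_out"},
        "playlist":   {"play_track": "playing", "back": "home"},
        "playing":    {"pause": "playlist"},
    }

    def generate_walk(model, start="logged_out", steps=8, seed=0):
        """Walk the model at random, returning (state, action, next_state) steps."""
        rng = random.Random(seed)
        state, walk = start, []
        for _ in range(steps):
            action, nxt = rng.choice(sorted(model[state].items()))
            walk.append((state, action, nxt))
            state = nxt
        return walk

    if __name__ == "__main__":
        for state, action, nxt in generate_walk(MODEL):
            print(f"{state} --{action}--> {nxt}")
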
Wednesday, October 03, 2012 11:30 AM
W5
Cloud Testing
Testing in the Cloud: Policy, Security, Privacy, and Culture
Steven Woodward, Cloud Perspectives
Many organizations are evaluating and migrating toward cloud computing solutions. In 2012, the challenges are less technological and more cultural and policy related. Steven Woodward shares the National Institute of Standards and Technology (NIST) Cloud Computing Reference Architecture that forms the foundation for many organizations’ cloud initiatives. He describes the key policy, security, privacy, and cultural considerations in the context of testing in cloud computing and what cloud standards development organizations are adopting. When considering cloud computing service models, old habits need to be reassessed and refined. Testers in the cloud need to be aware of the various options and specifically where they fit in the cloud ecosystem. Cloud testing skills remain critical; however, processes, procedures, and general habits will require changes, depending on the specific cloud solution adopted. Steven shares real-life practical scenarios to emphasize cultural perspectives for testers.
Wednesday, October 03, 2012 11:30 AM
W6
Special Topics
Tests and Requirements: You Can’t Have One without the Other
Ken Pugh, Net Objectives
The practice of software development, including agile, requires a clear understanding of business needs. Misunderstanding requirements causes waste, missed schedules, and mistrust within the organization. A disagreement about whether or not an incident is a defect can arise between testers and developers when the cause is really a disagreement about the requirement itself. Ken Pugh describes how you can use acceptance tests to decrease this misunderstanding of intent. A testable requirement provides a single source that serves as the analysis document, acceptance criteria, regression test suite, and progress tracker for any given feature. Ken explores the creation, evaluation, and use of testable requirements by the business and developers. Examine how to transform requirements into stories—small units of work—each of which has business value, small implementation effort, and easy-to-understand acceptance tests. Learn how testers and requirement elicitors can work together to create acceptance tests prior to implementation.
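As a small illustration of a testable requirement in executable form (the story and shipping rule below are invented, not from the session), the same few lines can serve as acceptance criteria and as a regression test.

    # Story (hypothetical): orders of $50 or more ship free; smaller orders pay $4.99.
    FREE_SHIPPING_THRESHOLD = 50.00

    def shipping_cost(order_total):
        return 0.00 if order_total >= FREE_SHIPPING_THRESHOLD else 4.99

    def test_order_at_threshold_ships_free():
        assert shipping_cost(50.00) == 0.00

    def test_order_below_threshold_pays_flat_rate():
        assert shipping_cost(49.99) == 4.99

    if __name__ == "__main__":
        test_order_at_threshold_ships_free()
        test_order_below_threshold_pays_flat_rate()
        print("acceptance tests passed")
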
Wednesday, October 03, 2012 1:45 PM
W7
Test Management
Moneyball and the Science of Building Great Testing Teams
Peter Varhol, Seapine Software
Moneyball is an analytical, metrics-based approach to assembling a competitive baseball team. It is based on breaking down accepted preconceptions and finding new ways to look at individual skills and how they mesh as a team. Sometimes the characteristics that we believe the team needs aren’t that important in improving quality. In fact, some accepted practices may have less impact on quality than we might have predicted. Peter Varhol examines how to use data about applications and quality to tell the right story about our state of quality and our success in shipping high quality applications. Looking at some of our preconceptions about testing and individual skills, Peter identifies characteristics for building and running a high-performance testing team. Learn about applying the Moneyball approach to testing and quality, giving your teams the best bang for their buck in evaluating their own capabilities and delivering the highest quality possible.
Wednesday, October 03, 2012 1:45 PM
W8
Test Techniques
The Tester’s Role in Continuous Integration
Ayal Cohen and Roi Carmel, HP Software
If your software product is recompiled and integrated frequently, you can improve your testing by integrating automated tests into your continuous integration process. In many organizations, unit tests are run as part of continuous integration; however, that is not enough. During the continuous integration cycle, integration of all automated tests—system, integration, unit, and regression—is vital to help find defects quickly and provide a substantial return on investment. Ayal Cohen and Roi Carmel describe the types of tests needed, the pros and cons of each type, and how to choose which tests to execute according to development code change, business criticality, and history of execution. Ayal and Roi discuss the need for service virtualization so you can run your tests in an environment that has not yet been fully developed, providing virtual substitutes for the missing services.
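To make the selection idea concrete, here is a hedged sketch (the scoring scheme and data structures are invented, not the presenters' method) of choosing which automated tests to run in a given CI cycle based on what changed, business criticality, and recent failure history.

    def select_tests(tests, changed_modules, max_tests=100):
        """Score each test and return the highest-priority subset for this build."""
        def score(t):
            touches_change = any(m in changed_modules for m in t["covers"])
            return (3 if touches_change else 0) + t["criticality"] + t["recent_failures"]
        return sorted(tests, key=score, reverse=True)[:max_tests]

    if __name__ == "__main__":
        # Hypothetical test metadata; a real pipeline would pull this from coverage
        # maps, a risk catalog, and past execution results.
        tests = [
            {"name": "test_checkout", "covers": ["payments"], "criticality": 3, "recent_failures": 1},
            {"name": "test_search",   "covers": ["catalog"],  "criticality": 2, "recent_failures": 0},
            {"name": "test_login",    "covers": ["auth"],     "criticality": 3, "recent_failures": 0},
        ]
        for t in select_tests(tests, changed_modules={"payments"}, max_tests=2):
            print(t["name"])
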
Wednesday, October 03, 2012 1:45 PM
W9
Test Automation
Stop the Test Automation ROI-based Justification Insanity
Bob Galen, RGalen Consulting
In the past, we justified our automation efforts with ROI calculations based on saved test execution time. Unfortunately, those “savings” frequently led to eliminating testers from the organization. Management applauded the “do more with less” reality that these ROI savings promised. Seasoned and slightly askew test leader Bob Galen challenges these traditional views toward automation ROI-based savings. Explore better value drivers for automation that include increasing your competitive position, increasing the capacity and skill of your test organization, allowing for late-binding changes for development that provide a delivery “safety net,” and increasing the overall quality of your risk-based testing strategies. Bob explains why cost savings is a low-level, trivial pursuit and discusses why focusing on increased investment in your team and your testing should be the prime directive for your automation initiatives. Also learn how to communicate this strategy change to your executives and how to align it with your business goals and objectives.
Wednesday, October 03, 2012 1:45 PM
W10
Agile Testing
Agile Defect Management: Focus on Prevention
David Jellison, Constant Contact
Efficient agile organizations focus on defect prevention rather than downstream defect discovery because discovering defects during or after testing adds to development costs. Delaying discovery and repair of defects can make an agile team feel like they are operating in a mini-waterfall. Sharing his experience with Scrum/Kanban teams, David Jellison describes how grouping defects into two major categories—work-in-process defects and escaping defects—reduces development costs and improves reliability in the field. Dave illustrates how to manage problem discovery early and minimize the existence of escaping defects. Treating escaping defects as the exception rather than the norm results in a much smaller defect backlog and increased customer satisfaction. This approach encourages increased collaboration between quality engineers and developers, and shifts the focus of team measures from defect counts to product delivery velocity and cycle time, with increased confidence in quality as the work is completed.
Wednesday, October 03, 2012 1:45 PM
W11
Cloud Testing
Beyond the Silver Lining: Testing Go Daddy’s Cloud
Brent Strange, Go Daddy
The cloud offers users a way to easily access applications and data from anywhere on any device. However, behind that simple façade lies a colossal testing challenge. Go Daddy's Storage as a Service and the surrounding SOA consist of technologies galore. Test automation and coverage do not come easily with a technology portfolio that includes PHP, Perl, Python, C++, MySQL, RabbitMQ, and Cassandra—to name just a few. Join Brent Strange to see how engineers at Go Daddy solve these problems by working together to build a test automation infrastructure and QA processes that ensure the dependability, scalability, and high quality of Go Daddy's next generation cloud storage solution. Brent shares the SOA automation frameworks and some techniques that Go Daddy employs to provide test coverage across all technologies, layers, and environments. Discover how four years of triumphs and woes ultimately morphed into quality assurance that is built into every step of an agile SDLC.
Wednesday, October 03, 2012 1:45 PM
W12
Special Topics
Forgotten Wisdom from the Ancient Testers
Dorothy Graham, Software Test Consultant, and Rob Sabourin, AmiBug.com, Inc.
In our increasingly agile world, collaboration is the new buzzword. But collaboration is hard to do well. Testers are challenged to work directly, effectively, efficiently, and productively with customers, programmers, business analysts, writers, trainers—and pretty much everyone in the business value chain. There are many points of collaboration, including grooming stories with customers, sprint planning with team members, reviewing user interaction with users, whiteboarding with peers, and buddy checking. Rob Sabourin and Dot Graham explain what collaboration is, why it is challenging, and how to make it better. Learn how forgotten but proven techniques can help you work more efficiently, improve your professional relationships, and deliver quality products. Join Dot and Rob to hear how “ancient” techniques apply in today’s world, with stories of how these techniques work now.
Wednesday, October 03, 2012 3:00 PM
W13
Test Management
Three New Technologies that Will Disrupt Your Test Organization
Klaus Haller, Swisscom IT Services
Which forces are shaping the future of your test organization—processes, tools, technologies? It is a simple—and misleading—question. The test organization is not the center of the universe. The test organization serves the IT department and the business. If they change, the test organization must change. Three new technologies—the cloud, service-oriented architectures, and multi-tenant systems—are revolutionizing IT departments. Test organizations must adapt their methodologies, tools, and processes to these technologies. The combination of these three is a catalyst for advanced sourcing models. Join Klaus Haller to learn how the rise of application or business service provisioning changes the task portfolio and staffing needs of testing organizations. Discuss with Klaus how to move from a classic testing organization to a continuous and holistic quality assurance organization. Take back disruptive insights and fresh ideas that may be new to your organization—and just may help get your test organization ready for the future.
Wednesday, October 03, 2012 3:00 PM
W14
Test Techniques
The Many Flavors of Exploratory Testing
Gitte Ottosen, Sogeti
The concept of exploratory testing is evolving, and different interpretations and variations are emerging and maturing. These range from the pure and original thoughts of James Bach, later expanded to session-based exploratory testing by Jon Bach, to testing tours described by James Whittaker, to the many different ways test teams across the world have chosen to interpret exploratory testing in their own contexts. Though it appears to be simple, exploratory testing can be difficult to introduce into a traditional organization where testers are familiar only with executing scripted test cases and where the concept of exploration and creative testing may be somewhat foreign. At the same time, organizations need to address the challenges of traceability and reporting, moving from traditional ways to a more exploratory approach. Join Gitte Ottosen as she describes some of the different flavors of exploratory testing with which she has been working—different approaches but all with the underlying foundation of “simultaneous learning, test design, and execution.”
Wednesday, October 03, 2012 3:00 PM
W15
Test Automation
UI-based Test Automation: A Story of Failure and Success with FIFA Soccer 10
Michael Donat, Electronic Arts Canada
Automating UI-based tests, especially those involving multiple clients, has many pitfalls. One of the major barriers to broad UI-based automation is the high cost of maintaining scripts. Removing this barrier requires a shift to architectures like Model-View-ViewModel (MVVM). Michael Donat explores why the Online Team Play (OTP) feature in EA Sports' FIFA Soccer 10 game greatly exceeded its QA budget and describes several reasons why automation efforts failed in the past. Michael goes on to explain that separating the interface from the actual controls appearing in front of the user is critical to reducing maintenance costs. This allows future automation scripts to test the interface directly, making the automation immune to screen design and control changes. Join Michael and learn how to make automation a powerful tool in testing UI-based clients and how architectures like MVVM reduce automation maintenance costs.
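A minimal sketch of the architectural point, with an invented LobbyViewModel rather than anything from the FIFA code base: the automated test exercises the view model's state and commands directly, so it keeps working when screens and controls are redesigned.

    class LobbyViewModel:
        """View model for a hypothetical online-match lobby; no UI controls involved."""
        def __init__(self):
            self.players = []
            self.status = "waiting"

        def join(self, player):
            # Command bound to the "Join" button in whatever view renders this model.
            if self.status == "waiting":
                self.players.append(player)
                if len(self.players) == 2:
                    self.status = "ready"

    def test_lobby_becomes_ready_with_two_players():
        vm = LobbyViewModel()
        vm.join("alice")
        vm.join("bob")
        assert vm.status == "ready"

    if __name__ == "__main__":
        test_lobby_becomes_ready_with_two_players()
        print("ok")
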
Wednesday, October 03, 2012 3:00 PM
W16
Agile Testing
Static Testing Comes to Agile: A Simplified Inspection Process that Works
Anne Hungate, Nationwide
Costs soar when defects are not discovered until system testing—or worse, in production. Inspections can drive down delivery times, drive out defects, and help align business and IT expectations. The benefits of inspections are known and documented, although adding these quality steps can appear to slow down an agile team. Can there be harmony between prevention processes and agile practices? Anne Hungate takes you through the experiences she and her team gained bringing static testing practices into their transition from waterfall to agile. They streamlined and simplified the inspection process while still capturing critical data to prevent problems from escaping to production. Anne shares the practical steps to overcoming the organizational and cultural barriers that keep teams from realizing the benefits of inspections. Arm yourself with evidence that inspections can save time and money—and improve quality in agile development. Leave with a plan to improve your delivery process with agile-ready static testing.
Wednesday, October 03, 2012 3:00 PM
W17
Cloud Testing
Agile, Automation, and the Cloud
Kiran Karnad, Mimos Berhad
The cry throughout organizations today is “move to the cloud.” However useful the cloud may be, testing applications hosted in the cloud presents an additional set of challenges. Both the application under test and changes to the underlying platform need to be regression tested. Taking you on a journey to demystify the testing lifecycle for cloud-based applications, Kiran Karnad details a different approach and the new set of practices and tools needed to deliver high-quality cloud applications. Join Kiran as he shares a process of continuous integration for the cloud and introduces an open source tool for automation and performance testing. Drawing on these experiences, Kiran highlights the salient points of testing cloud-based applications and how it differs from testing in the traditional world. Take away new ideas and approaches for automating test execution and performance testing, along with an understanding of the challenges agile methodologies present to testing in the cloud.
Wednesday, October 03, 2012 3:00 PM
W18
Special Topics
Testing in the DevOps World of Continuous Delivery
Manoj Narayanan, Cognizant Technology Solutions
DevOps is an increasingly popular development approach focused on ensuring that delivered code is immediately stable and works as expected. DevOps team members must be multi-skilled and are expected to perform development, testing, and system administration tasks. Manoj Narayanan shares how to implement testing using DevOps tenets and how it differs from its more popular cousin, agile development. To work productively with developers and system administrators, testers must develop knowledge of development and design principles, programming languages, and continuous integration. Manoj explores the critical role that functional and regression test automation plays in enabling testing organizations to be more productive. Manoj concludes with an analysis of the cultural impact DevOps has on the testing organization and its interaction with other critical stakeholders—business, developers, operations, and customers. Take back details about new testing tools to help you succeed in this new world, along with an understanding of how DevOps impacts testing.

