STARWEST Software Testing Analysis & Review
 
STARWEST 2011
Wednesday Concurrent Sessions
Wednesday, October 05, 2011 11:30 AM
W1
Test Management
Test Estimation and the Art of Negotiation
Nancy Kelln, Unimagined Testing; Lynn McKee, Quality Perspectives
Many of us have struggled with test estimation. We have tried simple, heuristic models to craft a best guess—often without much success. We have also tried using a variety of complex, scientific models to calculate an accurate number. The problem is, we are usually fooled by the models—both simple and complex ones—and either overestimate testing needs or are lulled into impossible commitments. Lynn McKee and Nancy Kelln explore the realities of test estimation and propose a new mindset for handling estimation requests. In an interactive format, Nancy and Lynn demonstrate that the best estimate may be no estimate at all. By shifting the focus from estimating to negotiating, you’ll learn how to reveal the often obscured but already determined available time for testing. The next time you are pressured to improve your estimates by reviewing project details carefully, assessing and forecasting risks, and adding more contingency time, be prepared to negotiate and offer up a realistic, practical, and more powerful response.
Learn more about Nancy Kelln, Lynn McKee
Wednesday, October 05, 2011 11:30 AM
W2
Test Techniques
Google's New Methodology for Risk-driven Testing
Jason Arbon, Google, Inc.; Sebastián Schiavone, Google, Inc.
Risk mitigation and risk analysis are delicious ingredients in a recipe Google calls risk-driven testing. Most of us are familiar with how to approach risk mitigation from a test perspective—in the form of test plan development, test cases, and documentation. However, comprehensive risk analysis is still considered black magic by many in our field. In this hands-on presentation, Jason Arbon and Sebastián Schiavone introduce ACC—Attributes–Components–Capabilities—a methodology for systematically breaking down an application into coherent and logically-related elements for risk analysis. ACC prescribes a very easy-to-follow process that you can apply consistently and quickly to many types of projects. Jason and Sebastián break down risk analysis into seven simple steps and walk participants through the complete ACC and risk analysis process for several high-profile Google products. Take back a practical, easy-to-use process that you can apply in your current and future projects.
Learn more about Jason Arbon, Sebastián Schiavone
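The abstract does not spell out the ACC grid or the seven steps themselves, so the sketch below is only a minimal, hypothetical illustration of what an Attributes, Components, Capabilities breakdown might look like in code. The attribute and component names are invented, and the frequency-times-impact scoring is one common risk heuristic, not necessarily the process the session presents.

    # Hypothetical ACC (Attributes, Components, Capabilities) sketch for risk analysis.
    # Component and attribute names are invented examples, not taken from the session.

    attributes = ["Secure", "Fast", "Intuitive"]      # adjectives: why users care
    components = ["Login", "Search", "Checkout"]      # nouns: major parts of the system

    # Capabilities: what each component does for each attribute (only a few cells filled).
    capabilities = {
        ("Login", "Secure"): "Rejects invalid credentials and limits brute-force attempts",
        ("Search", "Fast"): "Returns results for common queries in under a second",
        ("Checkout", "Intuitive"): "Completes a purchase in three screens or fewer",
    }

    # One simple risk heuristic: frequency of failure x impact of failure, each scored 1-3.
    risk_scores = {
        ("Login", "Secure"): (2, 3),
        ("Search", "Fast"): (3, 2),
        ("Checkout", "Intuitive"): (1, 3),
    }

    # Rank the capability cells by risk so testing effort goes to the riskiest areas first.
    ranked = sorted(risk_scores,
                    key=lambda cell: risk_scores[cell][0] * risk_scores[cell][1],
                    reverse=True)
    for component, attribute in ranked:
        freq, impact = risk_scores[(component, attribute)]
        print(f"{component}/{attribute}: risk {freq * impact} - "
              f"{capabilities[(component, attribute)]}")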
Wednesday, October 05, 2011 11:30 AM
W3
Test Automation
Automated Testing: The Differentiators of Success
Nazar Hossain, Zarieas
While automated testing is not new, it has undergone a resurgence in recent years. A combination of matured technology and continually increasing pressure to deliver more value has put a greater focus on finding efficiencies within testing. However, having the right automated testing tools is not enough. Nazar Hossain shares what he has found to be the key factors common to organizations that successfully use automated testing. First, have a comprehensive end-to-end process to manage and measure the success of automation efforts. Second, integrate test automation into the overall system development framework, making it an integral part of the project. This requires a well-engineered automation framework, automation development practices, and a close connection with the change management system. And third, focus on the automation engineers’ roles and responsibilities, career and performance management, and test automation process governance. Explore these differentiators with Nazar to build success into your test automation program.
Learn more about Nazar Hossain
Wednesday, October 05, 2011 11:30 AM
W4
Agile Testing
Seven Key Factors for Agile Testing Success
Janet Gregory, DragonFire, Inc.
What do testers need to do differently to be successful on an agile project? How can agile development teams employ testers’ skills and experience for maximum value to the project? Janet Gregory describes the seven key factors she has identified for testers to succeed on agile teams. She explains the whole-team approach of agile development that enables testers to do their job more effectively. Then, Janet explores the “agile testing mindset” that contributes to a tester’s success. She describes the different kinds of information that testers on an agile team need to obtain, create, and provide for the team and product owner. Learn the role that test automation plays in the fast-paced development within agile projects, including regression and acceptance tests. By adhering to core agile practices while keeping the bigger picture in mind, testers add significant value to and help ensure the success of agile projects.
Learn more about Janet Gregory
Wednesday, October 05, 2011 11:30 AM
W5
The Cloud
Cloud Computing: Powering the Future of Testing
Sundar Raghavan, Skytap
With the advent of agile development processes, the expected cycle time for building and shipping quality software has been cut dramatically. Yet, at most companies, much of the IT infrastructure used for testing has remained the same. Testing teams often find themselves squeezed between the need for speed and their inadequate test infrastructure. Today, hundreds of companies are using cloud-based IT infrastructures to streamline, parallelize, and accelerate their testing cycles. Using real-world case studies, Sundar Raghavan shares how the cloud model can enable you to create multiple test environments, instantiate production-like virtual data centers, run multiple tests in parallel, and perform load tests almost at will. Sundar discusses how the cloud model reduces the cost and complexity of test harness set-up and tear-down—all without requiring you to change test tools or methodologies. Return to work with a comprehensive set of questions you should ask every cloud provider to determine if their solution will fit your needs.
Learn more about Sundar Raghavan
Wednesday, October 05, 2011 11:30 AM
W6
Special Topics
A Crowdsourcing Success Story
Anu Kak, PayPal
Today, many organizations are using crowdsourcing to develop and test their products. Anu Kak presents an overview of how eBay has applied this concept both internally and externally to improve their development and testing. Anu shares PayPal’s implementation of the crowdsourcing approach that helps employees gain insights into application problems and solutions. With this knowledge in hand, they can quickly improve product quality and customer satisfaction. Anu and his colleague Venkatesh share examples of how eBay utilized their customers for crowdsourcing to improve overall buyer and seller experience. Learn how the crowdsourcing process played a big role in the pilot phase for the Shopping Cart product and how customer feedback, smart troubleshooting tactics, and proactive repair of defects helped tremendously as the pilot ran throughout the US and UK. Take back an approach for getting the crowds to participate in your testing and development.
Learn more about Anu Kak
Wednesday, October 05, 2011 1:45 PM
W7
Test Management
Top Ten Disruptive Technologies You Must Understand
Doron Reuveni, uTest
The consumerization of enterprise software applications is no longer on its way—it is here. Emerging technologies such as mobile apps, tablets, 4G, cloud computing, and HTML5 are impacting software engineering and testing organizations across all industries. Because these technologies enable sensitive data to be accessed through the web and on mobile devices, there is immense pressure to ensure that apps are reliable, scalable, private, and secure. Using real-world examples, Doron Reuveni identifies the top ten disruptive technologies that have transformed the software industry and outlines what they mean for the testing community now and in the future. The ways in which web and mobile apps are designed, developed, and delivered are changing dramatically, and therefore the ways these apps are being tested are being taxed and stretched to the breaking point. It is crucial that test and engineering organizations prepare to meet the challenges these emerging technologies present.
Learn more about Doron Reuveni
Wednesday, October 05, 2011 1:45 PM
W8
Test Techniques
Managing Test Data in Large and Complex Web-based Systems
Ron Schioldager, Wells Fargo
Are you testing an application or website whose complexity has grown exponentially through the years? Is your test data efficiently and effectively supporting your test suites? Does the test data reside in systems not under your direct control? Learn how the WellsFargo.com test team integrated test data management processes and provisions to gain control over test data in their very large and complex web system environment. Join Ron Schioldager to explore the lifecycle of data, its relationship to effective testing, and how you can develop conditioned, trusted, and comprehensive test data for your systems. Learn about the tools Wells Fargo developed and employs today to support their test data management process, enabling them to maintain a shorter data maintenance cycle while improving their test reliability. If your test data is vulnerable to variations in test environments and sometimes morphs into something completely unrecognizable or disappears completely, this session is for you.
Learn more about Ron Schioldager
Wednesday, October 05, 2011 1:45 PM
W9
Test Automation
Pushing the Boundaries of User Experience Test Automation
Julian Harty, eBay, Inc.
Although full test automation of the user experience (UX) is impractical and unwise, there are approaches that can save you time and resources. At eBay, Julian Harty and his colleagues are finding new ways to automate as much of UX testing for eBay.com as is reasonably possible. Even with a highly complex, web-based application, they have found that automation finds many potential problems in the user experience—even in rich application scenarios. Julian shares a practical experience report of their successes together with the barriers and boundaries they discovered—detecting navigation issues, layout bugs, and problematic differences between the behavior of various web browsers. Learn from eBay’s experiences why automated testing can be beguiling and, paradoxically, increase the chances of missing critical problems if you choose to rely mainly or even solely on the automated tests. Find out when UX test automation is appropriate and when it’s not the right thing to do.
Learn more about Julian Harty
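The abstract mentions catching layout bugs and cross-browser differences automatically but does not describe eBay's tooling, so the following is only an illustrative sketch of one way such a check could work: render the same page in two browsers and flag elements whose positions or sizes drift beyond a tolerance. Selenium WebDriver is assumed here purely as a familiar stand-in, and the URL and element IDs are hypothetical.

    # Illustrative cross-browser layout comparison; not eBay's actual approach or tooling.
    # The URL and element IDs are hypothetical.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    URL = "https://example.com/item/12345"           # hypothetical page under test
    ELEMENT_IDS = ["header", "search-box", "buy-button"]
    TOLERANCE_PX = 5                                  # allowed drift before flagging a layout bug

    def capture_layout(driver):
        """Return {element_id: (x, y, width, height)} for the elements we care about."""
        driver.get(URL)
        layout = {}
        for element_id in ELEMENT_IDS:
            element = driver.find_element(By.ID, element_id)
            loc, size = element.location, element.size
            layout[element_id] = (loc["x"], loc["y"], size["width"], size["height"])
        return layout

    firefox, chrome = webdriver.Firefox(), webdriver.Chrome()
    try:
        reference, candidate = capture_layout(firefox), capture_layout(chrome)
    finally:
        firefox.quit()
        chrome.quit()

    for element_id in ELEMENT_IDS:
        drift = max(abs(a - b) for a, b in zip(reference[element_id], candidate[element_id]))
        if drift > TOLERANCE_PX:
            print(f"Possible layout bug: '{element_id}' differs by {drift}px between browsers")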
Wednesday, October 05, 2011 1:45 PM
W10
Agile Testing
Concurrent Testing Games: Developers and Testers Working Together
Nate Oster, CodeSquads, LLC
The best software development teams find ways for programmers and testers to work closely together to build quality into their software. These teams recognize that programmers and testers each bring their own unique strengths and perspectives to the project. However, working in agile teams we need to unlearn many of the patterns that traditional development taught us. In this interactive session with Nate Oster, you learn how to use the agile practice of “concurrent testing” to overcome common “testing dysfunctions” by having programmers and testers work together—rather than against each other—to deliver quality results throughout an iteration. Join Nate and practice concurrent testing with games that demonstrate just how powerfully dysfunctional approaches can act against your best efforts and how agile techniques can help you escape the cycle of poor quality and late delivery. Bring your own team members to this session and get the full effect of these revealing and inspiring games!
Learn more about Nate Oster
Wednesday, October 05, 2011 1:45 PM
W11
The Cloud
The Force of Test Automation in the Salesforce Cloud
Chris Chen, Salesforce.com
What would happen if your company doubled or even tripled its number of releases and asked you to do the same with your testing? What if the number of developers doubled and your testing staff remained the same size? Would your test automation be capable of meeting the demand? How would you ensure that one hundred Scrum teams are investing enough in test automation? How would you triage hundreds of test failures each day? How would you validate each of more than one hundred releases to production per year? These are the questions Salesforce.com has had to answer during its twelve-year history. These are the challenges that led to the creation of its “test automation cloud.” Chris Chen shares how Salesforce.com’s test automation cloud works and gives you an inside look at the different technologies and methodologies they use today. Chris explores test frameworks, continuous integration systems, production validation, coverage, processes, and much more in this revealing session.
Learn more about Chris Chen
Wednesday, October 05, 2011 1:45 PM
W12
Special Topics
Testing in Production: Which Version Wins?
Harish Narayan, Vistaprint
Would your marketing department like to know which website feature will excite online customers to buy more products, return to your site again and again, and increase revenue and profits? Harish Narayan describes how his team uses risk-based testing and statistical test design to optimally check features deployed with multiple website options. At Vistaprint, their measurement-focused marketing department requires live production tests of multiple web options—split runs in their jargon—that expose different features for different customer sessions; they choose to retain the one that “wins” to maximize returns. Preproduction testing of split-run features, especially when multiple runs are deployed in every release, presented challenges for Vistaprint’s testers. They found that ensuring each new split run works correctly on the website, not only with the existing functionality but also with the other marketing tests running at the same time, is critical. Join Harish to learn how his test team overcame these challenges to help their company deliver better customer service and more profits.
Learn more about Harish Narayan
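Nothing in this abstract shows how Vistaprint actually assigns customer sessions to split runs, so the sketch below only illustrates the generic idea behind such tests: hash a session identifier so each visitor lands deterministically in one variant, which keeps measurements clean and lets testers reproduce what a given session saw. The experiment name, variants, and weights are invented.

    # Generic split-run (A/B) assignment sketch; not Vistaprint's implementation.
    # Variant names, weights, and the session id format are invented for illustration.
    import hashlib

    VARIANTS = [("control", 0.5), ("new_checkout_banner", 0.5)]   # hypothetical experiment

    def assign_variant(session_id: str, experiment: str = "checkout_banner_test") -> str:
        """Deterministically map a customer session to one variant of an experiment."""
        digest = hashlib.sha256(f"{experiment}:{session_id}".encode()).hexdigest()
        bucket = int(digest[:8], 16) / 0xFFFFFFFF      # uniform value in [0, 1]
        cumulative = 0.0
        for name, weight in VARIANTS:
            cumulative += weight
            if bucket <= cumulative:
                return name
        return VARIANTS[-1][0]

    # The same session always sees the same variant on repeat calls, so a tester can
    # reproduce exactly which combination of split runs a customer session experienced.
    print(assign_variant("session-abc123"))
    print(assign_variant("session-abc123"))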
Wednesday, October 05, 2011 3:00 PM
W13
Test Management
Get Testers Out of the QA Business
Michael Bolton, DevelopSense
Why is the testing department often misnamed "Quality Assurance"? We testers usually aren't allowed to control the scope of the product or change the source code. We don't have authority over budgets, staffing, schedules, customer relationships, market placement, or development models. So how, exactly, can we testers assure quality? We can't. Quality assurance is in the hands of those with authority over it: the programmers who write the code and the managers who run the project. We're extensions of their senses: extra professional eyes, ears, fingertips, noses, and taste buds. Join Michael Bolton and learn why and how to focus your testing energy on exploring, discovering, investigating, and learning about the product. Then, you'll be empowered to provide management with the information they need to make informed technical and business decisions. Michael explains why you should not become a process enforcer who tries to "own" quality. Instead, you should add value to the whole team, offering service, not obstacles, to help the project succeed. Find out how to think critically and prevent programmers and managers from being fooled, and start by not fooling yourself.
Learn more about Michael Bolton
Wednesday, October 05, 2011 3:00 PM
W14
Test Techniques
Structural Testing: When Quality Matters
Jamie Mitchell, Jamie Mitchell Consulting, Inc.
Jamie Mitchell explores an underused and often forgotten test type—white-box testing. Also known as structural testing, white-box techniques require some programming expertise and access to the code. Using only black-box testing, you could easily ship a system having tested only 50 percent or less of the code base. Are you comfortable with that? For mission-critical systems, such low test code coverage is clearly insufficient. Although you might believe that the developers have performed sufficient unit and integration testing, how do you know that they have achieved the level of coverage that your project requires? Jamie describes the levels of code coverage that the business and your customers may need—from statement coverage to modified condition/decision coverage. He explains when you should strive to achieve different code coverage target levels and leads you through examples of pseudocode. Even if you have no personal programming experience, understanding structural testing will make you a better tester. So, join Jamie in this code-diving session.
Learn more about Jamie Mitchell
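The session's own pseudocode examples are not reproduced in this abstract. As a rough illustration of why the higher coverage levels it names demand more tests, consider this small, hypothetical function with one compound decision; the function and its inputs are invented for the example.

    # Hypothetical example of how coverage criteria escalate; not taken from the session.

    def approve_discount(is_member: bool, order_total: float) -> bool:
        """Approve a discount for members or for large orders."""
        if is_member or order_total > 100.0:        # one decision, two conditions
            return True
        return False

    # Statement coverage: a single test such as approve_discount(True, 50.0) executes
    # every statement, yet never exercises the False branch of the decision.
    #
    # Decision (branch) coverage: add approve_discount(False, 50.0) so the decision
    # evaluates both True and False.
    #
    # Modified condition/decision coverage (MC/DC): each condition must independently
    # affect the outcome, for example:
    #   approve_discount(True, 50.0)    -> True   (is_member alone flips the outcome)
    #   approve_discount(False, 150.0)  -> True   (order_total alone flips the outcome)
    #   approve_discount(False, 50.0)   -> False  (baseline where both conditions are false)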
Wednesday, October 05, 2011 3:00 PM
W15
Test Automation
New Generation Record/Playback Tools for AJAX Testing
Frank Cohen, PushToTest
While some in the test community talk about record/playback technology as a dead-end test automation approach, a new generation of open source record/playback test tools that every tester should consider is now available. Tools like Sahi and TestMaker Object Designer were built for AJAX environments and support thousands of web objects and the asynchronous nature of AJAX. Frank Cohen shows you how to install and use these free tools in your environment and record test scripts of a complicated AJAX application in IE, Chrome, Firefox, Safari, and Opera. Learn how to data-enable applications without coding, use branching and looping commands, construct advanced element target locators without using XPath, and package tests as reusable test objects to share with other testers.
Learn more about Frank Cohen
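Sahi and TestMaker scripts are not shown in this abstract, so the sketch below illustrates only the general idea the session addresses: locating elements by stable attributes rather than XPath and handling AJAX by waiting for an asynchronous update instead of sleeping. It uses Selenium WebDriver purely as a familiar stand-in for the tools named above, and the URL, field name, and result selector are invented.

    # Generic illustration of AJAX-aware automation; not a Sahi or TestMaker script.
    # The URL, field name, and result selector are hypothetical.
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC

    driver = webdriver.Firefox()
    try:
        driver.get("https://example.com/catalog")
        # Locate elements by stable attributes (name, id, CSS) rather than brittle XPath.
        driver.find_element(By.NAME, "q").send_keys("running shoes")
        driver.find_element(By.ID, "search-button").click()
        # The results arrive via an asynchronous call, so wait for the DOM to update
        # rather than sleeping for a fixed interval.
        first_result = WebDriverWait(driver, 10).until(
            EC.visibility_of_element_located((By.CSS_SELECTOR, "#results .item"))
        )
        print("First result:", first_result.text)
    finally:
        driver.quit()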
Wednesday, October 05, 2011 3:00 PM
W16
Agile Testing
Session-based Exploratory Testing on Agile Projects
Bob Galen, iContact
One of the challenges associated with testing in agile projects is selecting test techniques that “fit” the dynamic nature of agile practices. How much functional and non-functional testing should you do? What is the appropriate mix of unit, integration, regression, and system testing? And how do you balance these decisions in an environment that fosters continuous change and shifting priorities? Bob Galen has discovered that session-based exploratory testing (SBET) thrives in agile projects and supports risk-based testing throughout the development project. SBET excels at handling dynamic change while also finding the more significant defects that impact technical and business value. Join in and learn how to leverage SBET for test design and as a general-purpose agile testing technique. Bob also explores the whole-team execution view that SBET fosters and demonstrates techniques for rolling out this technique across large-scale projects and teams.
Learn more about Bob Galen
Wednesday, October 05, 2011 3:00 PM
W17
The Cloud
Quality and the Cloud: Realities and Costs
Clinton Sprauve, Micro Focus
Testing organizations want to take advantage of the cost savings of cloud computing and Software-as-a-Service (SaaS). However, many jump in without really understanding whether or not cloud or SaaS will actually produce cost savings for their organization. Clint Sprauve helps you dissect cloud computing and SaaS, and calculate their true costs and benefits from a test perspective. Clint describes in detail how to estimate the total cost of both technologies for the different quality drivers, including requirements management, test management, functional testing, performance testing, and continuous build integration. He provides recommendations and best practices for implementing a cloud strategy for your test organization. Clint discusses the organizational dynamics and methodologies—agile, traditional, waterfall—that determine the cloud computing needs and impediments for test and development teams.
Learn more about Clinton Sprauve
Wednesday, October 05, 2011 3:00 PM
W18
Special Topics
Peer Reviews at PepsiCo: Finding Defects Early
Christopher Clark, PepsiCo
Establishing a structured review process offers a simple yet cost-effective way to identify, document, and correct requirements and design defects before they create problems later in the project. Chris Clark shares the Peer Review Process that PepsiCo implemented and that continues to operate successfully today. Chris has worked within PepsiCo to ensure that the process is fully integrated into their development lifecycle. He shares tips on how to keep the process alive and how to increase process visibility by leveraging metrics captured during the reviews. Learn about the types of peer reviews—informal walkthroughs and technical inspections—and the role of testing in the review process. See real-world examples of peer review results and how the process is being used to improve the value of testing at PepsiCo. Take back an implementation approach you can use to kick-start peer reviews in your organization and keep them going for the long haul.
Learn more about Christopher Clark

