STAREAST
 
 
 
STAREAST 2009 Concurrent Sessions



Concurrent Sessions for Thursday, May 7, 2009 
T1  
Testing Dialogues: Your Management Issues
Rob Sabourin, Amibug.com
Lee Copeland, Software Quality Engineering
Double Session

What is the biggest management problem you are facing in 2009? Doing more with less? Demonstrating the value of testing to your company? Improving your team’s skills while keeping up with projects? Automating more tests? Testing Dialogues is a unique platform for you to learn from experienced test managers around the world and share your ideas and experiences at the same time. Facilitated by Rob Sabourin and Lee Copeland, this double session focuses on test management issues that you face every day. You'll share your expertise and successes and learn from others' challenges and lessons learned. Lee and Rob will help participants generate topics in real time and structure discussions in a framework so that everyone will receive a summary of the work product after the conference.  Many past participants in Management Dialogues have rated this session their best experience at the STAR conference.

 
Learn more about Rob Sabourin
Learn more about Lee Copeland
 

T2  
The Irrational Tester: Avoiding the Pitfalls
James Lyndsay, Workroom Productions, Ltd.
 
As a tester or test manager, have you ever wondered whether reason actually plays a part in some management decisions? It seems that many decisions are influenced by far more—or far less—than rational analysis. Surprise! Testers make decisions that are just as irrational as anyone else's. James Lyndsay presents his view of bias—why we so often labor under the illusion of control, how we lock onto the behaviors we're looking for, and why two people can use the same evidence to support opposing positions. James shares real-life experiences of tester irrationality to help you see how biases affect our everyday work.  Discover why timeboxes work, why independence really matters, and the subtle nudges that can encourage us to stay on track.  Be prepared to join in discussions, engage with demonstrations, and challenge your preconceptions. By understanding how patterns underlie irrationality, James promises that you will become a better tester.  
Learn more about James Lyndsay  

T3  
Top Ten Automation Questions and Their Answers
Mukesh Mulchandani, ZenTEST Labs
 

As software becomes more complex, many strategic test automation issues emerge that put test automation projects and improvement programs more at risk than ever. Mukesh Mulchandani shares ten key questions you must answer before beginning a test automation project. He begins with the elementary questions: Should I automate now or wait? What specifically should I automate? What approach should I adopt? Mukesh then considers more complex questions: vertical vs. horizontal automation, handling static and dynamic data, and testing dynamic objects. The final questions relate to future automation trends: moving beyond keyword automation technology, making automation scripts extensible, introducing test-driven development, starting automation when the application is not yet stable, and offering the automation scripts to clients. Whether you are just starting with test automation or wanting to improve your automation, find out which of these questions resonates with you—and learn Mukesh’s suggested answers.

  
Learn more about Mukesh Mulchandani  

T4  
Updating Your Testing Methods for Web 2.0
Matt Brayley-Berger, Borland
 
Web 2.0 moves much of the application functionality directly into the browser. While creating the richer user experience that everyone craves, these technologies pose significant new challenges for testing. Matt Brayley-Berger discusses areas that are critical in testing Web 2.0 applications. First, Matt presents an overview of a typical Web 2.0 application architecture (tiers, backend databases, and application workflow) to help you plan test designs and test cases. He explains how to architect testing directly into the application, including instrumenting the application to provide data for use by testing tools. Learn why testers must focus more on performance testing that accurately profiles the asynchronous inter-application communication. Take back ways to correlate low-level metrics (EJBs, Flex/Flash Remoting, business logic layer) with browser-specific execution times to obtain a more accurate representation of the response under various states of load.  
Learn more about Matt Brayley-Berger  

T5  
The New Era of Community-based Testing
Doron Reuveni, uTest
 
Professionals are being confronted by a growing list of challenges—shorter release cycles, increased expectations, smaller budgets, and fewer testing resources. It’s time to rethink the outdated methods of the past to enable us to deal with increased complexity in technical platforms, agile development methodologies, and greater scrutiny into the costs of defect discovery. Community-based testing meets these challenges head-on. By utilizing a global community of professional testers, companies can achieve higher quality releases, meet their release schedules, and stay within budget. Doron Reuveni describes the increasing role of community-based testing in the world of quality assurance. He describes how innovative software and Web companies are using virtual QA teams to support their in-house testing efforts. Discover best practices for how community-sourced testing supports the needs of agile Web organizations as well as traditional plan-driven software firms.  
Learn more about Doron Reuveni  

T6  
Lessons Learned in the 24/7 Online World
Jane Fraser, Electronic Arts
 
Managing a successful, rapidly changing Web site and trying to track the bugs is a never-ending process. Every release brings new challenges—identifying a bug that’s causing havoc, creating patch solutions, and strategizing ways to fight fires with little down time. If you don’t juggle resources well, the stress of managing a live site will take a toll on your team. Jane Fraser takes you through the setup and deployment of a War Room for releasing software in a 24/7 online world. See how Electronic Arts’ use of wikis, chat rooms, and call bridges helps keep communications flowing smoothly. Join Jane to follow the path of a production issue from inception to conclusion—identification, reproduction, risk assessment, patching, and the checks and balances to ensure the safest path towards fixing the issue. Learn from Electronic Arts’ lessons to eliminate some of the pitfalls in your 24/7 online world.  
Learn more about Jane Fraser  

T7  
After System Testing: Don't Forget Infrastructure Testing
David Watt, Lockheed Martin
 
Traditionally, testing IT applications is done in isolation on a stand-alone platform. However, when applications interface with the corporate IT infrastructure, you need to plan, engineer, and execute an additional level of integration testing. David Watt describes a typical IT infrastructure and the historical problems, costs, and complexities of conducting infrastructure integration testing. Because of the complexities common to many IT infrastructures, this level of testing is often ignored and omitted. David explains how enhancements to testing techniques and test process management can remediate many of these complexities and make infrastructure integration testing possible. David introduces the concept of an Enterprise Test Bed and explains how strict management techniques can make this resource a reality for your infrastructure integration testing.  Of course, if you’d rather, you can always conduct this forgotten level of testing by simply turning the system over to the users to perform these tests in production.  
Learn more about David Watt  

T8  
Cheap and Free Test Tools
Randy Rice, Rice Consulting Services
 
Too often, testers have limited money, time, or both to purchase, learn, and implement the robust commercial test tools available today. However, as a tester, one of the best things you can have is your own personal testing toolkit. Since 2001, Randy Rice has been researching free and inexpensive test tools and has compiled a set of tools that have been a great help to him and many others. Randy presents an overview of these tools that can add power and efficiency to your test planning, execution, and evaluation. Randy presents and demonstrates tools that can be used for pairwise test design, test management, defect tracking, test data creation, test automation, test evaluation, and Web-based load testing. Learn how you can use these tools together to achieve a combined effect of greater test speed and better test coverage at little or no out-of-pocket cost.  
Learn more about Randy Rice  

T9  
An Open Source Tool for RIA/Ajax Testing 
Frank Cohen, PushToTest

Building rich Internet applications (RIA) using Ajax is challenging partly because of all the variations in browser performance and functional issues. In addition, different browsers render Ajax differently depending on version and operating environment. Frank Cohen shares a free, open-source service to check Ajax application functions and create a gallery of screen shots to ensure browsers are rendering your application correctly. The unit tests run on a distributed set of test engines in a cloud of servers. Each server in the cloud operates a different browser version and operating environment combination. Frank also describes the skills and experience you need to be successful in testing Ajax applications and describes an appropriate test methodology you can employ. Frank demonstrates his approach using popular open-source testing technology, including Appcelerator TestMonkey, PushToTest TestMaker, Quartz, Selenium, and JackRabbit/Derby.

 
Learn more about Frank Cohen  

T10  
Practical Security Testing for Web Applications
Rafal Los, Hewlett-Packard
 

Testing teams are generally quite efficient at testing Web applications through a wide range of functional data, business processes, and click streams. However, testing for security defects, which requires a different approach and a different mindset, is another story. Security testing involves anticipating what the application is not expecting and building test cases to cover those situations.  Rafal Los demonstrates the approaches you need to understand negative security testing by offering insight into common attacks from simple parameter-based attacks like Cross-Site Scripting (XSS) and SQL Injection (SQLi) to more complex attacks like Cross-Site Request Forgeries (CSRF) and multi-stage persistent Cross-Site Scripting attacks (pXSS).  Rafal provides examples and methodologies for gathering information, creating a negative-test strategy, executing attacks, and interpreting the results. Take back a new understanding of Web security issues and proven methods for addressing them proactively.

 
Learn more about Rafal Los  
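The negative-testing idea in this session—sending input the application does not expect and checking how it comes back—can be illustrated with a tiny sketch. The probe strings and the reflection check below are illustrative assumptions, not Rafal's actual methodology:

```python
# A minimal negative-test sketch: inject classic XSS probe strings into a
# parameter and flag any response that echoes the payload back unescaped.
XSS_PROBE = "<script>alert(1)</script>"
SQLI_PROBE = "' OR '1'='1"

def reflects_payload_unescaped(payload: str, response_body: str) -> bool:
    """True if the raw payload appears verbatim in the response body,
    a common warning sign for reflected XSS."""
    return payload in response_body

# Simulated responses: an app that HTML-escapes input vs. a vulnerable one.
safe = "You searched for &lt;script&gt;alert(1)&lt;/script&gt;"
vulnerable = "You searched for <script>alert(1)</script>"

assert not reflects_payload_unescaped(XSS_PROBE, safe)
assert reflects_payload_unescaped(XSS_PROBE, vulnerable)
```

A real negative-test suite would drive these probes through every input surface (parameters, headers, cookies) and use a far larger payload catalogue.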

T11  
When Everyone is Ready to Quit: Rebuilding the Team 
Rachel Pilgrim, Travelocity

What would you do if everyone on your new team wanted to quit?  Learn how one test manager went from a demoralized team of almost-quitters to one of the most satisfied, sought-after, and effective teams in the company—all within one year.  Rachel Pilgrim explains how listening to employee concerns, taking calculated risks, and finding solutions to individual issues provided a solid foundation for turning around her team.  Learn to empower your team by defining roles and responsibilities and challenging the “business as usual” model.  Improve work-life balance by leveraging tools and automation to reduce workloads and increase satisfaction. Learn about low-cost training opportunities, mentoring programs, and ways to alleviate job boredom. After implementing these and other ideas, Rachel’s employees reported a 96% job satisfaction rating. You can do the same with your testing team!

 
Learn more about Rachel Pilgrim  

T12  
A Pragmatic Approach to Improving Your Testing Process
Clive Bates, Grove Consultants
 
Although most test managers know they need to improve their processes, many don’t know how to go about it. How do you understand the effectiveness of your current test process and then move forward for quick wins and long-term gains? Clive Bates presents a step-by-step approach to gather information on the existing process using special questionnaires and interviews that help you compare your organization with others and identify short- and long-term improvement activities. Find out how to package these improvement activities, present them to management, and gain their commitment. Once changes start to happen, learn to monitor your testing to determine the impact of your actions and how to properly guide improvement activities. Learn how to conduct project retrospectives, identify what metrics to gather before and after the improvements, and report your successes. Introduce your improvement ideas with confidence and give them every chance of success.  
Learn more about Clive Bates  

T13  
Using Data Objects to Create Effective Test Data
Huw Price, Grid-Tools, Ltd.
 
Fact—the quality of test data directly impacts the quality of testing. Traditional manual methods for creating test data are laborious, time-consuming, often ineffective, and error-prone. Huw Price explains the concept of test “data objects,” an approach he uses to create high quality test data and eliminate the need to access live production data for testing. Data objects are abstractions of the data that capture the essence of a data type that can be quickly assembled to support specific tests.  With definitive examples, Huw shows how to create basic “data objects” using data sampling and then expands to more complex objects using “data inheritance” that varies the data to satisfy specific testing needs. Learn about test data selection techniques—such as all-pairs, cause and effect, randomization, and probability distribution—to build high quality sets of data. You’ll spend less time creating data and have the time to find more defects.   
Learn more about Huw Price  
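The data-object idea—sample a representative base object, then "inherit" from it and override only what a specific test needs—can be sketched roughly like this. The record, field names, and `derive` helper are illustrative assumptions, not Huw's actual tooling:

```python
import copy

# Base "data object": sampled once from representative (non-production) data.
BASE_CUSTOMER = {"name": "Pat Example", "age": 35, "country": "US", "balance": 100.00}

def derive(base: dict, **overrides) -> dict:
    """Create a variant data object by inheriting from a base and
    overriding only the fields a specific test cares about."""
    obj = copy.deepcopy(base)  # deep copy so variants never mutate the base
    obj.update(overrides)
    return obj

# Variants targeting specific tests: an underage customer, an overdrawn account.
minor = derive(BASE_CUSTOMER, age=12)
overdrawn = derive(BASE_CUSTOMER, balance=-50.00)

print(minor["age"], overdrawn["balance"])  # 12 -50.0
```

Because each variant names only its overridden fields, a change to the base record propagates to every derived test object automatically.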

T14  
Improve Your Testing Assets Through Domain Modeling
Renato Quedas, Borland
 
Just as agile approaches have made inroads in development, FitNesse is doing the same in acceptance testing. Many testers rely heavily on FitNesse to improve collaboration and communication among the product owner, developers, and testers. However, beginning by writing tests based on a specific tool will ultimately lead to ineffective testing. Renato Quedas asserts that by basing your tests on a domain model of your application rather than your test tool, you will gain a better understanding of what needs to be tested and, therefore, design more effective test cases. Once the model is created, test cases can be derived automatically from the model. Also, should the model change, test cases can be easily recreated. Renato demonstrates this concept applied to FitNesse using the open-source Eclipse Modeling Technology (EMT). First, he creates the domain model, and then he generates the test automation assets (test scripts, Web pages, and documentation) by using the model transformation capabilities of EMT. Join Renato to learn how you can use this technique to eliminate the continuous rewriting of test cases as your application evolves.  
Learn more about Renato Quedas  

T15  
Virtual Test Labs - The Next Frontier
Darshan Desai, Microsoft
 

Are you spending too much time setting up test environments? Do you have too many “can’t repro” defects? Test lab virtualization may be the answer you’re looking for. It’s no longer just a promise—it can be a reality in modern test labs today. Darshan Desai explains how to leverage virtualization to solve some of your complex testing problems. Virtualization provides the ability to create and share test environments quickly and do more testing in the same amount of time. Darshan explains how virtualization reduces the total cost of ownership of test labs and helps you test earlier on production-like environments. More importantly, you’ll be able to file high-quality, actionable defect reports that are reproducible for the developer. Learn how successful teams at Microsoft use virtual test labs and understand the best practices and the pitfalls to watch out for when you go virtual.

 
Learn more about Darshan Desai  

T16  
Measuring the Value of Testing
Dorothy Graham, Independent Test Consultant

Value is based on objectives, so why do we test? We test to find defects effectively, gain confidence in the software, and assess risk. So, the value of testing should be measured based on test effectiveness, confidence validation, and reduced system risk.  In terms of testing effectiveness, the most useful metric is defect detection percentage (DDP)—the ratio of defects found in testing divided by the total number of defects found by testing and users in production. Dorothy Graham explains when to use this metric and outlines the choices, problems, and benefits of using DDP. In addition, Dorothy describes confidence and risk metrics that will tell you if you are going in the right direction—or not. She explains how to take costs into account to assess the return on investment for testing and outlines a simple way to bring home the message about the value of testing in terms of what the organization can save.

   
Learn more about Dorothy Graham  
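The DDP metric described above is simple enough to sketch directly. The defect counts in the example are made up for illustration:

```python
def defect_detection_percentage(found_in_test: int, found_in_production: int) -> float:
    """DDP = defects found by testing / (defects found by testing +
    defects that escaped to production), expressed as a percentage."""
    total = found_in_test + found_in_production
    if total == 0:
        raise ValueError("no defects recorded; DDP is undefined")
    return 100.0 * found_in_test / total

# Illustrative figures: 90 defects caught in test, 10 escaped to users.
print(defect_detection_percentage(90, 10))  # 90.0
```

Note that DDP can only be computed retrospectively, once enough production time has passed for escaped defects to surface—one of the practical choices Dorothy discusses.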

T17  
Agile Testing in the Large: Stories and Patterns of Transformation 
Robert Galen, Software Testing Consultant
Double Session

You’re part of a large test organization that has invested money, sweat, and tears in test processes, plans, cases, and automation tools that have served you well. You’ve built a team that excels in your development environment. In fact, everyone is depending on you to verify sound engineering practices and formally assure product quality. Now agile methods are being adopted in your organization and messing up everything. Developers and testers are pushed together with the hope that quality will somehow still happen. Is this your future? Bob Galen describes patterns of testing that he’s found helpful in large-scale teams when they transition from traditional to agile testing. Join Bob to explore compliance and regulatory actions, sorting out bugs in triage, test planning and tracking, reporting test coverage, making release readiness decisions, influencing the broader development team, creating agile test automation, thriving in large-scale agile environments, and more.

 
Learn more about Robert Galen  

T18  
The Strategic Alignment of Testing and Development
Jasbir Dhaliwal,  FedEx Institute of Technology and Dave Miller, FedEx

Strategic alignment between testing and development is vital for successful systems development. Missing, however, have been actionable, how-to approaches for assessing and enhancing this alignment. Jasbir Dhaliwal and Dave Miller present STREAM, the Software Testing Reassessment and Alignment Methodology, a systematic approach used to achieve this alignment at both strategy and execution levels. STREAM incorporates a step-by-step procedure that can be used to: 1) identify symptoms of developer-tester misalignment, 2) analyze and understand the misalignment, and 3) formulate action plans for fostering stronger developer-tester alignment. In addition, Jasbir and Dave identify specific mechanisms and tools for ensuring that the execution capabilities of testing groups are aligned with their stated strategies. This represents a natural prerequisite for successful developer-tester alignment. STREAM extends ideas commonly used by corporate CIOs for ensuring business-IT alignment to the realm of ensuring alignment within the corporate IT unit as well. Take away a practical framework for thinking strategically about the testing function and its collaborative relationship with development.

 
Learn more about Jasbir Dhaliwal
Learn more about Dave Miller
 

T19  
Applying Test Design Concepts in the Real World
Marie Lipinski Was, CNA Insurance

Have you ever read a book, taken a class, or attended a conference session on test design concepts that you never actually incorporated into your work? Have others on your team rejected new design techniques that appeared promising to you? Sometimes the path from concept to real-world application can be fraught with challenges. Marie Lipinski Was shares the path she took to bring formal test design techniques from the classroom to the workroom.  Marie explains how she incorporated test design techniques—such as mind mapping, decision tables, pairwise testing, and user scenario testing—into the existing test processes at CNA Insurance.  From the case studies Marie offers, you will learn how to present these new concepts to key stakeholders, quantify the cost/benefit to management, and overcome the challenges of changing the status quo.

Learn more about Marie Lipinski Was  
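The pairwise testing technique mentioned in this session can be shown with a toy example (the parameters and values are invented for illustration): three two-valued parameters have 8 exhaustive combinations, yet 4 well-chosen tests already cover every pair of values:

```python
from itertools import combinations, product

# Hypothetical test parameters and their values.
PARAMS = {"browser": ["IE", "Firefox"], "os": ["XP", "Vista"], "locale": ["en", "fr"]}

# A hand-picked pairwise set: 4 tests instead of 2*2*2 = 8.
TESTS = [
    ("IE", "XP", "en"), ("IE", "Vista", "fr"),
    ("Firefox", "XP", "fr"), ("Firefox", "Vista", "en"),
]

def covers_all_pairs(tests, params):
    """Check that every value pair of every two parameters appears
    in at least one test row."""
    names = list(params)
    for (i, a), (j, b) in combinations(enumerate(names), 2):
        for va, vb in product(params[a], params[b]):
            if not any(t[i] == va and t[j] == vb for t in tests):
                return False
    return True

print(covers_all_pairs(TESTS, PARAMS))  # True
```

The savings grow quickly: with more parameters and values, a pairwise set stays small while exhaustive combination counts explode, which is why tools that generate such sets automatically are popular.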

T20  
"A" is for Abstraction: Managing Change in Successful Test Automation
Mark Meninger, Research in Motion

Implementing a test automation project can be like a mountain climbing expedition—many find the task daunting, some attempt it, and only a few are successful. Showing real-world examples—such as the need for scripting across different platforms—Mark Meninger explains how to embrace change and use abstraction to provide creative ideas and approaches for your test automation project. You’ll learn how to implement a platform abstraction layer in the automation architecture to overcome multi-platform issues and much more. Mark helps you understand how the roles of change and abstraction in test automation can impact your automation project. You can become one of the few who are truly successful by embracing abstraction in your test automation architecture. Otherwise, you may spend money, invest in tools, and build a team that never makes it to the top of the test automation mountain.

Learn more about Mark Meninger  
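A platform abstraction layer of the kind this session describes might be sketched as follows. The class and method names are illustrative assumptions, not Mark's actual architecture:

```python
from abc import ABC, abstractmethod

class Platform(ABC):
    """Interface the test scripts depend on; each OS supplies an adapter."""
    @abstractmethod
    def launch_command(self, app: str) -> list:
        ...

class WindowsPlatform(Platform):
    def launch_command(self, app: str) -> list:
        return ["cmd", "/c", "start", app]

class LinuxPlatform(Platform):
    def launch_command(self, app: str) -> list:
        return ["sh", "-c", app]

def launch_under_test(platform: Platform, app: str) -> list:
    # Test scripts call only the abstract interface, so supporting a new
    # platform means adding one adapter class, not rewriting every script.
    return platform.launch_command(app)

print(launch_under_test(LinuxPlatform(), "myapp"))  # ['sh', '-c', 'myapp']
```

The design choice is the usual one behind abstraction layers: platform differences are isolated behind a single seam, so change lands in one place instead of leaking through the whole automation codebase.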

T21  
Taking Control Using Virtual Test Lab Automation
Ravi Gururaj, VMLogix

Due to more complex software and environments, the expectations placed on software labs have grown significantly. While under tighter budget constraints, test labs are expected to rapidly provide the infrastructure to create varied test environments for executing test cases. Traditionally, only physical machines and bare-bones hypervisors formed the lab infrastructure. Testers spent a significant amount of time creating and re-creating pristine test environments. Ravi Gururaj explains how virtual lab automation (VLA) leverages server virtualization and redesigns the lab to make it relevant to a broad set of stakeholders, including development, test, support, pre-sales, and training. Learn how you can create multi-machine configurations, execute test cases, and capture entire machine states quickly for later defect analysis. Because it is easy to create and share configurations, testers can rapidly expand the test environment and realize reductions in cycle time for functional, combinatorial, and regression testing.

Learn more about Ravi Gururaj  

T22  
When to Ship? Choosing Quality Metrics
Alan Page, Microsoft

It’s time to ship your product and you’re looking at pages of data about the testing work you’ve done over the last year. How well does this data prepare you for making the recommendation to ship the product or delay it—perhaps once again? Do you rely primarily on the data or do you fall back on “gut feel” and intuition to make your decision? In this highly interactive session, Alan Page discusses how common measurements, such as code coverage, bug counts, and test pass rates, are often misused, misinterpreted, and inaccurate in predicting software quality. Learn how to select both quantitative and qualitative metrics that evaluate your progress and help you make important decisions and recommendations for your product. Share your own ideas for test and quality metrics and learn how to evaluate those metrics to ensure that they are accurately answering the questions you need them to answer.

 
Learn more about Alan Page  




 
Software Quality Engineering  •  330 Corporate Way, Suite 300  •  Orange Park, FL 32073
Phone: 904.278.0524 or 888.268.8770  •  Fax: 904.278.4380  •  Email: [email protected]
© 2009 Software Quality Engineering, All rights reserved.