Software Testing Analysis & Review

STAREAST 2007 Friday Concurrent Sessions


 Friday, May 18, 2007 10:00 a.m.
F1
Test Management
Recruiting, Hiring, and Retaining Great Testers
Krishna Iyer, ZenTEST Labs
 
Hiring great testers is the single biggest challenge that test managers face. Unfortunately, the number of experienced testers is dwindling while the number of testers with weak skill sets is proliferating. Drawing on his experience building an independent testing company, Krishna Iyer shares unconventional—yet quite effective—methods to find, hire, and retain great testers. He looks for testers outside the software world and has had success, for example, with auditors—they have the same inquisitiveness that makes testers great. Krishna describes good interviewing techniques, such as “vague questioning,” that probe candidates’ thinking skills rather than their ability to recall facts. Krishna concludes with suggestions on how to retain great testers, including supporting social responsibility projects and balancing testers’ personal needs with the demands of work.

• New pools of talent for recruiting testers
• Improve your selection skills
• Methods of employee retention that help both the individual and the organization
F2
Test Techniques
Gain Control over Chaotic Development Projects
Dennis Tagliabue, Dell, Inc.
 
Testers are frequently assigned to projects in which applications are undergoing major modifications, yet documentation may be incomplete, wrong, or nonexistent. With limited time, testers must rely on developers, business partners, and others to tell them what to test. The result is often an incomplete grasp of the application and, in turn, inadequate testing. Dennis Tagliabue shares a real-world approach that allows you to gain control over a chaotic application development environment. By employing a simplified use case and scenario-based testing approach, you can develop a high-level view of the application and drive this view down to low-level, reusable test cases. The emerging picture of the application reduces the future learning curve, improves communication among stakeholders, and provides a basis for test planning and estimating. All this can be accomplished without sacrificing short-term testing objectives.
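
To make the decomposition concrete, here is a minimal Python sketch of one way to represent a use case driven down to scenarios and reusable test cases; the class names and sample data are illustrative assumptions, not artifacts from the session:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TestCase:
        step: str        # a concrete, reusable action, e.g., "enter an invalid PIN"
        expected: str    # the observable result to verify

    @dataclass
    class Scenario:
        name: str        # one path through the use case, e.g., "locked account"
        test_cases: List[TestCase] = field(default_factory=list)

    @dataclass
    class UseCase:
        goal: str        # the high-level view, e.g., "customer withdraws cash"
        scenarios: List[Scenario] = field(default_factory=list)

        def coverage_summary(self) -> str:
            total = sum(len(s.test_cases) for s in self.scenarios)
            return f"{self.goal}: {len(self.scenarios)} scenarios, {total} test cases"

    withdraw = UseCase("customer withdraws cash", [
        Scenario("happy path", [TestCase("enter a valid PIN", "account menu shown")]),
        Scenario("locked account", [TestCase("enter an invalid PIN three times", "card retained")]),
    ])
    print(withdraw.coverage_summary())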

• How to employ simplified use cases to understand what to test
• Improve overall testing coverage with scenario-based testing
• The differences between vague mental models and formalized models
F3
Outsourcing
Mistakes Outsourcing Customers Make
Kees Blokland, POLTEQ IT Services BV
 
Ten years of experience with test outsourcing at Polteq and Lucent Technologies has shown that it can be successful. However, on the way to success, many—and sometimes painful—lessons were learned. Kees Blokland shares the most common test outsourcing mistakes others have made with the hope that you will not repeat them. One key mistake is the expectation of large and rapid cost savings—many who have been seduced by this temptation have not been successful. Another mistake is to believe that the outsourcing vendor actually knows how to test your applications—just because they are far away doesn’t mean they know your business. Kees presents a full list of outsourcing mistakes and discusses how you can prevent them from happening—or repair the damage if mistakes have already occurred. If you’re planning to outsource testing or are in the middle of an outsourced project, you will find Kees’ insight very useful.

• The top ten mistakes made by outsourcing customers
• Valuable techniques to prevent trouble—both yours and your organization’s
• Shared test outsourcing experiences and knowledge
F4
Static Testing
Stop Finding Bugs, Start Building Quality
Alan Page, Microsoft
 
Many testers believe that their job is to find bugs. While finding bugs is indeed an important aspect of testing, detecting bugs earlier or preventing them from ever occurring has a far greater impact on improving software quality. You have probably seen charts showing the exponential increase in the cost of fixing bugs late in the product development cycle; yet despite calls to “move quality upstream,” the end of the product cycle is where many software projects focus their testing efforts. Longtime Microsoft tester Alan Page discusses how common functionality, security, and performance bugs can be prevented or detected much earlier on software projects of any size using simple scripts or tools such as a source code compiler or FxCop.
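
To give a flavor of the “simple scripts” idea, the following Python sketch scans C source for a handful of classically dangerous calls; the patterns and file layout are illustrative assumptions, not Alan’s actual tooling:

    import pathlib
    import re
    import sys

    # Classic C buffer-overflow culprits; extend the pattern to suit your codebase.
    RISKY_CALL = re.compile(r"\b(strcpy|strcat|sprintf|gets)\s*\(")

    def scan(root: str) -> int:
        """Print each risky call found under root and return the hit count."""
        hits = 0
        for path in pathlib.Path(root).rglob("*.c"):
            for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
                if RISKY_CALL.search(line):
                    print(f"{path}:{lineno}: {line.strip()}")
                    hits += 1
        return hits

    if __name__ == "__main__":
        # Exit nonzero on findings so a build server can fail the build early.
        sys.exit(1 if scan(sys.argv[1] if len(sys.argv) > 1 else ".") else 0)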

• Causes of common bugs
• Analyze source code
• Make detection techniques automatic
F5
Performance Testing
Performance Testing Web Applications with OpenSTA
Dan Downing, Mentora Group
 
OpenSTA is a solid open-source testing tool that, when used effectively, fulfills the basic needs of Web application performance testing. Dan Downing introduces you to the basics of OpenSTA including downloading and installing the tool, using the Script Modeler to record and customize performance test scripts, defining load scenarios, running tests using Commander, capturing the results using Collector, interpreting the results, and exporting captured performance data into Excel for analysis and reporting. As with many open-source tools, self-training is the rule. Support is provided not by a big vendor staff but by fellow practitioners via email. Learn how to find critical documentation that is often hidden in FAQs and discussion forum threads. If you are up to the support challenge, OpenSTA is an excellent alternative to other tools.
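
Once Collector data has been exported, even a few lines of Python can summarize it before (or instead of) opening Excel. This sketch assumes a CSV export with a response_ms column, which is an assumed layout rather than OpenSTA’s exact schema:

    import csv
    import statistics

    def summarize(csv_path: str) -> None:
        """Print basic latency statistics from an exported results CSV."""
        with open(csv_path, newline="") as f:
            times = sorted(float(row["response_ms"]) for row in csv.DictReader(f))
        if not times:
            print("no samples found")
            return
        p90 = times[int(0.9 * (len(times) - 1))]
        print(f"samples={len(times)}  mean={statistics.mean(times):.1f} ms  "
              f"median={statistics.median(times):.1f} ms  90th percentile={p90:.1f} ms")

    # summarize("opensta_export.csv")  # hypothetical export file name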

• The capabilities and limitations of OpenSTA
• How to analyze and report performance data
• Ways to detect performance bottlenecks with OpenSTA
 Friday, May 18, 2007 11:15 a.m.
F6
Test Management
The Case of a Failed Project: A Mystery Solved
John Scarborough, Aztecsoft
 
John Scarborough recounts the aftermath of a test project failure that stunned engineers and managers alike. The project was highly strategic yet very challenging. Team members were proud to be assigned to it. Early warning signs did not go unheeded. However, after the customer rejected a release of code, confidence plummeted, and the controls that the team had put in place were no longer sufficient to keep deliveries on track. The harder they worked, the more their deficiencies became apparent. Fortunately, all was not lost. Through a defined retrospective process with open and sometimes painful self-assessment, the team was able to deliver a positive case study that led to overhauling and improving the company’s processes for quality management. Take back an approach that can lead you from failure and disappointment to progress and success.

• How to establish an atmosphere of openness and candor
• Transform meaningless labels such as “failure” and “success” into explicit improvement actions
• Ways to accept uncertainty rather than striving for perfection that will never come
F7
Test Techniques
Bugs on Bugs! Hidden Testing Lessons from the Looney Tunes Gang
Robert Sabourin, AmiBug.com, Inc.
 
Robert Sabourin finds that characters from the Looney Tunes Gang—Bugs Bunny, Road Runner, Foghorn Leghorn, Porky Pig, Daffy Duck, Michigan J. Frog, and others—provide wonderful metaphors for the challenges of testing. From Bugs we learn about personas and the risks of taking the wrong turn in Albuquerque. Michigan J. Frog teaches valuable lessons about bug isolation and how ambiguous pronouns can dramatically change the meaning of our requirements. The Tasmanian Devil not only teaches us about the risks of following standard procedures but also shows us practical approaches to stress and robustness testing. And, of course, we learn about boundary conditions and challenging physics from Yosemite Sam. Bugs teaches lessons for the young at heart—novice and experienced alike. Robert shares some powerful heuristic models that you can apply right away.

• The value of modeling personas for test design
• How metaphors can help us understand and communicate
• Heuristic models are not only useful—they’re fun
F8
Outsourcing
An Outsource Model for Quality Assurance and Automated Testing
Jeff Somerville, RBC Financial Group
 
Efficiency and effectiveness are the cornerstones of a successful quality assurance and test automation effort. Jeff Somerville describes how RBC Financial Group successfully implemented a quality assurance and automation outsourcing engagement using a blended onshore/offshore approach. He details the engagement model, outlines the risks they encountered, and explains their mitigation strategy, governance structure, and the metrics used to evaluate the implementation. Learn a communication strategy and automation framework you can use to implement automation through an outsourcing partnership. Find out what setup is required before any outsourcing model can be successful: detailed requirements, a complete set of test data, and a test lab that is accessible to all. Jeff also covers the common pitfalls of offshore engagements and the three categories of outsourcing problems—people, process, and governance.

• How to implement a successful blended onshore/offshore model
• The criteria that should be evaluated before implementing this model
• Ways to measure the costs and value of outsourcing
F9
Static Testing
A Flight Plan for Testing to Keep Us Safe
Sid Snook, Software Quality Engineering
 
Just as an airplane pilot always uses a checklist when preparing for a flight, a test engineer should use a checklist when preparing for testing. Join Sid Snook, a licensed pilot, as he provides comprehensive, high-level testing guidelines, checklists, attack methods, and documentation templates. Sid presents a menu of potential testing items for you to select from based on the unique context of your testing project. Although the complete set of tools is not intended to be applicable to any given project, Sid recommends that every item be considered for applicability and rejected only for sound, technically defensible reasons. Note: Project risk may increase in some proportion to the items you do not select—and you may get lost somewhere along the way on your testing trip.
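
The “reject only with a defensible reason” rule is easy to enforce mechanically. Here is a minimal Python sketch of that discipline; the checklist items are illustrative, not Sid’s actual lists:

    checklist = [
        "boundary-value tests",
        "security attack pack",
        "install/uninstall pass",
    ]

    decisions = {}

    def decide(item: str, selected: bool, reason: str = "") -> None:
        """Record a selection, refusing rejections that lack a documented reason."""
        if not selected and not reason.strip():
            raise ValueError(f"{item!r} rejected without a documented reason")
        decisions[item] = "SELECTED" if selected else f"REJECTED: {reason}"

    decide("boundary-value tests", True)
    decide("install/uninstall pass", False, "server-side app; nothing is installed on clients")

    for item in checklist:
        print(f"{item:24} -> {decisions.get(item, 'NOT YET CONSIDERED')}")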

• The benefits and limitations that come from the use of checklists
• How the basic axioms of flying and software testing are similar
• Specific project testing checklists and templates
F10
Performance Testing
Challenges in Performance Testing of AJAX Applications
Rajendra Gokhale, Aztecsoft
 
The AJAX model for Web applications has been rapidly gaining in popularity because of its ability to bring the richness and responsiveness of desktop applications to the Web. Because one of the key drivers for the rapid adoption of AJAX is its promise of superior performance, it is surprising that there has been very little discussion of AJAX-specific performance testing. In fact, AJAX has a significant impact on aspects of the performance testing lifecycle including definition of goals, user modeling, and test scripting. Rajendra Gokhale discusses issues to consider: AJAX engine simulation and optimization, cross-client performance of AJAX applications, and design choices related to test scripting. Using Google's "Google Suggest" service as a case study, Rajendra examines the unique challenges of carrying out performance testing of AJAX-based applications and offers suggestions for overcoming them.
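
One of those scripting challenges is easy to see in miniature: a suggest-style UI issues a background request on every keystroke, so a single user action becomes a burst of HTTP traffic. This Python sketch (the URL and parameter names are illustrative, not Google’s actual interface) generates the request pattern a load script would have to reproduce:

    import time
    import urllib.parse

    def suggest_requests(query: str, keystroke_delay_s: float = 0.2):
        """Yield the URL an AJAX engine would request after each keystroke."""
        for i in range(1, len(query) + 1):
            yield "/suggest?" + urllib.parse.urlencode({"q": query[:i]})
            time.sleep(keystroke_delay_s)  # think time between keystrokes

    for url in suggest_requests("performance"):
        print(url)  # eleven requests for one search phrase vs. one for a classic form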

• How AJAX applications differ from standard Web applications
• Modeling user interactions with AJAX applications
• The need for complex test scripts to test AJAX-based applications



 
 