Software Testing Analysis & Review (STAR) Conference



Concurrent Sessions
WEDNESDAY, MAY 16

 
Topical Tracks: Real-World Experiences
The STAREAST conference program is designed to serve the needs of software testing and quality engineering managers and professionals. At STAR, you’ll learn about the latest strategies, ideas, and techniques being used by leading software organizations. STAR’s unique, real-world approach provides you with the knowledge and practical skills you need to ultimately build and deliver better software.
 
WEDNESDAY, MAY 16, 1:00 PM
W1   Test Management
A Senior Manager’s Perspective on Software QA and Testing
Paul Lupinacci, Changepoint Corporation
Quality assurance (QA) and testing are critical to the success of any software company. However, the senior management team doesn’t always understand this and needs to be educated about the world of software QA and testing. Learn how to raise the profile of QA within your organization and communicate effectively with senior management by understanding their perspective. Explore various strategies for educating and communicating with the management team.

• An understanding of senior management’s perspective on QA and testing
• How to effectively communicate with senior management
• How to make senior management an “ally” versus an “enemy”
 
W2   Test Techniques
Performance Testing 101
David Torrisi, CommerceQuest
Organizations are often so eager to “jump in” and use load testing tools that the critical steps necessary to ensure successful performance testing are sometimes overlooked—leading to testing delays and wasted effort. Learn the best practices and tips for successful automated performance testing in areas such as assembling a proper test team, planning, simulating the production environment, creating scripts, and executing load tests.

• How to assemble a proper test team
• Load test planning—how to determine objectives and pass/fail criteria
• Common issues, caveats, and “gotchas”
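The pass/fail criteria mentioned above can be made concrete with a small check against a response-time objective. The threshold, percentile, and sample data below are illustrative assumptions, not figures from the presentation:

```python
# Hypothetical load-test pass/fail check: compare the 95th-percentile
# response time against a criterion agreed on during test planning.
def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times (ms)."""
    ordered = sorted(samples)
    index = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[index]

# Response times collected during a load-test run (illustrative data).
response_times_ms = [120, 135, 150, 180, 210, 250, 300, 360, 420, 900]
criterion_ms = 500                      # pass/fail criterion from planning

p95 = percentile(response_times_ms, 95)
print("PASS" if p95 <= criterion_ms else "FAIL")  # one slow outlier -> FAIL
```

Setting the criterion before the run, rather than after inspecting results, is what keeps the verdict objective.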
 
W3   Test Measurement
Measuring the Value of Testing
Dorothy Graham, Grove Consultants
How can we make testing more visible and appreciated? Without measurement, we only have opinions. This presentation outlines simple and practical ways to measure the effectiveness and efficiency of testing, particularly the metric Defect Detection Percentage. Learn how this measure can be implemented in your organization to keep track of defects found in testing (and afterwards). Explore choices, problems, and benefits in using this measure as well as other useful measures.

• What testing can—and cannot—tell us
• Defect-based measures
• Confidence-based measures
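Defect Detection Percentage, the metric named above, is commonly defined as the share of all known defects that testing caught. A minimal sketch of the calculation (the function name and example counts are illustrative, not from the presentation):

```python
def defect_detection_percentage(found_in_testing, found_after_release):
    """DDP: percentage of all known defects that the test effort detected.

    found_in_testing    -- defects detected during testing
    found_after_release -- defects that escaped and were found afterwards
    """
    total = found_in_testing + found_after_release
    if total == 0:
        raise ValueError("no defects recorded yet")
    return 100.0 * found_in_testing / total

# Example: testing found 90 defects; 10 more surfaced after release.
print(defect_detection_percentage(90, 10))  # 90.0
```

Note that DDP can only be computed retrospectively, once post-release defects have had time to surface.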
 
W4   Test Automation
Software Test Automation: Planning and Infrastructure for Success
Bill Boehmer, Siemens Building Technologies, Inc.
Automation tools are often viewed as a cure-all to reduce test cost and effort. Without up-front planning and infrastructure design, however, these tools soon become nothing more than expensive shelfware. This presentation describes how to initiate a successful automation effort by creating standards and processes for automation. Learn how to identify and set up an automation environment in your organization.

• How to define an automation infrastructure
• How to create coding standards
• How to develop reusable automation
 
W5   Web/Ebusiness Testing
Testing in Internet Time—A Case Study
Eamonn McGuinness, Aimware
Testing before eBusiness was tough—and now it is even more difficult. This presentation gives an overview of three typical eBusiness development lifecycles that exist today (measured in hours, weeks, and months) and offers a testing lifecycle for each. Learn how one software company successfully implemented its Internet testing lifecycle and the numerically quantified benefits derived from it.

• A description of three Internet testing lifecycles
• Examples of how to implement each lifecycle
• A case study illustrating how these lifecycles work in the real Internet world
 
WEDNESDAY, MAY 16, 2:15 PM
W6   Test Management
Baby Steps—Testing Therapy for Developers
Susan Joslyn, SJ+ Systems Associates, Inc.
Learn from a “developer-in-recovery” the strategies for overcoming testing phobia and testing animosity among developers. Now a “convert” to disciplined, quality-oriented software development, Susan Joslyn provides you with approaches that are helpful in educating developers, most of whom actually want to make a better contribution to quality practices. Testers who must beg, cajole, and trick their developers into supporting testing will benefit greatly from attending this session.

• Understanding what makes testing so hard for developers
• Approaches for educating developers and testers
• How to define and explain “unit testing” to developers
 
W7   Test Techniques
The Global Challenge: Quality Assurance for Worldwide Markets
Steve Nemzer, Veritest
Many software applications are hosted in worldwide data centers, simultaneously launched with multiple language user interfaces, and continuously upgraded in rolling release cycles. Yet few software development organizations have a clear strategy for testing internationalized (I18N) products. Join presenter Steve Nemzer for an insider’s view into the fascinating cultural, technical, and linguistic challenges faced by today’s internationalization engineers.

• How leading technology firms develop and test internationalized code
• Best practices for identifying, reporting, and correcting defects
• Proven methods for localization testing
 
W8   Test Measurement
Metrics Collection and Analysis for Web Sites
Joe Polvino, Element K
With the surge in Web-based software solutions, the need for accurate measurement is essential for success. This presentation describes how one organization realized a need for metrics, determined which metrics to collect, and wrote a metric collection/reporting solution to measure their product. Explore sample charts, summaries, and a live presentation of the Excel-based metrics collection tool used by Element K to illustrate “what if” scenarios and understand trend analysis.

• How to understand which metrics provide the best value in the online world
• How to collect and report your metrics findings
• Understanding trends in metrics and determining possible root causes
 
W9   Test Automation
Standards for Test Automation—A Case Study
Brian Tervo, Microsoft Corporation
Implementing a set of automation standards adopted and followed by the test team will benefit everyone. This presentation discusses methods of creating and implementing standards, guidelines, and practices for teams of testers writing automated tests. Learn about decisions that can be made early in the product cycle that will have a long-term impact. Explore examples of systems that have worked well—and those that have not.

• Benefits of having test automation standards
• Methods to develop and implement standards adopted by your test team
• Ideas and standards that did—and did not—work well for one Microsoft team
 
W10   Web/Ebusiness Testing
Designing Test Strategies for eBusiness Applications
Beverly Kopelic, Amberton Group, Ltd.
Identifying the failure points in complex eBusiness systems is becoming increasingly difficult. These systems may integrate business-to-business components, support e-commerce, and facilitate the delivery of electronic content. Learn how to evaluate the hardware, communications, and software architectures to design a successful test strategy to validate functional and structural requirements.

• How to identify potential failure points in technical and application architectures
• How to structure testing to validate content, functionality, and infrastructure
• How to design a test strategy to support your project goals
 
WEDNESDAY, MAY 16, 3:15 PM
W11   Test Management
A New Paradigm for Testing and Quality Assurance—The Internal Service Bureau
Robert Tashbook, Responsys.Com
In the area of Web testing—with its short launch schedules and frequent changes—quality assurance (QA) needs to change from its original methodical (full coverage) mode and instead act as a service bureau for product marketing by providing directed and quantitative answers to specific questions. Discover how this novel QA approach allows you to turn an adversarial relationship with marketing into a friendly one.

• The difference between traditional software testing and fast-paced Web testing
• Ways to perform high-level Web testing on a limited (or nonexistent) tools budget
• How to create useful results with very limited resources
 
W12   Test Techniques
Targeted Software Fault Insertion
Paul Houlihan, Mangosoft Corporation
This presentation makes a compelling case for the use of targeted software fault insertion in testing. Paul Houlihan presents data on the effectiveness of this technique. Learn the advantages and risks of software fault insertion and receive tips on gaining cultural acceptance within your software organization.

• Benefits and risks of targeted software fault insertion
• How to develop a process for identifying and creating faults
• High payback areas to target with software fault insertion
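One common way to realize fault insertion in practice is to substitute a deliberately failing dependency at a targeted call site and verify the caller recovers. A minimal sketch; the function names and the failure scenario are hypothetical, not taken from the presentation:

```python
# Illustrative targeted fault insertion: make one dependency fail on
# demand, then confirm the code under test degrades gracefully.
def read_config(path):
    """Real dependency: reads a configuration file from disk."""
    with open(path) as f:
        return f.read()

def load_settings(reader=read_config):
    """Code under test: must survive a failing reader dependency."""
    try:
        return reader("settings.cfg")
    except OSError:
        return "<defaults>"        # graceful-degradation path

def failing_reader(path):
    """Injected fault: simulates the disk becoming unavailable."""
    raise OSError("injected fault: disk unavailable")

# Insert the fault only at the targeted call site.
print(load_settings(reader=failing_reader))  # <defaults>
```

Targeting the insertion point this way exercises error-handling paths that rarely fire under normal test loads.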
 
W13   Test Measurement
Using Commonly Captured Data to Improve Testing Processes
Dean Lapp, Minitab Inc.
Dean Lapp provides you with ideas on how to improve testing practices by using data that is commonly collected during the software lifecycle. Whether you are a shrink-wrap organization working on multiple versions of the same product or a test organization attempting to become more data-driven in your process improvement attempts, learn how to use this data over a long time period to monitor and improve your test effectiveness.

• The value of three simple databases (defect tracking, customer calls, and test logging)
• How to use existing data to improve your testing processes
• How to combine, eliminate, and prioritize your existing test material
 
W14   Test Automation
Automated Test Results Processing
Edward Smith, Mangosoft Corporation
The success of test automation often rests on the ability to perform continuous, non-stop testing with the results distilled into problem reports. Consistency in problem reporting is key to being able to distinguish new problems and to providing the details engineering needs to isolate a defect. Discover how automating this process is a key step in developing an effective and efficient test automation strategy.

• How to structure clear, concise problem reports
• How to construct an automated log parser for extracting failure reasons and supporting details
• How to use Microsoft’s latest debugging tools to automate results processing
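The log-parsing step above can be sketched with a few lines of pattern matching that distill raw test output into problem-report entries. The log format, field names, and sample content here are assumptions for illustration, not the presentation's actual tooling:

```python
# Minimal automated log parser: extract failing tests and their
# failure reasons from a run log for inclusion in a problem report.
import re

LOG = """\
2001-05-16 10:02:11 PASS test_login
2001-05-16 10:02:45 FAIL test_checkout reason="timeout waiting for cart"
2001-05-16 10:03:02 FAIL test_search reason="unexpected HTTP 500"
"""

FAIL_RE = re.compile(r'FAIL (\S+) reason="([^"]*)"')

def extract_failures(log_text):
    """Distill raw log lines into (test, reason) pairs."""
    return FAIL_RE.findall(log_text)

for test, reason in extract_failures(LOG):
    print(f"{test}: {reason}")
```

Because every failure is reduced to the same (test, reason) shape, new failures stand out when reports are compared run over run.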
 
W15   Web/Ebusiness Testing
Configuration Management for Testers Working in Web Development Environments
Andrea Macintosh, QA Labs Inc.
Configuration management has long been a staple activity for large, traditional software engineering projects but has been markedly absent from most Web development projects. This presentation gives a brief overview of configuration management from a tester’s perspective. Learn of the costs, drawbacks, and benefits of configuration management. Discover quick and simple ways your testing staff can add configuration management to your Web development environment.

• An introduction to configuration management from a tester’s perspective
• An overview of the four key areas of configuration management: identification, documentation, control, and audit
• Quick and simple ways your testing staff can add configuration management to your Web development environment
 
 
   



A Software Quality Engineering Production

Software Quality Engineering
Copyright 2001 Software Quality Engineering