Software Testing Analysis & Review (STAR) Conference



Concurrent Sessions
FRIDAY, NOV. 2

 
Topical Tracks–Real-World Experiences
The STARWEST conference program is designed to serve the needs of software testing and quality engineering managers and professionals. At STAR, you’ll learn about the latest strategies, ideas, and techniques being used by leading software organizations. STAR’s unique, real-world approach provides you with the knowledge and practical skills you need to ultimately build and deliver better software.
 
FRIDAY, NOV. 2, 10:15 AM
F1   Test Management
Creating Quality From Scratch: How to Build a Testing Organization
Eric Patel, Nokia Home Communications
With more and more companies realizing the need for testing throughout the product development process, there’s a growing demand for bigger, better QA teams. If you find yourself the first member of a newly formed department, it’s likely you’ll be asked to do the testing and build the team simultaneously. Eric Patel presents a strategy for meeting day-to-day testing challenges while planning for the department’s future, showing you how to lay the foundation and build the house at the same time.

• How to manage the effects of “solitary confinement”
• Ways to balance present testing demands and future challenges
• Effective time management and test planning techniques
 
F2   Test Techniques
Data in Functional Testing — You Can’t Live Without It
James Lyndsay, Workroom Productions
Good data is a sure-fire way to improve functional testing. Test data can be structured to enhance understanding and testability. Its contents, correctly chosen, can reduce maintenance effort and allow flexibility. Plus, preparation of the data can focus the business where requirements are vague. The three kinds of test data are environmental, setup, and input. This presentation deals with how to recognize these types of data and the common problems associated with them during pre-test, test, and go-live.

• Advantages and pitfalls of various methods to load data
• Common problems and possible solutions for data maintenance
• Naming conventions for more accurate and easier-to-interpret data
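The session’s three kinds of test data can be sketched in a few lines. This is a minimal, hypothetical illustration (the account table, field names, and values are invented), not an example from the presentation:

```python
# Environmental data: describes the world the test runs in.
ENVIRONMENT = {"db_url": "sqlite:///:memory:", "locale": "en_US"}

# Setup data: state the system needs before the test runs.
SETUP_ROWS = [
    {"id": 1, "name": "existing customer", "balance": 100},
]

# Input data: the values the test actually exercises, paired with
# expected results.
INPUT_CASES = [
    ({"id": 1, "amount": 25}, 75),
    ({"id": 1, "amount": 100}, 0),
]

def apply_withdrawal(rows, request):
    """Toy system under test: withdraw an amount from an account."""
    accounts = {r["id"]: dict(r) for r in rows}  # fresh copy per test
    acct = accounts[request["id"]]
    acct["balance"] -= request["amount"]
    return acct["balance"]

for request, expected in INPUT_CASES:
    assert apply_withdrawal(SETUP_ROWS, request) == expected
print("all input cases passed")
```

Keeping the three kinds separate is what lets the same input cases run unchanged against a different environment or setup.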
 
F3   Test Automation
Test-Result Checking Patterns
Keith Stobie, Microsoft
This presentation examines how a test case detects a product failure, and the design tradeoffs that determine how well it does so. These tradeoffs include the size and composition of the test case logic, the characteristics of the test data used, and when comparisons should be done. It also presents several test-result checking patterns.

• Learn new methods for checking test results
• How result checking impacts test design
• Determine which results to keep together
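Two result-checking patterns of the kind this session discusses can be contrasted in a short sketch. This is an invented illustration of the general tradeoff, not material from the talk:

```python
def golden_check(actual, expected):
    """Exact comparison against a stored known-good ("golden") result:
    precise, but brittle when expected output legitimately changes."""
    return actual == expected

def property_check(actual):
    """Check an invariant instead of exact values (here: output is
    sorted): robust to change, but can miss subtle errors."""
    return all(a <= b for a, b in zip(actual, actual[1:]))

result = sorted([3, 1, 2])
assert golden_check(result, [1, 2, 3])
assert property_check(result)
print("both checks passed")
```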
 
F4   Test Measurement
Revealing the Mysteries of Test Measurement
David Hutcheson, Glen Abbot Ltd.
Contrary to popular belief, test measurement is not a mysterious art. Rather, it’s a vital part of test management. This presentation will help those new to test measurement, as well as those who have been intimidated by its complexities in the past, discover that test measurement can and should be a part of your test plan. Learn metrics and techniques that are easy to understand and implement.

• How to devise test preparation measures
• Measurement of test progress — planned, attempted, successful
• Measurement of environmental problems
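The planned/attempted/successful progress measures mentioned above reduce to simple ratios. A minimal sketch, with the counts invented for illustration:

```python
# Invented example counts for a test cycle.
planned, attempted, successful = 120, 90, 81

progress = attempted / planned        # how far through execution we are
pass_rate = successful / attempted    # quality signal among executed tests

print(f"progress: {progress:.0%}, pass rate: {pass_rate:.0%}")
```

Tracking the two numbers separately matters: a high pass rate early in a cycle says little if progress is still low.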
 
F5   Advanced Topics
Concise, Standardized, Organized Testing in Complex Test Environments
Gerhard Strobel, IBM Germany
There’s a need for standardized, organized hardware and software infrastructure, and for a common framework, in a complex test environment. Gerhard Strobel focuses on the experience of testing diverse products on many different platforms (UNIX, Windows, OS/2, z/OS, OS/400), examining how they differ and how much they have in common. He explains how to configure and profile test machines, then highlights the technical areas where test efficiency can be increased. He also covers methods of execution control.

• Configure and set up test machines to test across different platforms
• Create granular, self-contained, portable automated test suites
• Develop equivalent tests and test environments for different products and product levels
 
FRIDAY, NOV. 2, 11:15 AM
F6   Test Management
How to Find the Level of Quality Your Sponsor Wants
Sue Bartlett, Step Technology
The level of quality a product will attain always comes down to a business decision. While testers understand how to ensure a top-notch product, the customer sponsoring the development may balk once time and labor start translating into dollars. Sue Bartlett leads this session designed to help you recognize customer needs — whether stated or implied — and turn them into an effective software testing process. Explore requirements gathering techniques, test efficiency, communication skills, and persuasion tactics.

• Do you have “quality” goals?
• How to turn fuzzy demands into concrete test cases
• A methodical approach to prioritizing testing around quality goals
 
F7   Test Techniques
Introduction to Usability Testing
Cheryl L. Nesta, Vanteon
What is usability? Why is it important? If these questions wake you in the middle of the night, then this presentation is for you. Cheryl Nesta discusses the relevance of usability testing within the broad framework of quality assurance and appropriate expectations based on its uses and applicability. Explore methodology, process flow, goal identification, and definition. Real-world examples create a hands-on introductory experience.

• A clear definition of usability
• How to write and set goals for usability testing
• Where to obtain usability reference material
 
F8   Test Automation
Evolution of Automated Testing for Enterprise Systems
Cherie Coles, BNSF Railroad
The key to accelerating test automation in any project is for a well-rounded, cohesive team to emerge that can marry its business knowledge with its technical expertise. This session is an in-depth case study of the evolution of automated testing at the BNSF Railroad. From record-and-playback to database-driven robust test scripts, this session will take you through each step of the $24 billion corporation’s efforts to implement test automation.

• Discover your business driver, then use it to appropriately form your vision and process
• How to cultivate a successful partnership with the application development team and the customer
• Return on test automation investment: increased productivity and higher quality testing
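The shift from record-and-playback to database-driven scripts means one generic script reads its cases from a data table instead of hard-coding recorded steps. A hedged sketch of that idea (the table, cases, and system under test are all invented):

```python
import sqlite3

# Test cases live in a database table, not in the script.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (a INTEGER, b INTEGER, expected INTEGER)")
conn.executemany("INSERT INTO cases VALUES (?, ?, ?)",
                 [(1, 2, 3), (10, -4, 6), (0, 0, 0)])

def system_under_test(a, b):
    return a + b  # stand-in for the real application behavior

# One generic driver executes every row; adding a case means adding
# a row, not editing a recorded script.
failures = [(a, b) for a, b, expected in conn.execute("SELECT * FROM cases")
            if system_under_test(a, b) != expected]
print(f"{3 - len(failures)} of 3 cases passed")
```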
 
F9   Test Measurement
Managing the Test Effort Using Requirements-Based Testing Metrics
Gary Mogyorodi, Bloodworth Integrated Technology Inc.
It’s difficult to quantify the true state of a test effort. Often, it’s measured by quantity of work combined with deadline compliance. But if this is the case, then the true level of quality remains unknown. The Requirements-Based Testing (RBT) process offers a set of metrics that can be utilized throughout the development cycle. These metrics can provide an accurate picture of the test effort at any given time.

• The derivations of requirements-based test metrics and their impact on the software development process
• How requirements-based test metrics reduce the risk of delivering untested code
• Techniques for writing testable requirements
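One requirements-based metric of the kind RBT provides is the share of requirements covered by at least one passing test. A minimal, illustrative sketch (the requirement IDs and test records are invented, and real RBT defines a fuller metric set):

```python
requirements = {"R1", "R2", "R3", "R4"}

# Each test records which requirements it covers and whether it passed.
tests = [
    {"covers": {"R1"}, "passed": True},
    {"covers": {"R2", "R3"}, "passed": True},
    {"covers": {"R4"}, "passed": False},
]

covered = set().union(*(t["covers"] for t in tests if t["passed"]))
coverage = len(covered & requirements) / len(requirements)
print(f"requirements covered by passing tests: {coverage:.0%}")
```

Unlike raw test counts, this number also exposes the untested gap: any requirement outside `covered` represents undelivered verification.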
 
F10   Advanced Topics
A Framework for Testing Real-Time and Embedded Systems
Alan Haffenden, The Open Group
What do we mean when we say local, remote, simultaneous, and distributed testing? Alan Haffenden of The Open Group explores the differences, and explains why the architecture of a distributed test execution system must be different from that of non-distributed systems. An overview of POSIX 1003.13 profiles and units of functionality helps advanced users build a good foundation for testing both their real-time and embedded systems.

• The differences between remote, local, and distributed testing
• The architecture of a distributed test execution framework
• How to manage real-time and embedded systems testing with a test execution management system
 



A Software Quality Engineering Production

Software Quality Engineering
Copyright 2001 Software Quality Engineering