Better Software Conference 2006 Conference Proceedings 

Better Software Conference & EXPO 2006 Concurrent Sessions

Go To:  Agile Development  |  Managing Projects and Teams  |  Measurement  |  Outsourcing  |  Plan-Driven Development  |  
Process Improvement  |  Quality Assurance  |  Security  |  Special Topics  |  System Requirements  |  Testing


Measurement
T4
Thursday, June 29, 2006 9:45 AM
Code Coverage Myths and Realities
Andrew Glover, Vanward Technologies

You've made a commitment to automate unit testing as part of your development process, or you are spending precious resources on automated functional testing at higher levels. You may be asking yourself: How good are those tests anyway? Are many tests checking the same thing while large parts of the code go completely untested? Are your tests triggering the exceptions that normally show up only in production? Are your automated tests adequately covering the code, the requirements, both, or neither? Andrew discusses the truths and untruths about code coverage and looks at the tools available to gather and report coverage metrics in both the open source and commercial worlds. He describes the different types of code coverage, their advantages and disadvantages, and how to interpret the results of coverage reports.

• The concept of mutation testing and how it fits into a code coverage strategy
• How the different types of code coverage (branch, line coverage, etc.) can mislead (see the sketch after this list)
• Actionable items obtained from coverage results
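
To make the point about misleading coverage numbers concrete, here is a minimal, hypothetical Java/JUnit sketch (not taken from the session; the class and method names are illustrative): a single test executes every line of a small method, so a line-coverage tool reports 100 percent, yet one branch is never taken.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class CoverageMythTest {

        // Every line of this method is executed by the single test below...
        static int max(int a, int b) {
            int result = a;
            if (b > a) {        // ...but only the "true" outcome of this condition is ever taken
                result = b;
            }
            return result;
        }

        @Test
        public void coversEveryLineButNotEveryBranch() {
            // Executes the assignment, the if body, and the return statement,
            // so a line-coverage tool reports 100% for max().
            assertEquals(2, max(1, 2));
            // A branch-coverage tool would flag that the false branch of (b > a)
            // -- e.g., max(2, 1) -- is never exercised.
        }
    }

A mutation testing tool would surface the same gap differently: mutating the condition to b >= a, for example, would not be caught by this single test.
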
T10
Thursday, June 29, 2006 11:15 AM
A Metrics Dashboard to Drive Goal Achievement
Wenje Lai, Cisco Systems, Inc.

Some measurement programs with high aims fall short, languish, and eventually fail completely because few people regularly use the resulting metrics. Based on Cisco Systems' five years of experience in establishing an annual quality program employing a metrics dashboard, Wenje Lai describes their successes and challenges and demonstrates the dashboard in use today. He shows how the metrics dashboard offers an easy-to-access mechanism for individuals and organizations within Cisco Systems to understand the gap between their current standing and their goals. A mechanism within the dashboard allows users to drill down and see the data making up each measurement, in order to identify ownership of issues, root causes, and possible solutions. Learn what programs they implemented to ensure that people use the metrics dashboard to help them in their day-to-day operations.

• How to build an effective metrics dashboard to help achieve quality goals (a minimal data-model sketch follows this list)
• Demonstration of a mature metrics dashboard
• Ways to achieve buy-in from management and engineers for using a dashboard
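
As a rough, hypothetical sketch of the dashboard idea described above (this is not Cisco's implementation; the class and field names are assumptions for illustration), each metric can carry its goal, its current value, and the raw records behind the number, so a user can see the gap and drill down to owners and root causes.

    import java.util.List;

    /** Hypothetical sketch: a dashboard metric that knows its goal and its raw data. */
    public class DashboardMetric {

        /** One row of raw data behind the metric; the fields are illustrative. */
        public static class DefectRecord {
            final String id;
            final String owner;
            final boolean open;
            DefectRecord(String id, String owner, boolean open) {
                this.id = id;
                this.owner = owner;
                this.open = open;
            }
        }

        private final String name;
        private final int goal;                    // e.g., "no more than N open defects"
        private final List<DefectRecord> records;  // the data the number is made of

        public DashboardMetric(String name, int goal, List<DefectRecord> records) {
            this.name = name;
            this.goal = goal;
            this.records = records;
        }

        /** Current standing: the count of open defects. */
        public int currentValue() {
            int open = 0;
            for (DefectRecord r : records) {
                if (r.open) {
                    open++;
                }
            }
            return open;
        }

        /** Gap between current standing and the goal; positive means work remains. */
        public int gapToGoal() {
            return currentValue() - goal;
        }

        /** Drill down: the records behind the number, where owners and root causes live. */
        public List<DefectRecord> drillDown() {
            return records;
        }
    }

A dashboard view would then render the gap for each metric and link each number to its drill-down records.
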
T16
Thursday, June 29, 2006 1:30 PM
Industry Benchmarks: Insights and Pitfalls
Jim Brosseau, Clarrus Consulting Group, Inc.

Software and technology managers often quote industry benchmarks such as The Standish Group's CHAOS report on software project failures; other organizations use this data to judge their internal operations. Although these external benchmarks can provide insights into your company's software development performance, you need to balance the picture with internal information to make an objective evaluation. Jim Brosseau takes a deeper look at common benchmarks, including the CHAOS report, published SEI benchmark data, and more. He describes the pros and cons of these commonly used industry benchmarks and offers key insights into often-quoted statistics. Take away an approach that Jim has used successfully with companies to help them understand the relationship between the demographics, practices, and performance in their groups and how these relate to external benchmarks.

• Strengths and weaknesses of commonly used benchmarks
• An approach that combines internal company data with industry measurements (a small illustration follows this list)
• Gaining actionable insights based on the unique set of data from your organization
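
As a small, hypothetical illustration of pairing internal data with an external figure (this is not the approach presented in the session, and the numbers below are placeholders, not real benchmark results):

    /**
     * Hypothetical illustration: quote an internal figure next to the external
     * benchmark instead of quoting the benchmark alone. The values are placeholders.
     */
    public class BenchmarkComparison {
        public static void main(String[] args) {
            // Substitute your organization's data and the figure from the report you cite.
            double internalOnTimeRate = 0.62;   // fraction of internal projects delivered on schedule
            double benchmarkOnTimeRate = 0.50;  // figure quoted from an external benchmark

            double deltaPoints = (internalOnTimeRate - benchmarkOnTimeRate) * 100;
            System.out.printf("Internal: %.0f%%  Benchmark: %.0f%%  Difference: %+.1f points%n",
                    internalOnTimeRate * 100, benchmarkOnTimeRate * 100, deltaPoints);
        }
    }

The delta only means something if the internal measure and the benchmark figure were defined, sampled, and counted the same way, which is the sort of caveat the "balance the picture" advice above points to.
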
T22
Thursday, June 29, 2006 3:00 PM
Software Metrics to Improve Release Management
Nirmala Ramarathnam, The MathWorks, Inc.

In large organizations with multiple groups or multiple projects, developing consistent and useful metrics for release management is highly challenging. However, when targeted at specific release goals, metrics can help monitor the development schedule and provide both managers and developers with the data needed to improve quality. With nearly eighty products that must be released on the same date, The MathWorks has developed a release metrics program with a consistent method to categorize and prioritize bugs based on severity and frequency. Learn how they track progress toward bug-fix targets for each category of bugs and monitor them consistently across their product line throughout the release cycle. See examples of metrics reports designed for management and for daily use by teams, including historical trend analysis of overall and customer-reported bug counts.

• How to set up metrics aligned with release quality and schedule goals
• A method to consistently categorize and prioritize bugs across multiple products (see the sketch after this list)
• The different release metrics for the product team and management
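
As a hypothetical sketch of that categorization idea (not The MathWorks' implementation; the categories, thresholds, and names are assumptions for illustration), bugs can be bucketed by severity and frequency, with open counts per bucket compared against per-category fix targets.

    import java.util.EnumMap;
    import java.util.List;
    import java.util.Map;

    /** Hypothetical sketch: categorize bugs by severity and frequency, then count open bugs per category. */
    public class ReleaseBugTracker {

        public enum Severity { CRITICAL, MAJOR, MINOR }

        /** A reported bug with the two attributes used for prioritization. */
        public static class Bug {
            final Severity severity;
            final int customerReports;  // crude stand-in for "frequency"
            final boolean fixed;
            Bug(Severity severity, int customerReports, boolean fixed) {
                this.severity = severity;
                this.customerReports = customerReports;
                this.fixed = fixed;
            }
        }

        /** Priority category derived from severity and frequency. */
        public enum Category { MUST_FIX, SHOULD_FIX, DEFER }

        static Category categorize(Bug bug) {
            if (bug.severity == Severity.CRITICAL || bug.customerReports >= 5) {
                return Category.MUST_FIX;
            }
            if (bug.severity == Severity.MAJOR) {
                return Category.SHOULD_FIX;
            }
            return Category.DEFER;
        }

        /** Count open (unfixed) bugs per category so progress toward fix targets can be tracked. */
        static Map<Category, Integer> openCountsByCategory(List<Bug> bugs) {
            Map<Category, Integer> counts = new EnumMap<>(Category.class);
            for (Category c : Category.values()) {
                counts.put(c, 0);
            }
            for (Bug b : bugs) {
                if (!b.fixed) {
                    Category c = categorize(b);
                    counts.put(c, counts.get(c) + 1);
                }
            }
            return counts;
        }
    }

A release report could then compare each category's open count against its target for the current milestone and trend those counts over the release cycle.
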



Better Software Conference & EXPO 2006 is a Software Quality Engineering Production

© 2006 Software Quality Engineering. All rights reserved.