STARWEST 2008 Concurrent Sessions


Concurrent Sessions for Thursday, October 2 — 9:45 a.m.
T1
The Marine Corps Principles of Leadership for Testers
Rick Craig, Software Quality Engineering
 
You can have the best tools and processes in the world, but if your staff is not motivated and productive, your testing effort will be, at best, inefficient. Good test managers must also be good leaders. Retired Marine Colonel Rick Craig describes how using the Marine Corps Principles of Leadership can help you become a better leader and, as a result, a better test manager. Learn the difference between leadership and management and why they complement each other. Join in the discussion and share ideas that have helped you motivate your testers (and those that didn’t). Also, share your thoughts on the characteristics associated with leaders and whether you believe that “leaders are made” or “leaders are born.” Rick discusses motivation, morale, training, span of control, immersion time, and promoting the testing discipline within your organization. He also addresses the importance of influence leaders and how they can be used as agents of change.

Learn more about Rick Craig

T2
Patterns and Practices for Model-Based Testing
Keith Stobie, Microsoft
 
To apply model-based testing (MBT) to many different applications, simply learning the high-level principles is not enough. You need extra guidance and practice to help orient testers and developers to begin using models for testing. Many people attempting MBT, confused about how to program for observation and control, end up duplicating the underlying system’s functionality in their models. Keith Stobie presents real-world MBT case studies to illustrate ideas you can incorporate into your own practices. Learn to apply MBT patterns and practices to both traditional and model-based test design. See examples of abstraction and how abstractions can help testers with any test suite—model-based or not. Learn to create adapters that act as a specialized programming language—similar to keyword-based testing—for the abstractions of your domain under test. Detect under-testing and over-testing by creating a logging framework that uses assertions to trace tests back to requirements. With MBT patterns and practices, you can do MBT—More Better Testing!
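
A minimal sketch of the adapter and assertion-logging ideas (not Keith’s actual case studies): the Python fragment below models a bounded stack as a simple state machine, drives a stand-in system under test through a thin adapter, and logs each assertion against a hypothetical requirement ID (REQ-STACK-01) so tests can be traced back to requirements.

```python
import logging
import random

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("mbt")

class StackModel:
    """Abstract model: tracks only the size of a bounded stack."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.size = 0
    def actions(self):
        acts = []
        if self.size < self.capacity:
            acts.append("push")
        if self.size > 0:
            acts.append("pop")
        return acts
    def apply(self, action):
        self.size += 1 if action == "push" else -1

class StackAdapter:
    """Adapter: maps abstract model actions onto the real system under
    test (here a plain Python list standing in for the SUT)."""
    def __init__(self):
        self.sut = []
    def push(self):
        self.sut.append(object())
    def pop(self):
        self.sut.pop()
    def observed_size(self):
        return len(self.sut)

model, adapter = StackModel(capacity=3), StackAdapter()
random.seed(42)
for step in range(20):
    action = random.choice(model.actions())  # model chooses a legal action
    getattr(adapter, action)()               # adapter drives the SUT
    model.apply(action)                      # model advances in lockstep
    # The assertion doubles as a traceability record for a hypothetical
    # requirement ID, so the log shows which requirement each step exercised.
    assert adapter.observed_size() == model.size, "REQ-STACK-01 violated"
    log.info("step %2d: %-4s -> size %d [REQ-STACK-01 ok]", step, action, model.size)
```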

Learn more about Keith Stobie

T3
End-to-End Test Automation for Complex Systems
Thomas Thunell, Ericsson AB
 
As a world-leading provider of telecommunications equipment, Ericsson knows that test automation is a key factor for driving a successful test organization. Thomas Thunell describes their automation solution—a test system for complex, end-to-end environments. Ericsson’s networks typically consist of mobile terminals, base stations, radio network controllers, switching systems, protocol analyzers, and possibly other components. Thomas discusses the lessons Ericsson has learned—obtain management commitment up front, use dedicated automation teams, and take the long-term view in automation work. When it came to planning, establishing guidelines, and getting the right people on board, Ericsson treated test automation exactly the same as any other software development project. In so doing, they built—and depend on—a rock-solid, easy-to-use, reliable test automation framework. Future plans include automated post-processing of test logs and delivering test automation metrics directly from the system. Find out how Ericsson does test automation and how you can follow their path.

Learn more about Thomas Thunell

T4
Testing AJAX Applications with Open Source Tools
Frank Cohen, PushToTest
 
AJAX testers and developers face serious challenges developing unit tests, functional tests, and load/performance tests at a time when AJAX and other Web development technologies continue to expand. Frank Cohen explains a proven methodology to identify—and solve—scalability, performance, and reliability issues in AJAX applications. He shows how to apply this methodology using open source testing tools, including Selenium, soapUI, TestGen4Web, PushToTest, and others. He demonstrates hands-on testing examples created with the Appcelerator and Google Web Toolkit (GWT) frameworks. You’ll also see how to construct a functional unit test for a business flow, identify ways to create operational test data at run time, validate test responses, and automate the entire test. Learn to use Firebug and Firefox to identify and instrument AJAX user interface elements. Add these new tools and methods to your toolkit for a better AJAX testing experience.
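
A rough sketch of such a functional test, using today’s Selenium Python bindings rather than the 2008-era Selenium RC; the URL and element locators are hypothetical placeholders, not from Frank’s demos.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Firefox()
try:
    driver.get("http://example.com/orders")              # hypothetical app
    driver.find_element(By.ID, "customer").send_keys("ACME")
    driver.find_element(By.ID, "search").click()
    # AJAX-aware wait: block until the asynchronous update renders,
    # instead of sleeping for a fixed interval.
    row = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.CSS_SELECTOR, "#results tr"))
    )
    assert "ACME" in row.text                            # validate the response
finally:
    driver.quit()
```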

Learn more about Frank Cohen

T5
Man and Machine: Combining Tools with the Human Mind
Jonathan Kohl, Kohl Concepts, Inc.
 
When you think of automated testing, you usually think of computer software executing unattended tests. When you think of manual testing, you think of a human being executing tests without the aid of software. Instead of thinking of tests as either automated or manual, Jonathan Kohl explores ways you can blend the two. He helps you answer the questions, “How can automation improve my exploratory and scripted testing work?” and “What do we lose if we run these tests without any human supervision?” With numerous examples, Jonathan demonstrates the different mindset he uses to implement test automation and highlights techniques from a hybrid testing approach. He draws on his personal testing experiences and on other disciplines to change how you think about man and machine testing.

Learn more about Jonathan Kohl

Concurrent Sessions for Thursday, October 2 — 11:15 a.m.
T6
Calculate the Value of Testing: It’s Not Just About Cost
Leo van der Aalst, Sogeti Netherlands BV
 
It seems that senior management is always complaining that testing costs too much. And their opinion is accurate if they consider only the costs—and not the benefits—of testing. What if you could show management how much you have saved the organization by finding defects during testing? The most expensive defects are ones not found during testing—defects that ultimately get delivered to the user. Their consequential damages and repair costs can far exceed the cost of finding them before deploying a system. Instead of focusing only on the cost of testing, Leo van der Aalst shows you how to determine the real value that testing adds to the project. He shares a model that he has used to calculate the losses testing prevents—losses that did not occur because testing found the error before the application was put into production. Leo explains the new testing math: Loss Prevented – Cost of Testing = Added Value of Testing.
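
A quick worked example of that math, with invented figures purely for illustration:

```python
# Hypothetical figures, purely illustrative of Leo's formula.
defects_found = 120            # defects caught before production
avg_production_cost = 10_000   # estimated damage + repair per escaped defect
cost_of_testing = 400_000      # total test effort for the release

loss_prevented = defects_found * avg_production_cost    # 1,200,000
added_value = loss_prevented - cost_of_testing          # 800,000
print(f"Added value of testing: {added_value:,}")       # Added value of testing: 800,000
```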

Learn more about Leo van der Aalst

T7
The Savvy Web Tester’s Tool Kit
Erik Petersen, emprove
 
Did you know that you can get many free—or nearly free—tools to supercharge your Web testing efforts? Amazingly, at the click of a button, you can download some very advanced capabilities to make you seem like a testing genius. With a focus on Web application tools, Erik Petersen looks at tools that can help all testers. Erik examines mind mapping and how you can use mind maps for schedules, strategies, even tests themselves. He demonstrates several tools for managing tests and others to help you look “under the hood” and manipulate Web applications. Join Erik to learn some innovative ways to test your Web applications; build test data to include dummy people with realistic addresses; capture what you’ve done; and view, tweak, and break the software. You’ll also see “portable applications,” versions of tools that run off a memory stick on any PC without being installed.

Learn more about Erik Petersen

T8
Demystifying Virtual Test Lab Management
Ian Knox, Skytap, Inc.
 
The benefits of a virtualized test lab environment are compelling and quantifiable—rapid provisioning and tear down of environments, faster test cycles, and powerful new capabilities to resolve defects. Although many test teams have experimented with virtual machines and have experienced some of the benefits, they’ve also discovered issues with virtual machine “sprawl,” difficulties administering the lab, and lack of virtual private networking. Ian Knox provides solutions to these problems and offers ways to simplify both using and managing virtualization in your test environment. Ian describes the basics of virtualization and how you can use virtual labs to solve some of the most pressing and expensive challenges in testing. He guides you through the important implementation choices for building a virtual lab and explores the common pitfalls with real-life case studies. Take back an understanding of a virtual lab’s capabilities and limitations and learn how to automate your lab with tools and build integration.

Learn more about Ian Knox

T9
Building an SOA Quality Center of Excellence
Rajeev Gupta, iTKO LISA
 
Before we can realize the promises of technical agility and reuse from a distributed, service-oriented architecture (SOA), we must first establish trust among stakeholders that SOA will meet business requirements. Rajeev Gupta believes that the best way to instill this sense of trust and make SOA adoption possible is through a shared center of excellence focused on SOA quality. Both service providers and businesses consuming services must be confident that services and the underlying implementation and data layers behind them reliably meet business goals, even as they change and evolve over time. An SOA Quality Center of Excellence demonstrates that quality is everyone’s duty—not just the testing team’s responsibility. Learn the four key activities that the SOA Quality Center of Excellence must manage: structural certification, behavioral validation, performance testing, and service virtualization of test environments. If all stakeholders work together to ensure quality with a continuing focus, SOA can and will succeed in your organization.

Learn more about Rajeev Gupta

T10
Beyond Functional Testing: On to Conformance and Interoperability
Derk-Jan De Grood, Collis
 
Although less well known than security and usability testing, conformance and interoperability testing are just as important. Even though conformance and interoperability testing—all about standards and thick technical specifications documents—may seem dull, Derk-Jan De Grood believes that these testing objectives can be interesting and rewarding if you approach them the right way. SOA is one example in which numerous services must interact correctly with one another—conform to specs—to implement a system. Conformance and interoperability testing ensures that vendors’ scanners can read your badge at the EXPO and that your bank card works in a foreign ATM. Derk-Jan explains important concepts of interface standards and specifications and discusses the varied test environments you need for this type of testing. Get insight into the problems you must overcome when you perform conformance and interoperability testing.

Learn more about Derk-Jan De Grood

Concurrent Sessions for Thursday, October 2 — 1:30 p.m.
T11
Managing Your Personal Stress Level
Randall Rice, Rice Consulting Services, Inc.
 
In a recent survey of 130 U.S. software testers and test managers, Randall Rice learned that 83 percent of the respondents have experienced burnout, 53 percent have experienced depression of some type, and 97 percent have experienced high levels of stress at some time during their software testing careers. Randall details the sources of these problems and the most common ways to deal with them—some healthy, some not. There are positive things testers and managers can do to reduce and relieve their stress without compromising team effectiveness. By understanding the proper role of testing inside your organization and building a personal support system, you can manage stress and avoid its destructive consequences. Randall identifies the stress factors you can personally alleviate and helps you deal with those stressors you can't change. Avoid burnout and don’t be taken down by unreasonable management expectations, negative attitudes of other people, unexpected changes, and other stressors in your work.

Learn more about Randall Rice

T12
Reloadable Test Data for Manual Testing
Tanya Dumaresq, Macadamian Technologies, Inc.
 
Do you need to execute and then quickly re-execute manual test cases under tight timelines? Do bugs marked as “Cannot Reproduce” bouncing back and forth between developers and testers frustrate your team? Would you like to have more realistic, production-like test data? Join Tanya Dumaresq as she explains the hows and whys of developing and using pre-created, reloadable test data for manual testing. By planning ahead when designing test cases, you can cut test execution time in half and virtually eliminate those “works on my machine” bugs. Learn how to create and load test data in different formats and choose the one that is best for your application under test. Sometimes, you can even use the application itself to create the data! You’ll end up with test data and an environment far more representative of your users’ world than if you create data on the fly during test execution.
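
One minimal way to implement reloadable test data, sketched with SQLite from Python’s standard library (the table and records are invented for illustration): snapshot a known-good dataset once, then restore it before every test session.

```python
import sqlite3

# Build the baseline dataset once, ahead of the test cycle.
baseline = sqlite3.connect(":memory:")
baseline.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    INSERT INTO customers VALUES (1, 'Ada Lovelace', 'London');
    INSERT INTO customers VALUES (2, 'Alan Turing', 'Wilmslow');
""")
snapshot = "\n".join(baseline.iterdump())   # serialize the known-good state as SQL

def fresh_test_db():
    """Reload the same realistic data before every manual test session."""
    db = sqlite3.connect(":memory:")
    db.executescript(snapshot)
    return db

db = fresh_test_db()
print(db.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # prints 2
```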

Learn more about Tanya Dumaresq

T13
Driving Development with Tests: ATDD and TDD
Elisabeth Hendrickson, Quality Tree Software, Inc.
 
A perennial wish of testers is to participate early in the projects we test—as early as when the requirements are being developed. We also often wish for developers to do a better job unit testing their programs. Now with agile development practices, both of these wishes can come true. Development teams practicing acceptance test-driven development (ATDD) define system-level tests during requirements elicitation. These tests clarify requirements, uncover hidden assumptions, and confirm that everyone has the same understanding of what “done” means. ATDD tests become executable requirements that provide ongoing feedback about how well the emerging system meets expectations. Agile developers who also practice test-driven development (TDD) create automated unit tests before writing component code. The result of ATDD + TDD is an automated set of system- and unit-level regression tests that execute every time the software changes. In this session, Elisabeth explains how ATDD and TDD work and demonstrates them by completely implementing a new feature in a sample application.
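
To make the two levels concrete, here is a toy sketch in Python’s unittest (not the sample application from the session): an acceptance-level test expressing a customer-visible requirement, plus finer-grained unit tests that, under TDD, would be written before the discount function itself.

```python
import unittest

# Production code: under TDD this is written only after the tests below exist.
def discount(total):
    """Hypothetical business rule: orders of $100 or more get 10% off."""
    return total * 0.9 if total >= 100 else total

class AcceptanceTest(unittest.TestCase):
    """ATDD: an executable requirement agreed upon with the customer."""
    def test_large_order_gets_ten_percent_discount(self):
        self.assertEqual(discount(200), 180)

class UnitTests(unittest.TestCase):
    """TDD: developer-level edge cases, written before the code."""
    def test_boundary_exactly_100(self):
        self.assertEqual(discount(100), 90)
    def test_small_order_unchanged(self):
        self.assertEqual(discount(99), 99)

if __name__ == "__main__":
    unittest.main()
```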

Learn more about Elisabeth Hendrickson

T14
Performance Engineering: More Than Just Load Testing
Rex Black, QA Software Consultant/Trainer
 
Performance testing that is done once or a few times as part of the system test is not the right approach for many systems that must change and grow for years. Rex Black discusses a different approach—performance engineering—that is far more than performing load testing during the system test. Performance engineering takes a broad look at the environment, platforms, and development processes and how they affect a system’s ability to perform at different load levels on different hardware and networks. While load testers run a test before product launch to alleviate performance concerns, performance engineers have a plan for conducting a series of performance tests throughout the development lifecycle and after deployment. A comprehensive performance methodology includes performance modeling, unit performance tests, infrastructure tuning, benchmark testing, code profiling, system validation testing, and production support. Find out the what, when, who, and how of each of these performance engineering activities. As a performance engineer, you’ll learn the questions you need to ask—early in the project—to identify risks for load, stress, capacity, and reliability.

Learn more about Rex Black

T15
Test Management for Very Large Programs: A Survival Kit
Graham Thomas, Independent Consultant
 
In large organizations with multiple, simultaneous, and related projects, how do you coordinate testing efforts for better utilization and higher quality? Some organizations have opened Program Test Management offices to oversee the multiple streams of testing projects and activities, each with its own test manager. Should the Program Test Manager be an über-manager in control of everything, or is this office more of an aggregation and reporting function? Graham Thomas examines the spectrum of possible duties and powers of this position. He also shares the critical factors for successful program test management, including oversight of the testing products and deliverables; matrix management of test managers; stakeholder, milestone, resource, and dependency management; and the softer but vital skills of influence and negotiation with very senior managers. Relating experience gained on several large testing programs, Graham shares a practical model—covering the key test management areas of organization, people, process, tools, and metrics—that your organization can adapt for its needs.

Learn more about Graham Thomas

Concurrent Sessions for Thursday, October 2 — 3:00 p.m.
T16
The Three Faces of Quality: Control, Assurance, Analysis
Stephen Michaud, Luxoft Canada
 
Many of the misunderstandings within software development organizations can trace their roots to different interpretations of the role of testers. The terms quality control (QC), quality assurance (QA), and quality analysis are often used interchangeably. However, they are quite different and require different approaches and very different skill sets. Quality control is a measurement of the product at delivery compared to a benchmark standard, at which point the decision is made to ship or reject the product. Quality assurance is the systematic lifecycle effort to assure that a product meets expectations in all aspects of its development. It includes processes, procedures, guidelines, and tools that lead to quality in each phase. Quality analysis evaluates historical trends and assesses the future customer needs as well as trends in technology to provide guidance for future system development. Stephen Michaud describes how to set yourself up in all three roles and covers the skills you need to be successful in each role.

Learn more about Stephen Michaud

T17
Acceptable Acceptance Testing
Grigori Melnik, Microsoft Corporation
Jon Bach, Quardev, Inc.
 
This is the tale of a team of software professionals in the Microsoft patterns & practices group who wrote a book on software acceptance testing. Grigori Melnik was the content owner, writer, and project manager. Jon Bach was the writer, material producer, and acceptance testing reality checker, ensuring that the project team used its own methods so the book would be acceptable to you, the reader. To develop the book, Grigori and Jon employed key ideas of agile projects—creating a backlog using story cards, working in short iterations, exploring requirements and expectations, building customer trust through iterative acceptance, and staying connected to the customer community through frequent preview releases, surveys, and interviews. They created a heuristic acceptance testing model for knowing when they had reached enough "acceptability" to stop "developing" the book and publish it. Join Grigori and Jon to discover how you can apply an innovative acceptance testing methodology to your software testing. You'll learn how to implement an iterative and incremental acceptance testing approach on your next testing project.

Learn more about Grigori Melnik
Learn more about Jon Bach

T18
Are Agile Testers Different?
Lisa Crispin, ePlan Services, Inc.
 
On an agile team everyone tests, blurring the lines between the roles of professional developers and testers. What’s so special about becoming an agile test professional? Do you need different skills than testers on traditional projects? What guides you in your daily activities? Lisa Crispin presents her “Top Ten” list of principles that define an agile tester. She explains that when it comes to agile testers, skills are important but attitude is everything. Learn how agile testers acquire the results-oriented, customer-focused, collaborative, and creative mindset that makes them successful in an agile development environment. Agile testers apply different values and principles—feedback, communication, simplicity, continuous improvement, and responsiveness—to add value in a unique way. If you’re a tester looking for your place in the agile world or a manager looking for agile testers, Lisa can help.

Learn more about Lisa Crispin

T19
Life as a Performance Tester
Scott Barber and Dawn Haynes, PerfTestPlus, Inc.
 
At the core of most performance testing challenges and failed performance testing projects are serious misunderstandings and miscommunications within the project team. Scott Barber and Dawn Haynes share approaches to overcoming some of the most common frustrations facing performance testers today. Rather than simply telling you how to improve understanding and communicate performance testing concepts, Scott and Dawn demonstrate their approaches through an amusing role play of interactions between a lead performance tester and a non-technical executive. Based on real-life experiences (with names and places changed to protect the innocent, of course), they demonstrate ways for you to address questions such as, “Should we be doing performance, load, stress, or capacity testing?”, “How relevant and realistic (or not) is this load test?”, “How will we know if we are done?”, and “What is a concurrent user, anyway?” As you enjoy the interplay, you’ll learn valuable lessons that are sure to make your performance testing better and personally more rewarding.

Learn more about Scott Barber
Learn more about Dawn Haynes

T20
Adding Measurement to Reviews
Riley Rice, Booz Allen Hamilton
 
Conceptually, most testers and developers agree that reviews and inspections of software designs and code can improve software and reduce development costs. However, most are unaware that measuring reviews and inspections greatly magnifies these improvements and savings. Riley Rice presents data from more than 4,000 real-world software projects in different domains—defense, commercial, and government. He compares the results of three scenarios: doing few or no reviews, doing unmeasured reviews, and doing measured reviews. For each scenario, Riley compares resulting metrics: defects delivered to customers, total project pre-release costs, total project post-release costs, total project lifecycle costs, project duration, mean time between failures, and productivity. The results are surprising—measured reviews are substantially more effective—and go far beyond what most people would expect. Learn how the effectiveness of inspections and reviews is significantly improved by the simple act of measuring them.

Learn more about Riley Rice
