STARWEST 2007 Concurrent Sessions


 Thursday, October 25, 2007 9:45 a.m.
T1
Test Management

The Secrets of Faking a Test Project
Jonathan Kohl, Kohl Concepts Inc.
 
It's never been easier to fool your manager into thinking that you're doing a great job testing! In his presentation, Jonathan Kohl covers today’s most respected test fakery. These techniques include misleading test case metrics, vapid but impressive-looking test documentation, repeatedly running old tests "just in case they find something", carefully maintaining obsolete tests, methodology doublespeak, endless tinkering with expensive test automation tools, and taking credit for a great product that would have been great even if no one had tested it. Jonathan also covers best practices for blame deflection. By the time you're through, your executive management won't know whether to fire the programmers or the customers. But, it won't be you. (Disclaimer: It could be you if an offshore company fakes it more cheaply than you do.)


• Cautionary true stories of test fakery, both purposeful and accidental
• Why surprisingly common practices often go wrong
• Signs that your testing may be fake

T2
Test Techniques

Improving Testing with Quality Stubs
Lee Clifford, Virgin Mobile UK

 
Many testers use stubs—simple code modules that simulate the behavior of much more complicated things. As components and their interfaces evolve, it is easy to overlook the need for associated stubs to evolve with them. Lee Clifford explains that the stubs Virgin Mobile previously used to simulate the functionality of third-party software were basic and static—simply returning hard-coded data values. Although adequate at first, the stubs were difficult to maintain. So Virgin Mobile’s testers decided to design, build, test, and deploy their own smart “quality stubs,” not only for use by the test team but also for development and performance testing. The testers created fully configurable and programmable stubs that interface their systems to third-party products. The key advantage is that anyone on the test team can update the stubs with minimal cost and without the need to learn a programming language. (A brief sketch of the configurable-stub idea follows the list below.)
  

• The need for and value of quality stubs when testing
• The different types of stubs you may need
• How to build smart quality stubs that are first-class software tools
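
A minimal sketch of the configurable-stub idea, in Python. The file name stub.json, the request keys, and the data shapes are invented for illustration; Virgin Mobile's actual stubs are not shown in this description. The design point is that responses live in an external data file, so any tester can change stub behavior by editing data rather than code:

    import json

    class ConfigurableStub:
        """A stand-in for a third-party component whose responses come
        from an external data file instead of hard-coded values."""

        def __init__(self, config_path):
            # Testers edit this JSON file to change the stub's behavior --
            # no programming language to learn.
            with open(config_path) as f:
                self.responses = json.load(f)

        def handle(self, request_key):
            # Return the configured response, falling back to a
            # configured default for requests not explicitly listed.
            return self.responses.get(request_key,
                                      self.responses.get("default"))

    # Example use, assuming stub.json contains something like
    # {"balance_query": {"balance": 42.5}, "default": {"error": "unknown"}}
    stub = ConfigurableStub("stub.json")
    print(stub.handle("balance_query"))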

T3
Test Automation

The Ten Most Important Automation Questions—and Answers
Mukesh Mulchandani, ZenTEST Labs
 
As test automation becomes more complex, many important strategic issues emerge. Mukesh Mulchandani shares key questions you must answer before you begin a test automation project or an improvement program. He begins with the elementary questions: Should I automate now or wait? What specifically should I automate? What approach should I adopt? Mukesh then considers more complex questions: vertical vs. horizontal automation, handling static and dynamic data, and testing dynamic objects. The final questions relate to future automation trends: moving beyond keyword automation technology, making automation scripts extensible, introducing test-driven development, starting automation when the application is not yet stable, and offering the automation scripts to clients. Whether you are just starting with test automation or planning to improve your automation, find out which of these questions resonate with you—and learn Mukesh’s suggested answers.
 


• Elementary, complex, and forward-looking test automation questions
• How to improve the ROI of test automation projects
• Future trends in test automation that you can work toward now

T4
Testing the New Web

Testing SOA Applications: What’s New, What’s Not
Brian Bryson, IBM
 
The Service Oriented Architecture (SOA) approach to building applications is rapidly approaching critical mass. With this architecture comes a new set of challenges for testers. Brian Bryson demystifies the testing practices needed to ensure SOA application quality. He begins by building and deploying a Web service to introduce you to SOA. Brian then examines the requirements and risks of SOA quality management, including functional, performance, and security testing challenges. Brian demonstrates testing a Web service using both open source and commercial software. Throughout his demonstration, Brian discusses the new skills and strategies, such as a strong focus on unit testing, that are required for SOA testing and the more common strategies, such as a strong focus on requirements-based testing, that still apply in the new world of SOA. (A brief sketch of checking a Web service directly follows the list below.)
 

• The test and quality ramifications of the SOA paradigm
• Live SOA application and testing demonstration
• Open source and commercial tools for SOA quality management
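
For a flavor of what checking a service below the UI looks like, here is a minimal Python sketch against a hypothetical SOAP endpoint. The URL, operation name, and namespace are invented for illustration; Brian's session uses its own service and tools:

    import urllib.request

    # Hypothetical endpoint, operation, and namespace -- substitute
    # the details of the service under test.
    ENDPOINT = "http://localhost:8080/services/QuoteService"
    SOAP_REQUEST = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <getQuote xmlns="http://example.com/quotes">
          <symbol>IBM</symbol>
        </getQuote>
      </soap:Body>
    </soap:Envelope>"""

    def test_get_quote_responds():
        # Call the service directly -- no browser, no UI -- and
        # assert on the raw SOAP response.
        req = urllib.request.Request(
            ENDPOINT,
            data=SOAP_REQUEST.encode("utf-8"),
            headers={"Content-Type": "text/xml; charset=utf-8",
                     "SOAPAction": "getQuote"})
        with urllib.request.urlopen(req) as resp:
            body = resp.read().decode("utf-8")
            status = resp.status
        assert status == 200, f"unexpected HTTP status {status}"
        assert "Envelope" in body, "response is not a SOAP envelope"

    # Run with a test runner such as pytest, or call the function directly.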

T5
Special Topics

Lightning Talks: A Potpourri of 5-Minute Presentations
Facilitated by Dawn Haynes
 
Lightning Talks are nine five-minute talks in one conference session. Lightning Talks represent a much smaller investment of time than track speaking and offer the chance to try conference speaking without the heavy commitment. Lightning Talks are an opportunity to present your single biggest bang-for-the-buck idea quickly. Use this as an opportunity to give a first-time talk or to present a new topic for the first time. Maybe you just want to ask a question, invite people to help you with your project, boast about something you did, or tell a short cautionary story. These things are all interesting and worth talking about, but there might not be enough to say about them to fill a full conference session.

 Thursday, October 25, 2007 11:15 a.m.
T6
Test Management

A “Framework for Testing” for Repeatable Success
Randy Slade, Kaiser Permanente HMO
 
Do you have defined and documented processes that describe all the activities and deliverables for testing? Do you have a documented road map for repeating test project successes? The test group at Kaiser found itself overwhelmed with too many projects, understaffed on most projects, lacking repeatable procedures, and without testing tools. Randy Slade describes how the group identified the needed test processes and tools, set priorities, developed new procedures, and implemented them. Their “Framework for Testing” has become the blueprint for all testing activities. Its flexibility makes it applicable to software projects of all types and sizes. It guides testers and managers from A to Z in performing their duties by describing the “what, when, how, and why” of all testing activities and deliverables.



• Five phases of a software testing life-cycle
• How to develop, pilot, and evaluate new processes
• Measures to gauge the value of new software testing procedures and tools

T7
Test Techniques

Emotional Test Oracles
Michael Bolton, DevelopSense
 
An oracle is a heuristic principle or mechanism by which we may recognize a problem. Traditionally, discussion within testing about oracles has focused on two references: (1) requirements specifications that provide us with the “correct” answer and (2) algorithms we execute to check our answers. Testing textbooks talk about identifying a bug by noting the differences between actual results and those references. Yet high-quality software is not created by merely analyzing conformance to specifications or matching some algorithm. It is about satisfying—and not disappointing—the people who interact with the product every day. Michael Bolton introduces the idea that our emotional reactions to programs as we test them—frustration, confusion, annoyance, impatience, depression, boredom, irritation, curiosity, and amusement—are important triggers for noticing real problems that matter to real people. Take back a new way to use your own emotional test oracle to evaluate the software you are testing.


• Why an obsession with automation may cause us to miss important problems
• How our emotions can help us to recognize important problems
• A model for assessing subjective and emotional responses to software

T8
Test Automation

Apodora: An Open Source Framework for Web Testing
Seth Southern, Aculis, Inc.
 
Are you frustrated with automated test scripts that require constant maintenance and don't seem to be worth the effort? Seth Southern introduces Apodora, a new open source framework for automating functional testing of Web applications. Apodora was released under the GNU General Public License to the open source community with the goal of collaboratively creating a superior, free, automated Web testing tool. The key benefit of Apodora is helping you reduce the maintenance overhead of test automation scripts. Seth introduces you to the open source project, demonstrates the use of Apodora, and highlights some of the key differences between Apodora and other test automation tools currently available. Seth shows how Apodora can save you time when the software under test changes and scripts require maintenance. (A generic sketch of the underlying maintenance-reduction idea follows the list below.)


• Web test tool gaps that Apodora fills
• Features of Apodora for functional Web testing
• Future plans for the Apodora open source project
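
Because Apodora's own API is not detailed in this description, here is a generic Python sketch of the maintenance-reduction idea such tools pursue. This is not Apodora's actual interface; the names and the driver object are hypothetical. The idea: scripts refer to stable logical element names, and the mapping to concrete locators lives in one table, so a UI change means one edit rather than changes to every script:

    # Generic illustration only -- NOT Apodora's actual API.
    LOCATORS = {
        "login.username": ("id", "user-field"),
        "login.password": ("id", "pass-field"),
        "login.submit":   ("xpath", "//button[@type='submit']"),
    }

    def find(driver, logical_name):
        """Resolve a logical element name to a concrete page element."""
        strategy, value = LOCATORS[logical_name]
        # 'driver' is a hypothetical browser-automation interface.
        return driver.find(strategy, value)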

T9
Testing the New Web

Load Testing New Web Technologies
Eran Witkon, RadView
 
Web 2.0 applications represent a major evolution in Web development. These applications are based on new technologies such as AJAX, RIA, Web services, and SOA. Unless you, as a tester, understand the inner workings of these technologies, you cannot adequately test their functionality or prepare realistic and valid performance tests. Eran Witkon explains the new Web technologies and shows how to design and implement appropriate load tests, execute them, and interpret the results. For example, Eran describes why the classic “client requests a page and then waits” model used in performance testing the old Web does not adequately represent AJAX processing, in which only parts of pages are requested and one request need not complete before another is initiated. Even if you have never been a programmer or developer, Eran’s presentation will help you understand and develop testing strategies to mitigate the risks we all face with these new technologies. (A brief sketch of overlapping AJAX requests follows the list below.)

• The differences between traditional Web and Web 2.0 technologies
• Testing challenges of AJAX, RIA, Web services, and SOA
• Demonstrations of load testing tools
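
A minimal Python sketch of the difference Eran describes: rather than one page request followed by a wait, a simulated Web 2.0 user fires several partial (AJAX-style) requests that overlap in time. The endpoint URLs are invented for illustration:

    import threading
    import time
    import urllib.request

    # Hypothetical AJAX endpoints polled by a Web 2.0 page.
    ENDPOINTS = [
        "http://localhost:8000/search/suggest?q=te",
        "http://localhost:8000/cart/summary",
        "http://localhost:8000/news/ticker",
    ]

    def hit(url, timings):
        start = time.time()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        timings.append((url, time.time() - start))

    def simulate_ajax_user():
        # Unlike the classic request-a-page-and-wait model, these
        # requests overlap: one need not finish before the next begins.
        timings, threads = [], []
        for url in ENDPOINTS:
            t = threading.Thread(target=hit, args=(url, timings))
            t.start()
            threads.append(t)
        for t in threads:
            t.join()
        return timings

    # Usage (against a real or stubbed server):
    #   for url, secs in simulate_ajax_user():
    #       print(f"{url}: {secs:.3f}s")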

T10
Special Topics

Even Cavemen Can Do It: Find 1,000 Defects in 1,000,000 Lines of Code in 30 Days
Gregory Pope and William Oliver, Lawrence Livermore National Laboratory
 
Due to the increased emphasis on computer security, great advances have been made in static analyzer tools that can detect many code errors that often elude programmers, compilers, test suites, and visual reviews. Traditional tools such as “lint” detectors are plagued with high false-positive rates. Gregory Pope discusses the steps his organization used to evaluate and select a static analyzer tool and pilot its implementation. He describes how they rolled out the tool to developers and how it is being used today. Greg shares the results they achieved on real code (C, C++, and Java) and the valuable code metrics they obtained as a byproduct of its use. Greg discusses the skills needed to use the tools, ways to interpret the results, and techniques they used for winning over developers. (A toy static-analysis sketch follows the list below.)


• The features of static code analyzers
• Defects that can be found with these tools
• How to maximize your success using static analysis
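
To make the idea of static analysis concrete, here is a toy Python checker that scans source code without executing it and flags one common defect pattern. Industrial analyzers of the kind Greg evaluates do far more (memory errors and security flaws in C, C++, and Java); this only illustrates the principle:

    import ast

    # Toy static check: walk the syntax tree and flag comparisons
    # against None that use == or != instead of 'is'.
    def find_none_comparisons(source, filename="<input>"):
        findings = []
        for node in ast.walk(ast.parse(source, filename)):
            if isinstance(node, ast.Compare):
                compares_none = any(
                    isinstance(c, ast.Constant) and c.value is None
                    for c in node.comparators)
                uses_eq = any(isinstance(op, (ast.Eq, ast.NotEq))
                              for op in node.ops)
                if compares_none and uses_eq:
                    findings.append((filename, node.lineno,
                                     "use 'is None', not '== None'"))
        return findings

    print(find_none_comparisons("if x == None:\n    pass\n"))
    # -> [('<input>', 1, "use 'is None', not '== None'")]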

 Thursday, October 25, 2007 1:30 p.m.
T11
Test Management

Selecting Mischief Makers: Vital Interviewing Skills
Andy Bozman, Orthodyne Electronics
 
Much of testing is tedious—the focus on details, the repetitive execution of the same code, the detailed paperwork, the seemingly endless technical discussions, and the complex data analysis. All good testers have the skills and aptitude necessary to deal with these activities. However, great testers have one other characteristic—they are mischievous. As a hiring manager, you should take up the challenge of detecting mischievous testers to build the best testing staff. How do you uncover a candidate’s mischievous traits during the selection process? Résumés do not help, and phone interviews or email conversations are too easily misunderstood. The best chance you have for detecting mischief is during the interview. Andy Bozman explores the ways he identifies the clever people who make great testers and shares techniques that you can easily add to your interview process to find the best people for your team.


• The need for well-directed mischief in testers
• How to distinguish clever people for testing
• Techniques for detecting the people you need and avoiding troublemakers

T12
Test Techniques

Taming the Code Monolith—A Tester’s View
Randy Rice, Rice Consulting
 
Many organizations have systems that are large, complex, undocumented, and very difficult to test. These systems often break in unexpected ways at critical times. This is not limited to older legacy systems—even recently built Web sites are in this condition. Randy Rice explores strategies for testing these types of systems, which are often monolithic mountains of code. He describes methods he has used to understand and “refactor” them, breaking up a huge, complex codebase into something more testable and more maintainable. Randy describes how to build a set of tests that can be reused even as the system is being restructured (a characterization-test sketch follows the list below). Find out how to perform regression, integration, and interoperability testing in this environment. See how new technologies such as service oriented architecture (SOA) can help achieve better system structures, and learn when and where test automation fits into your plans.
 


• How to test large, undocumented, and highly integrated systems
• Regression and integration testing in a complex environment
• New technologies for testing and refactoring systems
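
One common way to build tests that survive restructuring—named here as characterization (or “pinning”) tests, a technique in the spirit of what the session describes—is to capture what the monolith does today and assert on it, so any behavior change surfaces during refactoring. In this Python sketch, legacy_calculate is a hypothetical stand-in for code buried in the monolith:

    import unittest

    def legacy_calculate(order_total, customer_type):
        # Imagine this logic buried deep in a million-line system.
        discount = 0.1 if customer_type == "gold" else 0.0
        return round(order_total * (1 - discount), 2)

    class CharacterizationTests(unittest.TestCase):
        # Expected values were captured by running the existing code,
        # not derived from (nonexistent) documentation.
        def test_gold_customer_discount(self):
            self.assertEqual(legacy_calculate(100.0, "gold"), 90.0)

        def test_regular_customer_no_discount(self):
            self.assertEqual(legacy_calculate(100.0, "regular"), 100.0)

    if __name__ == "__main__":
        unittest.main()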

T13
Test Automation

User Interface Testing with Microsoft Visual C#
Vijay Upadya, Microsoft
 
Manually testing software with a complex user interface (UI) is time-consuming and expensive. Historically, the development and maintenance costs associated with automating UI testing have been very high. Vijay Upadya presents a case study on the approaches and methodologies his Microsoft Visual C# test team adopted to answer the testing challenges that have plagued them for years. Vijay explains how the test team worked with developers to design high levels of testability into Microsoft Visual Studio 2005. These testability features enabled the test team to design a highly robust and effective test suite that completely bypasses the UI. Join Vijay to find out how they adopted data-driven testing below the UI and achieved dramatic cost reductions in developing and maintaining their tests. (A brief data-driven sketch follows the list below.)
 


• How to bypass the user interface without compromising test effectiveness
• Designs for software with high testability
• Approaches for data driven testing below the user interface
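
A minimal Python sketch of data-driven testing below the UI. Here evaluate_expression is a hypothetical application-layer entry point (Vijay's team tested Visual Studio through its own testability hooks, not shown here); the point is that inputs and expected results live in a table, so adding coverage means adding a row, not recording another UI script:

    import unittest

    def evaluate_expression(expr):
        # Stand-in for the real application API beneath the UI.
        return eval(expr, {"__builtins__": {}})

    # Data-driven cases: one row per test.
    CASES = [
        ("1 + 1",       2),
        ("2 * 3 + 4",   10),
        ("(5 - 2) * 3", 9),
    ]

    class BelowTheUITests(unittest.TestCase):
        def test_expression_cases(self):
            for expr, expected in CASES:
                with self.subTest(expr=expr):
                    self.assertEqual(evaluate_expression(expr), expected)

    if __name__ == "__main__":
        unittest.main()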

T14
Exploratory Testing

Mission Possible: An Exploratory Testing Experience
Erik Petersen, Emprove
 
Interested in exploratory testing and its use on rich Internet applications, the new interactive side of the Web? Erik Petersen searched the Web to find some interesting and diverse systems to test using exploratory testing techniques. Watch Erik as he goes on a testing exploration in real time with volunteers from the audience. He demonstrates and discusses the testing approaches he uses every day—from the pure exploratory to more structured approaches suitable for teams. You'll be amazed, astounded, and probably confounded by some of Erik’s demonstrations. Along the way, you'll learn a lot about exploratory testing and have some fun as well. Your mission, should you choose to accept it, is to try out your testing skills on the snappiest rich Internet applications the Web has to offer.

 


• Key concepts in exploratory testing demonstrated
• Testing rich Internet applications (RIAs)
• Hands-on exploratory testing with audience volunteers

 
T15
Special Topics

The Hard Truth about Offshore Testing
Jim Olsen, Dell, Inc.
 
If you have been a test manager for longer than a week, you have probably experienced pressure from management to offshore some test activities to save money. However, most test professionals are unaware of the financial details surrounding offshoring and are only anecdotally aware of factors that should be considered before outsourcing. Jim Olsen shares his experiences and details about the total cost structures of offshoring test activities. He describes how to evaluate the maturity of your own test process and compute the true costs and potential savings of offshore testing. Learn what is needed to coordinate test practices at home with common offshore practices, how to measure and report progress, and when to escalate problems. Jim shares practices for staffing and retention, including assessing cultural nuances and understanding foreign educational systems.


• Practices and techniques of successful offshore testing
• How to compute the true cost and potential savings of offshore testing
• The cultural nuances of overseas organizations

 Thursday, October 25, 2007 3:00 p.m.
T16
Test Management

The Top Ten Signs You Need to Improve Your Testing Process
Robert Watkins, Metavante
 
Does this sound familiar? Patch #94 was just released for the application you shipped last month; your customers refuse to upgrade to the latest version until someone else tries it first; your project manager casually asks if the application was tested on Windows 98 because that’s what your biggest customer uses. Robert Watkins discusses these and other signs of test process breakdowns. He then suggests ways to improve the testing process by making sure the testing activities are in line with the needs of all stakeholders (customers, business owners, support staff, developers, and testers). Find new ways to establish appropriate quality gates that everyone honors, enlist the best champion for your improvement efforts, and communicate the right information to the right people at the right time.
 


• Improvements to mitigate or eliminate test process breakdowns
• How to evaluate the effectiveness of test process improvement
• Ways to make sure that positive changes stick

T17
Test Techniques

Holistic Test Analysis and Design
Neil Thompson, Thompson Information Systems Consulting Ltd.
 
To test professionally and understand software risks fully, we need to know what our tests cover. Counting test cases is not enough—that’s like sizing business requirements by counting program modules. Neil Thompson presents a test analysis and design method that integrates four key elements into a holistic approach: test items, testable features, test basis documents, and product risks. Testing standards and many textbooks have anaesthetized us into the delusion that test cases are simple and can easily be derived through basic techniques. This is false thinking. According to Neil, we must consider and prioritize all available test techniques, incorporating both exploratory techniques and new thinking into our testing. Join Neil to learn a holistic approach for test design and the need for more complete information traceability. 
 


• The different types of coverage—logical and physical
• How coverage should play a part in governance scorecards
• A measurement framework for management to understand testing better

T18
Test Automation

Managing Keyword-Driven Testing
Hans Buwalda, LogiGear
 
Keyword-driven test automation has become quite popular and has entered the mainstream of test automation. Although some hail it as a panacea, many companies using it in one form or another have been disappointed. Keyword-driven testing projects succeed only if they are managed well. This presentation is not about the keyword method itself. Instead, Hans Buwalda focuses on the management side: how to manage a keyword-driven project. What are the factors that indicate progress and success? What are the common risks for a keyword project? Hans shares insights he has gathered in countless keyword projects in many industries all over the world. Many of the lessons he presents were learned the hard way. Learn from Hans’ successes and mistakes and become more successful with your keyword-driven automation. (For readers new to the method, a minimal keyword-interpreter sketch follows the list below.)
 


• The success factors and risks for keyword-based automation
• How to create and organize the team for automation success
• The proper automation environment for keyword-driven testing
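
The session covers management rather than mechanics, but for context, here is a minimal keyword interpreter sketched in Python. The keywords and the test table are invented for illustration: business-readable rows on the left, an action-word library on the right:

    ACTIONS = {}

    def keyword(name):
        # Register a function as the implementation of an action word.
        def register(fn):
            ACTIONS[name] = fn
            return fn
        return register

    @keyword("enter")
    def enter(field, value):
        print(f"typing {value!r} into {field}")

    @keyword("check")
    def check(field, expected):
        print(f"verifying {field} shows {expected!r}")

    # A keyword test: each row is an action word plus its arguments.
    TEST_TABLE = [
        ("enter", "username", "jdoe"),
        ("enter", "password", "secret"),
        ("check", "greeting", "Welcome, jdoe"),
    ]

    def run(table):
        for action, *args in table:
            ACTIONS[action](*args)  # dispatch each row to its action word

    run(TEST_TABLE)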

T19
Exploratory Testing

Session-Based Exploratory Testing—With a Twist
Brenda Lee, Parallax, Inc.
 
Session-based exploratory testing is an effective means to test when time is short and requirements are not clearly defined. Is it advisable to use session-based exploratory testing when the requirements are known and documented? How about when the test cases are already defined? What if half of the test team is unfamiliar with the software under test? The answers are yes, yes, yes. Brenda Lee explains how her team modified the session-based exploratory testing approach to include requirements and test cases as part of its charter. In one instance, during the short seven-day test window the team validated forty-one out of forty-five requirements, executed more than 200 test cases using seventeen charters, and identified fifteen new, significant issues. The team was able to present a high-level test summary to the customer only two days after the conclusion of system test. What did the customer say? "This had to be the shortest system test cycle ever."
 


• A structured and managed approach for faster system testing
• How session-based exploratory testing works with traditional development projects
• Ways to obtain management support for experimentation

T20
Special Topics

The Zen of Software Testing: Discovering Your Inner Tester
Dawn Haynes, PerfTestPlus, Inc.
 
Testing techniques and methods are usually based on models or theories—models derived from experience and theories from science. An alternative approach is Zen, a Buddhist doctrine stating that enlightenment can be attained through direct intuitive insight. Zen is all about harmony and balance. Dawn Haynes believes that a Zen approach to testing can help you meld disparate testing practices and gain new insights into your test processes and your everyday testing activities. We’ve all had those “aha” moments—like when you just knew it was a buffer overflow problem and immediately found where it was located in the code. When we “Zen” it, we figure out something through meditation or a sudden flash of enlightenment. Join Dawn to learn the Zen way to apply the models and theories you currently use for testing and then apply your intuitive insights to discover the rest.
 


• The parallels between Zen and scientific methods of testing
• A new way to see your formal and informal test processes
• The role of ethics in the Zen philosophy and its application to testing


