STAREAST 2009 Concurrent Sessions



Concurrent Sessions for Wednesday, May 6, 2009 
W1  
The Marine Corps Principles of Leadership
Rick Craig, Software Quality Engineering
 
Even if you have the best tools and processes in the world, if your staff is not motivated and productive, your testing efforts will be weak and ineffective. Retired Marine Colonel Rick Craig describes how using the Marine Corps Principles of Leadership can help you become a better leader and, as a result, a better test manager. Learn the difference between leadership and management and how they complement each other. Join in the discussion and share ideas that have helped energize your testers (and those that didn’t). Rick discusses motivation, morale, training, span of control, immersion time, and how to promote the testing discipline within your organization. He also addresses the importance of influence leaders and how they can be used as agents of change. In this interactive session, you’ll be able to share your ideas on characteristics of great leaders and learn what you can do to improve your leadership skills and your team’s ability to deliver on its mission.   
Learn more about Rick Craig  

W2  
Improve Your Testing with Static Analysis 
Paul Anderson, GrammaTech
 
Static analysis is a technique for finding defects in code without executing it. Static analysis tools are easy to use because no test cases are required, and the underlying technology has advanced significantly in recent years. Although their use is increasing, many misconceptions about the capabilities of these innovative tools still exist. Paul Anderson describes static analysis tools and how they work and clarifies their strengths and limitations. He demystifies static analysis jargon, explaining terms such as "object-sensitive" and "context-sensitive." Paul describes how these tools can make traditional testing activities more effective and how best to use them in the software lifecycle. Paul presents data from real case studies to demonstrate the tools’ effectiveness in practice. Gain a better understanding of the technology so that you can decide whether to apply it, along with the knowledge and confidence to deal more effectively with vendors.
Learn more about Paul Anderson  

W3  
Seven Key Factors for Agile Testing Success 
Lisa Crispin, ePlan Services, Inc.
 

Agile development approaches present unique challenges for testers and test teams. Working in short iterations, often with limited written requirements, agile development teams can leave traditional testers behind. Common testing-related activities, such as user acceptance testing, testing inter-product relationships, and installation testing, need different approaches to fit into agile projects.  Lisa Crispin explains seven key factors for testing success within agile projects that you can also apply to more traditional methodologies. Using a whole team approach and adopting an agile testing mindset are among the important components of a successful agile testing strategy. Learn how to overcome cultural and organizational obstacles and barriers to success in areas such as test automation. Discover the seven critical factors that provide a foundation for building your team's focus on quality and that deliver maximum value to your business.

  
Learn more about Lisa Crispin  

W4  
Five Things Every Tester Must Do
Julie Gardiner, Grove Consultants
 

Are you a frustrated tester or test manager? Are you questioning whether or not a career in testing is for you? Do you wonder why others in your organization seem unenthusiastic about quality? If the answer is “yes” to any of these questions, this session is for you. Julie Gardiner explores five directives to help testers make a positive impact within their organization and increase professionalism in testing. Remember quality—it’s not just effort, it’s effort and quality; it’s date and quality; it’s functionality and quality. Learn to enjoy testing and have fun—the closest job to yours is blowing up things in the movies. Relish your testing challenges—it’s you against the software and sometimes, it seems, the world. Choose your battles—take a stand on issues that are vital and let the small things go. And most importantly, remember that the only real power we have springs from our integrity—don’t sell that at any price.

 
Learn more about Julie Gardiner  

W5  
Test Process Improvement on a Shoestring  
Martin Pol, POLTEQ IT Services BV
 
In these times of economic crisis, cost reduction is usually the #1 motive for test process improvement. Although improvement models such as TMM® and TPI® are very popular, they require formal assessments, process change working groups, extensive implementation programs, and new organizational structures. Instead, you can quickly implement measures that improve your testing process incrementally within your day-to-day activities. Martin Pol presents a set of easy-to-implement measures that can rapidly improve testing’s contribution to your project’s success—simple risk analysis, proactive test design, coverage targeting, and novel ways to reuse tools, environments, expertise, and existing testware. Learn how low-budget test process improvement can become a natural behavior for your testing staff. Achieve quick wins by working more closely with development and using what you have instead of buying or creating new tools.
Learn more about Martin Pol  

W6  
Integrating Divergent Testing Approaches at Cisco  
Bill Schongar, Cisco Systems, Inc.
 
Many large organizations have evolved their test processes project by project and department by department, leading to inefficient practices, overlapping activities, redundant test environments, shelfware test tools, and more. It is possible, however, to focus on a few key areas and bring even the most wildly different test approaches together. Bill Schongar describes the real-world testing problems at Cisco: thousands of test engineers, a never-ending variety of practices, and numerous tools—all deployed across a large spectrum of environments. Employing collaborative communication among test groups, standardized test coding practices, common test environments, best-of-breed test tooling, and consolidated results tracking, Cisco was able to integrate their diverse testing approaches successfully. Discover what worked (and what failed) and how Cisco made testing faster, more effective, and less painful—even fun at times. Discover how to keep all this chaos organized, while still allowing people to experiment with new approaches.
Learn more about Bill Schongar  

W7  
Building a Quality Dashboard for Your Project
Jason Bryant, Schlumberger
 
Jason Bryant shows how you can transform readily available raw data into visual information that improves the decision-making process with simple measures that yield power for both testing and development managers.  A quality dashboard helps focus regression tests to cover turmoil risk, ensures issues are resolved before beta, identifies risks in the defect pool, and provides information to monitor the team’s adherence to standard processes.  Creating, measuring, and monitoring release criteria are fundamental practices for ensuring consistent delivery of software products. Schlumberger has implemented a quality dashboard that helps them continuously gauge how projects are progressing against their quality release criteria (QRC). By using dashboard data, Schlumberger makes better decisions and subsequently is able to see how those decisions affect projects. Jason highlights the philosophy and architecture that has made Schlumberger’s quality dashboard a success and how your team can follow in their footsteps!  
Learn more about Jason Bryant  

W8  
The Case Against Test Cases
James Bach, Satisfice, Inc.
 
A test case is a kind of container. You already know that counting the containers in a supermarket would tell you little about the value of the food they contain. So, why do we count test cases executed as a measure of testing’s value? The impact and value a test case actually has varies greatly from one to the next. In many cases, the percentage of test cases passing or failing reveals nothing about the reliability or quality of the software under test. Managers and other non-testers love test cases because they provide the illusion of both control and value for money spent. However, that doesn’t mean testers have to go along with the deceit. James Bach stopped managing testing using test cases long ago and switched to test activities, test sessions, risk areas, and coverage areas to measure the value of his testing. Join James as he explains how you can make the switch—and why you should.  
Learn more about James Bach  

W9  
Agile Testing: Solving the Agilist's Dilemma 
Rob Myers, Independent Test Consultant

One problem with iterative software development is that teams are forced to write and test software incrementally—and repeatedly. Testers know that any change could break features in both obvious and hidden ways. Developers know that a change to their stable design is just around the corner. So, should we go back to designing software all up front and testing the whole product just before delivery? Of course not! So how do we solve this “Agilist’s Dilemma?” Rob Myers describes the two popular practices that can solve this dilemma: unit-level test-driven development (TDD) and acceptance test-driven development (ATDD). Join Rob to explore the similarities and differences of these agile practices and learn how they support each other. Find out why ATDD is much more than traditional test automation and how its practice drastically alters the role of the agile tester.

 
Learn more about Rob Myers  

W10
Improving the Skills of Software Testers
Krishna Iyer & Mukesh Mulchandani, ZenTEST Labs
 

Many test training courses include the topic of “soft skills for testers,” specifically their attitudes and social behaviors. Testers are told that to be effective they need a negative mindset and a negative approach. Krishna Iyer and Mukesh Mulchandani challenge this belief. Having trained more than 5,000 testers in testing skills and more than 500 testers in essential thinking skills, Krishna and Mukesh demonstrate that testers must be creative rather than critical, positive rather than destructive, and empathetic rather than negative. Join them as they lead exercises in creative thinking, critical writing, and collaborative speaking to improve your eye for detail, nose for sniffing out defects, and ear for bias. Eliminate the old beliefs that hinder testers and find out how to deconstruct them and inculcate new, more powerful ones into your test organization.

  
Learn more about Krishna Iyer
Learn more about Mukesh Mulchandani
 

W11
Insource or Outsource Testing: Understanding Your Context  
Michael Bolton, DevelopSense

As world trade becomes global, goods are produced and services are performed everywhere. Software development services are no exception to this trend. Indeed, they are leading these changes because the cost of developing or testing software in one place may be substantially lower than in another. Yet cost isn't all there is to the equation—you must also consider value. What do you need to know to outsource successfully? While testers offshore might cost less than local testers, why might it be more valuable to insource testing? In this interactive and experiential session, Michael Bolton relates his experience with outsourcing and insourcing software testing models and leads exercises to explore the advantages and disadvantages of each. Join Michael and your peers to learn how understanding organizational context is vital to the insourcing versus outsourcing decision for testing.

 
Learn more about Michael Bolton  

W12
Exploratory Session-Based Testing...with a Twist
Alexander Andelkovic, Maquet Critical Care AB
 
Maquet Critical Care develops medical equipment in an FDA-regulated environment and uses exploratory testing as a valuable complementary test technique to requirements-based testing. By combining more traditional techniques with exploratory testing, they achieve high quality software embedded within life-critical equipment. Although performing exploratory testing in a way that meets regulatory standards poses some interesting challenges, Maquet has discovered that session-based test management (SBTM) is effective for testing and acceptable to its external auditors. Alexander Andelkovic shares the challenges of convincing company management and outside auditors that SBTM is both efficient and acceptable. Explore the choice between buying or building an SBTM reporting tool, the benefits of well-managed exploratory testing, and the increased visibility of quality-related information. Whether your organization requires FDA approval or not, this session will help you to integrate exploratory testing with traditional methods in your test organization successfully.  
Learn more about Alexander Andelkovic  

W13
Test Estimation: Painful or Painless? 
Lloyd Roden, Grove Consultants
 
As an experienced test manager, Lloyd Roden believes that test estimation is one of the most difficult aspects of test management. You must deal with many unknowns, including dependencies on development activities and the variable quality of the software you test. Lloyd presents seven proven methods he has used to estimate test effort. Some are easy and quick but prone to abuse; others are more detailed and complex but may be more accurate. Lloyd discusses FIA (finger in the air), formula/percentage, historical reference, Parkinson’s Law vs. pricing, work breakdown structures, estimation models, and assessment estimation. He shares spreadsheet templates and utilities during the session that you can take back to help you improve your estimations. By the end of this session, you might just be thinking that the once painful experience of test estimation can, in fact, be painless.
Learn more about Lloyd Roden  

W14
Testing Lessons from Delivery Room Triage
Rob Sabourin, Amibug.com
Anne Sabourin, Royal Victoria Hospital
 
Bug triage, like labor and delivery triage, is about deciding on a course of action on the spot, often with minimal information to guide decision-making. The four basic steps of labor and delivery triage apply directly to testing—preliminary assessment, interview, exploration and observation, and taking action. In testing, preliminary assessment triggers immediate action before any bug review meetings or further testing. Interview exposes important contextual information. And, just as exploration helps medical professionals better understand a patient’s condition, software testers use exploratory testing to better understand a bug. In labor triage, taking action could involve the mother being admitted, sent home, or tested further.  In software testing, bug priority decisions guide bug fix decisions.  Join Rob Sabourin, a software engineer, and his wife Anne Sabourin, a nurse, to explore case studies from labor/delivery and software testing triage and learn new ways to immediately improve your testing practices.   
Learn more about Rob Sabourin
Learn more about Anne Sabourin
 

W15
End-to-End Testing in an Enterprise Agile Environment
Billie Bell, Intuit, Inc.
 

All too often, surprises occur late in development when independent projects—agile or not—at varying stages of completion must merge into a cohesive deliverable. These surprises often result in schedule slips and unfulfilled customer needs. At Intuit, Billie Bell found the root causes of these problems and developed an end-to-end testing model to address them. Billie discovered that test progress reports did not contain the right information to help decision makers anticipate issues early, resulting in design defects being discovered late in development. Join Billie to find ways to prevent end-of-project surprises by identifying dependencies early with use case analysis, mapping functional touch points to test cases, empowering test teams through knowledge sharing across projects, and rethinking standard project milestones. Discover the test metrics you can implement to highlight overall functional business risks that will allow stakeholders to make changes in scope, resources, or schedule earlier in development.

 
Learn more about Billie Bell  

W16
What Price Truth? When a Tester is Asked to Lie 
Fiona Charles, Quality Intelligence, Inc.

As testers, our job is to report the current state of software quality on our projects. But in the high-stakes, high-risk business of software development, some may pressure us to distort the message. When projects are late or quality is poor, software managers’ reputations—even their jobs—may be on the line. Our testing progress report could be the biggest obstacle to a “green light” project status report or an on-time delivery. When testers see project disconnects—rosy status reports and repeatedly late delivery; managers shutting down open discussions of project risks; managers trying to close down testing that is exposing major bugs; or suggestions to “get creative” with the metrics—we need to beware. Fiona Charles discusses the reasons testers must refuse to compromise reality, how to secure detailed records of project progress and status, and the possibility of having to “blow the whistle”—regardless of the consequences.

   
Learn more about Fiona Charles  

W17
Challenges in SOA Performance Testing
Manikanda Raman Viswanathan, Cognizant Technology Solutions 

A system built using a Service-Oriented Architecture (SOA) consists of many different services that interact with each other to provide the system’s functionality. Unlike traditional stand-alone or client-server architectures, SOA separates functions into distinct services, making them reusable and easily accessible over distributed networks. Moreover, SOA aims for loose coupling among these business-level services. Manikanda Viswanathan explains why the traditional, application-centric performance testing approach no longer yields good results. In traditional testing, each application is tested individually; because these applications have few external dependencies, performance bottlenecks can be found, contained, and repaired more easily. In SOA testing, applications are distributed, highly dependent on one another, deployed on heterogeneous platforms, and often have availability challenges. While exploring these challenges, Mani presents a simple approach to testing SOA applications at the individual component level, the service interface level, and the end-user experience level using industry-standard SOA testing and load testing tools.
Learn more about Manikanda Raman Viswanathan  


W18
A Test Offshoring Model that Works
Brook Klawitter, USG Corporation

USG is a Fortune 500 building products manufacturer with over fifty North American locations. USG has effectively integrated their offshore and local test teams to provide support for a major ERP implementation and subsequent software releases. Brook Klawitter describes how they established a strategy for selecting work to be completed locally versus offshore; developed management capabilities locally and offshore; enhanced the technical and functional testing capabilities of the offshore team; and leveraged the strengths of each team to establish a culture of continuous improvement that encourages team member growth. Learn practical techniques to integrate an offshore team with your local test team, developers, and users. Find out how to select onshore and offshore team members who will be a good technical and cultural fit and learn ways to develop offshore team members for leadership roles.

 
Learn more about Brook Klawitter  




 
Software Quality Engineering  •  330 Corporate Way, Suite 300  •  Orange Park, FL 32073
Phone: 904.278.0524 or 888.268.8770  •  Fax: 904.278.4380  •  Email: [email protected]
© 2009 Software Quality Engineering, All rights reserved.