STARWEST 2007 Keynote Presentations

 Wednesday, October 24, 2007 8:45 a.m.
 


The Five “Doings” of Software Testing 

Mark Fewster and Dorothy Graham, Grove Consultants
 
As testers, we are sometimes so busy “doing” that we forget the “whys” and “hows” of what we are doing. Dorothy Graham and Mark Fewster take a closer look at five key activities of testing: searching for defects, checking against requirements and specifications, assessing software readiness, measuring quality, and sampling software and data. Dorothy and Mark have found that these testing activities have strong parallels with things we do in ordinary life, and that most testers are not conscious of how useful their personal skills and knowledge can be to their testing work. Drawing on some surprising examples of everyday things that can make us better testers, Mark and Dorothy examine the whys and hows of all five testing “doings.” Raise your consciousness level, and gain a deeper understanding of testing activities to improve your performance and your team’s results.

Mark Fewster  

Mark Fewster has more than twenty years of industrial experience in software testing, specializing in the areas of software testing tools, techniques, and test automation. As a consultant, Mark has helped many organizations improve their testing—both by the better use of techniques and by the successful introduction of a software testing tool. Mark has given keynote talks and presented papers at international conferences and seminars and has served as Chairman for the BCS working group developing the draft standard for software component testing.

Dorothy Graham

The founder of UK-based Grove Consultants, Dorothy Graham provides advice, training, and inspiration in software testing, testing tools, and inspection. Originally from Grand Rapids, Michigan, she has lived and worked in the UK for more than thirty years. Dorothy is co-author of Software Inspection (with Tom Gilb), Software Test Automation (with Mark Fewster), and Foundations of Software Testing: ISTQB Certification. Dorothy was Program Chair for the first EuroSTAR Conference and was awarded the IBM European Excellence Award in Software Testing in 1999.
 
 
 Wednesday, October 24, 2007 10:00 a.m.
 

Why Is “Test Driven Development” Not Driven by Testers?
Antony Marcano, testingReflections.com
 
For years, testers implored developers to do better unit testing. Our pleas fell mostly on deaf ears. Testers were constantly frustrated, finding bugs that should never have escaped the developers. Then, out of nowhere, a few developers started preaching Test Driven Development—test early and often, write the unit tests first, then write the code. Suddenly, unit testing was cool! Why did testers fail to entice developers to test earlier, more, and better? Why is Test Driven Development a practice that was not driven by testers? Antony Marcano examines these questions and explains how the testing community can become a driving force of software improvement practices. If testers want to be more influential in our day-to-day projects and in our organizations, we must broaden our horizons. Join Antony for concrete ideas that make things easier for everyone—not just ourselves. Take back ways to demonstrate the benefits of testing—and how to publicize that information—so testers are seen as a value-added service rather than as gatekeepers and naysayers.
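To make the test-first rhythm concrete, here is a minimal, hypothetical sketch of one TDD step in Python using the standard unittest module. The function normalize_phone and its behavior are invented for illustration and are not taken from Antony's talk.

import unittest

# Step 1: write the test first. In TDD this test is written (and fails)
# before the normalize_phone() function below exists.
class NormalizePhoneTest(unittest.TestCase):
    def test_strips_punctuation_and_spaces(self):
        self.assertEqual(normalize_phone("(555) 010-2030"), "5550102030")

    def test_rejects_input_with_no_digits(self):
        with self.assertRaises(ValueError):
            normalize_phone("n/a")

# Step 2: write just enough code to make the tests pass, then refactor.
def normalize_phone(raw):
    digits = "".join(ch for ch in raw if ch.isdigit())
    if not digits:
        raise ValueError("no digits found in phone number")
    return digits

if __name__ == "__main__":
    unittest.main()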
Antony Marcano

Antony Marcano has a dozen years of experience in software testing across numerous sectors including mobile and fixed telecommunications, banking, publishing, broadcasting, advertising, law, and education. Since 2000, much of Antony's work has been on agile projects. Now, as a practitioner, mentor, coach, and consultant, he helps teams realize the benefits associated with agile development. Antony is creator and curator of testingReflections.com, one of the most influential software testing sites on the Internet. A regular speaker at peer workshops and conferences, his views have been quoted in numerous publications including Corporate Insurance & Risk magazine, VNUNet, and the British Computer Society journal The Tester.
 
 
 Wednesday, October 24, 2007 4:30 p.m.
 


The Coming SOA Revolution: What It Means To Testers
Frank Cohen, PushToTest
 
Applications deployed with service-oriented architectures are implemented as producers and consumers of services. Testing a Service Oriented Architecture (SOA) application is unlike anything you've done before because every service can be invoked by consumers of whom you have no knowledge, which means you must understand the specifications of those services in order to build valid, robust tests. Before SOAs began appearing in IT organizations, testers often dealt with a lack of management commitment, poor testing tools, and minimal testing environments. Now, with SOA, the risks of failure are high, and the powerful processes, protocols, and tools that software developers use to build applications can also be used by testers to verify, validate, and test SOA applications. In SOA testing, instead of using antiquated tools, we use a variety of dynamic scripting languages (Rhino, Python, and Ruby) and procedure-less test scenario documents including WADL, LMX, and WSIT. Service-oriented architectures make test designs more complex—you must express the full meaning and goals of the services in the tests—but they make executing tests much easier with standard SOA development tools for test automation.
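As a rough illustration of the scripting approach Frank describes, the following Python sketch exercises a hypothetical REST-style service the way an unknown consumer might. The URL and response fields are invented for this example and are not part of TestMaker or any real service description.

import json
import urllib.request

# Hypothetical endpoint; a real SOA test would derive this from the service's
# published description (for example, a WSDL or WADL document).
SERVICE_URL = "http://example.com/orders/12345"

def test_order_service_contract():
    with urllib.request.urlopen(SERVICE_URL, timeout=10) as response:
        assert response.status == 200, "service should respond to any consumer"
        payload = json.loads(response.read())

    # Check only the parts of the response an unknown consumer may depend on.
    assert "orderId" in payload
    assert payload["status"] in {"OPEN", "SHIPPED", "CANCELLED"}

if __name__ == "__main__":
    test_order_service_contract()
    print("service contract checks passed")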

Frank Cohen  

Frank Cohen is the leading authority on testing and optimizing software developed with Service Oriented Architecture (SOA) and Web Service designs. Frank is CEO and founder of PushToTest and inventor of TestMaker, the open-source SOA governance and test automation tool that helps software developers, testers, and IT managers understand and optimize the scalability, performance, and reliability of their systems. Frank is the author of several books on optimizing information systems, including Java Testing and Design (Prentice Hall, 2004) and FastSOA (Morgan Kaufmann, 2006). He co-founded Inclusion.net and TuneUp.com (now Symantec Web Services). Contact Frank at [email protected] and www.pushtotest.com.

 
 
 Thursday, October 25, 2007 8:30 a.m.
 


Customer Advocacy: The Key to Testing Success
Theresa Lanowitz, voke, Inc.
 
Testing professionals are often viewed as the pessimists of the software world. Some people think testers will do anything to prevent an application’s release into production. In reality, testers should be proactive protectors of the organization and a strong voice for its customers—lines of business, end users of the applications, system designers, developers, and the operations group responsible for application support. Theresa Lanowitz believes that testers should be customer advocates, representing all constituents in each and every stage of the application development lifecycle. As such, testers help ensure delivery of quality products that meet the needs of all. To be a successful customer advocate, you must understand and balance the complex web of requirements, constraints, roles, skills, and abilities of all stakeholders. At the same time, you must understand the capabilities and limitations of the application’s technology and operational environment. Test managers and testers must learn that their roles need to be modernized and fine-tuned—even reinvented. Gone are the days of the pessimist. You must enhance your image while revitalizing your testing organization by becoming a strong customer advocate.

Theresa Lanowitz is recognized worldwide as a strategic thinker and market influencer. With more than twenty years of technology experience, Theresa has been a trusted advisor to some of the world’s largest software companies. From 1999 through 2006, Theresa was a research analyst with Gartner, where she pioneered the application quality ecosystem, championed the application security space, and consistently identified new and emerging companies to watch. As the lead industry analyst for billion-dollar-plus companies such as Mercury (now HP) and Compuware, Theresa has a wealth of expertise in developing marketing and launch strategies, crafting corporate and product messaging, and identifying partnering and acquisition opportunities for industry-leading organizations. Prior to Gartner, Theresa played instrumental roles at McDonnell Douglas, Borland Software, Taligent, and Sun Microsystems.

 
 
 Thursday, October 25, 2007 4:15 p.m.
 


The Nine Forgettings 
Lee Copeland, Software Quality Engineering
 
People forget things. Simple things like keys and passwords and the names of friends long ago. People forget more important things like passports and anniversaries and backing up data. But Lee Copeland is concerned with things that the testing community is forgetting—forgetting our beginnings, the grandfathers of formal testing and the contributions they made; forgetting organizational context, the reason we exist and where we fit in our company; forgetting to grow, to learn and practice the latest testing techniques; and forgetting process context, the reason that a process was first created but which may no longer exist. Join Lee for an explanation of the nine forgettings, the negative effects of each, and how we can use them to improve our testing, our organization, and ourselves.

 Lee Copeland  

Lee Copeland has more than thirty-five years of experience as a consultant, instructor, author, and information systems professional. He has held a number of technical and managerial positions with commercial and non-profit organizations in the areas of applications development, software testing, and software development process improvement. Lee frequently speaks at software conferences both in the United States and internationally and currently serves as Program Chair for the Better Software Conference & Expo, the STAR testing conferences, and SQE’s new Agile Development Practices conference. Lee is the author of A Practitioner’s Guide to Software Test Design, a compendium of the most effective methods of test case design.

 
 
 Friday, October 26, 2007 8:30 a.m.
 


Testing on the Toilet: Revolutionizing Developer Testing at Google 
Bharat Mediratta and Antoine Picard, Google
 
You work in an organization with incredibly smart and diligent software engineers. Deadlines are tight and everyone is busy. But when developers outnumber testers by ten to one and the code base is growing exponentially, how do you continue to produce a quality product on time? Google addressed these problems by creating the Testing Grouplet—a group of volunteer engineers who dedicate their spare time to testing evangelism. They tried various ideas for reaching their audience. Weekly beer bashes were fun but too inefficient. New-engineer orientation classes, Tech Talks by industry luminaries, and yearly “Fixit” days became successful and continue to this day. But no idea caught the attention of engineers like Testing on the Toilet. This weekly flyer, posted in every Google bathroom, has sparked discussions, controversy, jokes, and parodies. More importantly, it has taught everyone about techniques such as code coverage, dependency injection, mock objects, and testing time-dependent code. Learn the story of its development—from a deceptively simple idea to a company-wide cultural phenomenon that has received national acclaim. Perhaps Testing on the Toilet can bring better testing to your organization.
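As a hypothetical taste of two techniques the flyer covers, dependency injection and testing time-dependent code, the Python sketch below injects a clock so a test can control time directly. The class and names are invented for illustration and are not taken from Google's code.

import time
import unittest

class SessionToken:
    """Expires after a fixed lifetime; the clock is injected so tests can control time."""

    def __init__(self, lifetime_seconds, clock=time.time):
        self._clock = clock
        self._expires_at = clock() + lifetime_seconds

    def is_valid(self):
        return self._clock() < self._expires_at

class SessionTokenTest(unittest.TestCase):
    def test_token_expires_without_real_waiting(self):
        fake_now = [1000.0]                                  # mutable fake "current time"
        token = SessionToken(60, clock=lambda: fake_now[0])  # inject the fake clock
        self.assertTrue(token.is_valid())
        fake_now[0] += 61                                    # advance time past the lifetime
        self.assertFalse(token.is_valid())

if __name__ == "__main__":
    unittest.main()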

Bharat Mediratta is the Technical Lead of the Google Web Server (GWS) team and co-founder of the Testing Grouplet. Bharat has been a tireless advocate of developer testing both in GWS and Google as a whole. Thanks to his efforts, GWS has increased its number of unit tests by an order of magnitude and raised its code coverage by 50% while cutting the number of emergency pushes in half. His team's success has become the benchmark by which other teams measure their developer testing progress.

Antoine Picard is the Technical Lead of Google's unit testing team. Antoine's team is responsible for providing Google's developers with the tools they need to write unit tests and with fast and accurate test results at every change list. Antoine authored the first-ever edition of Testing on the Toilet and is now one of a handful of regular contributors.
 
 
