Performance & Software Testing Books:
- Performance Testing Guidance for Web Applications; Microsoft patterns & practices -- by: J.D. Meier, Scott Barber, Carlos Farre, Prashant Bansode, and Dennis Rea (also available as a free PDF download)
- Web Load Testing for Dummies; Compuware special edition -- by: Scott Barber and Colin Mason
- Beautiful Testing; O'Reilly Media -- Edited by: Tim Riley and Adam Goucher
  Chapter 4: Collaboration Is the Cornerstone of Beautiful Performance Testing -- by: Scott Barber
- How to Reduce the Cost of Software Testing; CRC Press -- Edited by: Matthew Heusser and Govind Kulkarni
  Chapter 16: Rightsizing the Cost of Testing: Tips for Executives -- by: Scott Barber
- Improving .NET Application Performance and Scalability; Microsoft patterns & practices -- Foreword and select content contributed by: Scott Barber
Recent Articles and Papers:
- Rightsizing the Cost of Testing: Tips for Executives; Chapter 16 of How to Reduce the Cost of Testing; CRC Press, 2011.
- Right Click -> View Source and other Tips for Performance Testing the Front End; tips for performance testers based on High Performance Web Sites: Essential Knowledge for Front-End Engineers by Steve Souders; O'Reilly, 2007
- High Performance Testing; written for LogiGear's newsletter in 2007, this article first appeared in Better Software, May/June 2005.
- NASA's Anomaly: A Lesson for Software Testing; written for LogiGear's newsletter in 2007, this is an adaptation of an article Scott wrote for Software Test & Performance in September 2005.
- An Explanation of Performance Testing on an Agile Team (Part 1 of 2); written for LogiGear's newsletter in 2007, this is an adaptation of some of Scott's contributions to Microsoft's patterns & practices: Performance Testing Guidance
- An Explanation of Performance Testing on an Agile Team (Part 2 of 2); written for LogiGear's newsletter in 2007, this is an adaptation of some of Scott's contributions to Microsoft's patterns & practices: Performance Testing Guidance
- Performance Testing Plus: Do the Math!; written for LogiGear's newsletter in 2007, this is an adaptation of an article Scott wrote for Software Test & Performance in October 2006.
- Investigation vs. Validation; written for LogiGear's newsletter in 2007, this is an adaptation of an article Scott wrote for Software Test & Performance in September 2005.
- Introducing the Captain of your Special Teams... The Performance Test Lead; written in support of the EuroSTAR 2006 Keynote of the same title
- SOA Driven Testing?; written in support of a webinar by SQE, May 2006
- How Fast Does a Website Need To Be?; ongoing research
- User Community Modeling Language (UCML™) v1.1; visual modeling technique (Visio Template | SmartDraw Template)
- Creating Effective Load Models for Performance Testing with Incomplete Empirical Data; Sixth IEEE International Workshop on Web Site Evolution (WSE'04)
Interviews:
- Computer Sweden, during Let's Test: Testare måste förstå affären bättre ("Testers must understand the business better"; in Swedish)
- uTest: Testing Roundtable: What Do You Like Most About Testing?
- RevolutionIT: Fast Five with Scott Barber
SearchSoftwareQuality.com:
Peak Performance monthly columns:
- Center of the Universe Syndrome
- Software testing is improved by good bug reporting
- Why do we test for performance?
- Identity crisis or delusions of grandeur?
- Magic formula for successful performance testing
- Use "SCORN" to test the front end of a website for performance
- The state of performance testing
- Exploratory and (not vs.) scripted tests
- Don't mistake user acceptance testing for acceptance testing
- Software performance testing: There is no 'I' in 'team'
- When the flag drops, will your software perform?
- Software testing and the business of borders
- What software testers can learn from children
- Developing an approach to performance testing
- Software performance testing: You can't test everything
- What is performance testing?
- Acceptable application response times vs. industry standard
"Ask the Expert" Answers:
- Acceptance Testing for Websites
- Performance testing SOA
- How to set up a test environment
- Prioritizing software testing on little time
- Software testing processes and development methodologies
- Smoke and sanity testing
- Automating regression test cases
- Test plan and test strategy
- User acceptance testing that satisfies users and requirements
- Understanding performance, load and stress testing
- From Web programmer to software tester
- Software testing tools: How to interpret results from OpenSTA
- User acceptance testing and test cases
- Entering the realm of software performance testing
- Skills for entry-level software testers
- What to do when the test environment doesn't match production
Better Software Magazine: Feature Articles
- Hurry Up and Wait: When Industry Standards Don't Apply; Better Software, June 2007
- Tester PI: Performance Investigator; Better Software, March 2006
- High Performance Testing; Better Software, May/June 2005
Software Test & Performance Magazine:
Peak Performance monthly columns:
- A Good Idea Whose Time Has Come
- Linux Performance Tuning
- How Do You Deal With Anomalies
- Revolution or Realization
- Investigation vs. Validation
- Two Kinds Of Performance Requirements
- Remember Yesterday
- Good Test Tools and How To Pick Them
- The Shoulds and Shouldn'ts of Testing
- 'hmm... That's odd.': Embracing Surprise and Curiosity
- Answers to the Second Most Asked Question
- Admit Your Issues of Abandonment
- An Evolution In Performance Testing
- Performance Testing Moves Toward Maturity
- Micro to Macro and Back Again
- Performance Testing Plus: Do The Math!
- After All This Time, A Tool Maker Listens!
- The Balance of Software, Ethics and Professionalism
Feature Articles:
- How to Identify the Usual Performance Suspects; Software Test & Performance, May 2005
- Diagnosing Symptoms and Solving Problems; Software Test & Performance, July 2005
Commissioned Papers:
- Get performance requirements right - think like a user; commissioned by and co-branded with Compuware, January 2007
User Experience, not Metrics Series
This is Scott's first series of articles, where he starts by asking: How many times have you surfed to a web site to accomplish a task, only to give up and go to a different web site because the home page took too long to download? "46% of consumers will leave a preferred site if they experience technical or performance problems." (Juniper Communications) In other words, "If your web site is slow, your customers will go!" This is a simple concept that all Internet users are familiar with. When this happens, isn't your first thought always, "Gee, I wonder what the throughput of the web server is?" Well no, that is certainly not the thought that comes to mind. Instead, you think, "Man, this is SLOW! I don't have time for this. I'll just find it somewhere else." Now consider this: what if it were YOUR web site that people were leaving because of performance?

Face it, users don't care what your throughput, bandwidth, or hits-per-second metrics prove or don't prove; they want a positive user experience. There are a variety of books on the market that discuss how to engineer maximum performance. There are even more books that focus on making a web site intuitive, graphically pleasing, and easy to navigate. The benefits of speed are discussed, but how does one truly predict and tune an application for an optimized user experience? One must test, firsthand, the user experience!

There are two ways to accomplish this. One could release a web site straight into production, where data could be collected and the system could be tuned, with the great hope that the site doesn't crash or isn't painfully slow. The wise choice, however, would be to simulate actual multi-user activity, tune the application, and repeat (until the system is tuned) before placing your site into production. Sounds like a simple choice, but how does one simulate actual multi-user activity accurately? That is the question this series of articles attempts to answer; a brief illustrative sketch follows the part list below.
- Part 1: Introduction
- Part 2: Modeling Individual User Delays
- Part 3: Modeling Individual User Patterns
- Part 4: Modeling Groups of Users
- Part 5: What should I time and where do I put my timers?
- Part 6: What is an outlier and how do I account for one?
- Part 7: Consolidating Test Results
- Part 8: Choosing Tests and Reporting Results to Meet Stakeholders Needs
- Part 9: Summarizing Across Multiple Tests
- Part 10: Creating a Degradation Curve
- Part 11: Handling Authentication and Session Tracking
- Part 12: Scripting Conditional User Path Navigation
- Part 13: Working with Unrecognized Protocols
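As a rough illustration of the kind of multi-user simulation the series discusses (the articles themselves use dedicated load testing tools), here is a minimal, hypothetical sketch: a few concurrent virtual users request a page, record their response times, and pause for a randomized think time between requests so they do not hit the server in lock-step. The target URL, user count, and delay range are placeholder assumptions, not values from the series.

    # Minimal, hypothetical sketch of concurrent virtual users with randomized
    # think times. TARGET_URL, USERS, PAGES_PER_USER, and the delay range are
    # illustrative placeholders; requires network access to run.
    import random
    import time
    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    TARGET_URL = "http://www.example.com/"  # placeholder site under test
    USERS = 5                               # concurrent virtual users
    PAGES_PER_USER = 3                      # page requests per virtual user

    def virtual_user(user_id):
        """One simulated user: request a page, time it, think, repeat."""
        timings = []
        for _ in range(PAGES_PER_USER):
            start = time.perf_counter()
            with urlopen(TARGET_URL) as response:
                response.read()
            timings.append(time.perf_counter() - start)
            # Randomized think time keeps virtual users from acting in
            # lock-step (the kind of delay modeling Parts 2-4 address).
            time.sleep(random.uniform(1.0, 5.0))
        return timings

    with ThreadPoolExecutor(max_workers=USERS) as pool:
        results = list(pool.map(virtual_user, range(USERS)))

    for user_id, timings in enumerate(results):
        print("user %d: %s" % (user_id, ", ".join("%.2fs" % t for t in timings)))

Real load tests layer on much more: recorded scenarios, realistic user mixes, timers, outlier handling, and result consolidation, which are the topics the parts above take up.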
Beyond Performance Testing Series
This is a companion series to the User Experience, not Metrics series and addresses what happens after those initial results are collected, the part that takes a human brain to accomplish. We will explore what the results mean and what can be done to improve them. We will take the next step beyond simply testing and explore how to identify specific, fixable issues. What are these issues? Poor end-user experience, scalability, and confidence in our software applications.

Performance Testing and Analysis is the discipline dedicated to optimizing the most important application performance trait: user experience. In this series of articles, we will explore those performance engineering activities that lie beyond performance testing. We will examine the process by which software is iteratively tested, using Rational Suite TestStudio, and tuned with the intent of achieving desired performance by following an industry-leading performance engineering methodology that complements the Rational Unified Process. This first article is intended to introduce you to the high-level concepts used throughout the series and to give you an overview of the articles that follow.
- Part 1: Introduction
- Part 2: A Performance Engineering Strategy
- Part 3: How Fast Is Fast Enough?
- Part 4: Accounting for User Abandonment
- Part 5: Determining the Root Cause of Script Failures
- Part 6: Interpreting Scatter Charts
- Part 7: Identifying the Critical Failure or Bottleneck
- Part 8: Modifying Tests to Focus on Failure or Bottleneck Resolution
- Part 9: Pinpointing the Architectural Tier of the Failure or Bottleneck
- Part 10: Creating a Test to Exploit the Failure or Bottleneck
- Part 11: Collaborative Tuning
- Part 12: Testing and Tuning Common Tiers
- Part 13: Testing and Tuning Load Balancers and Networks
- Part 14: Testing and Tuning Security
Technical Articles Written for IBM-Rational
Automated Testing for Embedded Devices: As the number of new applications being developed for wireless/embedded devices such as PDAs, pagers, and cell phones increases, so does the demand for tools to automate the testing process on these new platforms. Through several recent consulting engagements, we have had the opportunity to pioneer the use of Rational TestStudio to automate functional (or GUI) and performance testing of new applications developed to run on a variety of embedded devices. This automation is made possible by the use of emulators - the same emulators used by developers of applications for embedded devices. In this article, we're going to show you how to use Rational TestStudio to record and play back test scripts against emulators.
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me
Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing
"If you can see it in your mind...
you will find it in your life."