Friday, July 14, 2006

Choosing Performance Testing with Scott Barber (Stickyminds interview reprint)

A Word with the Wise:
Choosing Performance Testing with Scott Barber
by Joseph McAllister

Every kid eventually puts some thought into the question "What do you want to be when you grow up?" For PerfTestPlus CTO Scott Barber, who specializes in context-driven performance testing and analysis for distributed multi-user systems, the answer was not "performance tester." He planned to follow in the footsteps of his father, an industrial arts teacher, and sought an ROTC-scholarship-funded degree in civil engineering. In his junior year of college, though, Scott learned that his first years with the Army Corps of Engineers would involve digging foxholes for infantry rather than building bridges with the Seabees.

"I decided that if I was going to be crossing the front lines, I'd much rather be carrying heavy weaponry than heavy shovels," he says.

He became a maintenance officer, with additional duties as an automation officer--essentially fixing computers. One day, while he was complaining about outdated software, he was serendipitously overheard by someone there to evaluate a replacement system and was recruited into a government contracting position. Later, while Scott was weighing his career options, a friend offered him a job as a performance engineer with a promise of "Don't worry, you'll like it."

"From that day, I've been doing performance testing," Scott says. "What I love about it is it spans every part of the software lifecycle. It dials into every discipline from development to business analysis."

Load-generation tools, which most people call performance-testing tools, are just half of the equation, Scott says. The other half is making sense of the collected data, which often requires automated analysis, or "performance monitoring."

"As far as buying a tool and executing all your tests in house versus outsourcing, basically the first heuristic is this: If you've got one big performance test, the first one is going to be cheaper to outsource," Scott says. "Your break-even point, depending on what tool you choose, what your in-house resources are, and the price of a performance tester in your market, is somewhere between two and three performance-testing projects if you're in a software development shop. That's a huge ballpark."

Regular performance testing may call for an in-house process, while testing just once or once in a while may be better served through outsourcing, according to Scott.
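Scott's break-even heuristic can be sketched as a simple cost comparison. The figures below are purely hypothetical placeholders (the interview gives no actual prices), chosen only so that the crossover lands in the two-to-three-project range he describes:

```python
# Sketch of the in-house vs. outsourcing break-even heuristic.
# All dollar figures are illustrative assumptions, not industry data.

def in_house_cost(num_projects, tool_license=30_000, cost_per_project=15_000):
    """One-time tool purchase plus per-project staff time (assumed figures)."""
    return tool_license + num_projects * cost_per_project

def outsourced_cost(num_projects, cost_per_project=25_000):
    """Flat vendor fee per performance-testing project (assumed figure)."""
    return num_projects * cost_per_project

def break_even_projects(max_projects=10):
    """First project count at which in-house is no more expensive than outsourcing."""
    for n in range(1, max_projects + 1):
        if in_house_cost(n) <= outsourced_cost(n):
            return n
    return None

if __name__ == "__main__":
    print(f"Break-even at {break_even_projects()} projects")
```

With these made-up numbers the crossover comes at three projects; in practice, as Scott notes, the real answer depends on the tool, your in-house resources, and the local price of a performance tester.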

If you choose in-house testing, Scott has a few recommendations for evaluating potential tools. Foremost is that you demo the tool against your own application, particularly if your product uses unusual or new technology.

"Virtually every tool out there does a good job against the type of application it was originally designed to test, but we develop with cutting-edge technology," he says. "Make sure that the tool that you're getting supports your application."

Also, don't pay for unnecessary features and support for communications protocols that you'll never use. Finally, be sure that your performance tester can, as Scott says, "make that tool sing"--let the tester guide the tool-evaluation process.

Of course, as with most software tools, one must also weigh commercial against open source. The big differences, Scott says, are that each open source tool works against only a limited number of applications or communications protocols, and that open source tools tend to lack what he calls the "nice-to-haves"--extras or add-on features.

"When it comes to actually generating load, the difference in accuracy and in response time from enterprise pay tools and free open source--that's not the point to compare," he says. "Obviously some free and open source tools are going to be better than others. Some enterprise tools are better than others for your particular application. At that point, accuracy of load and response time is something that you need to evaluate against your particular application, as opposed to generally across whether you're paying for a tool or not."

Scott sees brighter days ahead for the field of performance testing. He believes the market no longer views performance testing as a nice-to-have option, but rather as a necessity. The industry, he says, "is finally starting to accept as a whole that performance testing isn't just functional testing on steroids or in 3-D. It is a unique set of skills."

To read more of what Scott Barber has to say, search for his articles on StickyMinds.

Scott Barber
President & Chief Technologist, PerfTestPlus, Inc.

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing and How to Reduce the Cost of Testing

"If you can see it in your mind...
      you will find it in your life."
