Monday, September 17, 2007

Gentlemen, Start Your Engines!!!

My most recent column, inspired by a surprise trip to the Brickyard 400, has just been posted on TechTarget. In it, I discuss the distinction between "delivery" and "done" when it comes to testing the performance of software systems.
Countless hours of development are now in the past. Testing indicates that everything is ready for the big day. The whole team is on hand, and the world is watching. It's the moment of truth; time to find out if all of the hard work is going to pay off. Anticipation builds until the command is given…
"Gentlemen, start your engines!"
The cars come to life. They take a few pace laps and at last, the green flag drops. In fewer than 90 seconds the cars are back on the front stretch approaching speeds of 200 mph -- the pinnacle of stock car performance.
This summer I worked on a project in Indianapolis. Usually when I travel to remote client sites I fly home on the weekends, but there was one weekend that I chose to stay. I chose to stay for two reasons. First, the flights for that weekend were insanely expensive and second, I have some friends in Indianapolis whom I'm always happy to have an excuse to visit. As luck would have it, the flights were expensive because that was the weekend of the Brickyard 400, and one of the friends I wanted to spend time with had a spare ticket, which I shamelessly accepted when he offered.
During the pomp and circumstance leading up to the start of the race I realized what a fabulous example the race was of one of my most-quoted sound bites related to performance testing: "Don't confuse delivery with done."
 
See the column for the rest of the story.
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Saturday, August 25, 2007

Model Workloads for Performance Testing: FIBLOTS

This is the third installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
 
Other posts about performance testing heuristics and mnemonics are:

  • Performance Testing Core Principles: CCD IS EARI
  • Classify Performance Tests: IVECTRAS

For years, I have championed the use of production logs to create workload models for performance testing. During the same period, I've been researching and experimenting with methods to quickly create "good enough" workload models that increase the value of performance tests even without empirical data. I recently realized that these two ideas are actually complementary, not mutually exclusive, and that with or without empirical usage data from production logs, I do the same thing, I:
 
FIBLOTS.
While the play on words makes this mnemonic particularly memorable, I'm not saying that I just make it up. Rather, the acronym represents the following guideword heuristics, which have served me well over the years in deciding what to include in my workload models.
 
  • Frequent: Common application usage.
  • Intensive: Resource-hogging activities.
  • Business Critical: Activities that matter to the business, even if they are rare and otherwise low risk.
  • Legal: Stuff that will get you sued or not paid.
  • Obvious: Stuff that is likely to earn you bad press.
  • Technically Risky: New technologies, old technologies, places where it's failed before, previously under-tested areas.
  • Stakeholder Mandated: Don't argue with the boss (too much).
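As an illustration of how these guidewords can feed a workload model, here is a minimal Python sketch. It is my own example, not from the original post: the activity names, traffic shares, and guideword tags are all hypothetical.

```python
import random

# Hypothetical workload model: (activity, share of user sessions,
# FIBLOTS guidewords that justified including it). Shares must sum to 1.0.
WORKLOAD = [
    ("search_catalog",  0.55, {"Frequent"}),
    ("submit_order",    0.25, {"Frequent", "Business Critical", "Legal"}),
    ("generate_report", 0.10, {"Intensive", "Technically Risky"}),
    ("export_audit",    0.10, {"Legal", "Stakeholder Mandated"}),
]

# Sanity check: the shares describe 100% of the simulated traffic.
assert abs(sum(share for _, share, _ in WORKLOAD) - 1.0) < 1e-9

def next_activity(rng=random):
    """Pick the next activity for a simulated user, weighted by share."""
    names, shares, _tags = zip(*WORKLOAD)
    return rng.choices(names, weights=shares, k=1)[0]
```

Even without production logs, tagging each activity with the guideword(s) that earned it a slot makes the model reviewable: anyone can ask why an activity is, or isn't, in the mix.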

Friday, August 3, 2007

Classify Performance Tests: IVECTRAS

This is the second installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
 
Other posts about performance testing heuristics and mnemonics are:

  • Performance Testing Core Principles: CCD IS EARI
  • Model Workloads for Performance Testing: FIBLOTS

I have struggled for over 7 years now with first figuring out, and then trying to explain, all the different "types" of performance tests. You know the ones:
 
  • Performance Test
  • Load Test
  • Stress Test
  • Spike Test
  • Endurance Test
  • Reliability Test
  • Component Test
  • Configuration Test
  • {insert your favorite word} Test
 
Well, I finally have an alternative.
 
IVECTRAS
IVECTRAS is valuable for classifying performance tests (or test cases, if you like that term better) and performance test objectives. Better still, it is easy to map to Criteria, Requirements, Goals, Targets, Thresholds, Milestones, Phases, Project Goals, Risks, Business Requirements, Scripts, Suites, Test Data, etc. Better yet, you can use it as a heuristic to help determine performance testing objectives and performance test design. So what is it?
 
To determine, design, or classify a performance test or test objective, ask: is this an
 
INVESTIGATION or VALIDATION
of END-TO-END or COMPONENT
response TIMES and/or RESOURCE consumption
under ANTICIPATED or STRESSFUL conditions
 
For me (and my clients, since I came up with this) there is a lot less confusion in saying "We need to INVESTIGATE COMPONENT-level RESOURCE consumption for the application server under STRESSFUL conditions" than in saying "We need to do a unit stress test against the application server." Even if there are still questions to be answered after applying IVECTRAS, at least those questions should be more obvious -- and if nothing else, *that* adds value for me.
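To show how the four axes compose, here is a minimal Python sketch. This is my own illustration; the class and field names are invented and are not part of the mnemonic itself.

```python
from dataclasses import dataclass

# IVECTRAS as four axes. TIMES and RESOURCE can both apply ("and/or"),
# so that axis is a set; the other three axes are single choices.
@dataclass(frozen=True)
class PerfTestObjective:
    intent: str          # "INVESTIGATION" or "VALIDATION"
    scope: str           # "END-TO-END" or "COMPONENT"
    measures: frozenset  # non-empty subset of {"TIMES", "RESOURCE"}
    conditions: str      # "ANTICIPATED" or "STRESSFUL"

    def describe(self) -> str:
        return (f"{self.intent} of {self.scope} "
                f"{' and '.join(sorted(self.measures))} "
                f"under {self.conditions} conditions")

# The "unit stress test" from the example above, restated unambiguously:
obj = PerfTestObjective("INVESTIGATION", "COMPONENT",
                        frozenset({"RESOURCE"}), "STRESSFUL")
# obj.describe() → "INVESTIGATION of COMPONENT RESOURCE under STRESSFUL conditions"
```

The point of the structure is that every classified objective answers all four questions, so nothing is left implicit the way it is in labels like "spike test" or "endurance test."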

Monday, July 30, 2007

Hourly Rant...

I just finished answering a question posted on LinkedIn by Esther Schindler, who is researching an article she is working on for CIO.com.

She asks (summarized):

"There's just one question to answer: If you could get the (client) boss(es) to understand JUST ONE THING about computer consulting and contracting, what would it be?

Or, to put the same question another way: If you were given a single wish of something to change (about a current or past client) what would it be?"

My response (lightly edited from the original):

Monday, June 18, 2007

Software Testing Lessons from my Children

My most recent column has just been posted on TechTarget, in which I discuss some of the lessons I've learned from my children about software testing.
I had planned an entirely different topic for this month, but I'm sitting down to write this on Father's Day while my sons (Nicholas, age 8, and Taylor, age 4) are napping, and realizing that I've never written about what I have learned about testing from my boys.
Before I share some of these lessons, let me first share a little about me and fatherhood. For all of the dedication, time, and passion I give to my career, it is not even comparable to the dedication, time, and passion I give to my boys. For example, I stopped consulting for a while so I could see my boys every day when they were young, because I couldn't stand the thought of being on the road for their first steps, new words, and all of the other developmental wonders that occur on almost a daily basis during the first several years of life. When I went back to consulting, I started my own company—not because I wanted to run a company, but because I didn't want to have to answer to anyone else when I chose not to travel during baseball season so I could coach my son's team. In the same spirit, when I work from home, I frequently do so in a room with my boys, who are naturally curious about what I'm doing. Over the past few years of this, I've learned a lot about being a good tester from them. Some of the most significant lessons are these:
    • Don't be afraid to ask "Why?"
    • Exploratory play is learning
    • Recording your testing is invaluable
    • "Intuitive" means different things to different people
    • Fast enough depends on the user
    • You can never tell what a user may try to do with your software
    • Sometimes the most valuable thing you can do is take a break
 
See the column for more on each of these lessons.
 

Monday, May 21, 2007

Performance Testing Core Principles: CCD IS EARI

This is the first installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
 
Other posts about performance testing heuristics and mnemonics are:

  • Classify Performance Tests: IVECTRAS
  • Model Workloads for Performance Testing: FIBLOTS

There is no "one-size-fits-most" approach to performance testing, but I have become rather convinced that there are nine principles that are (almost always) applied, or at least actively considered, in successful performance testing projects. I remember those principles with:
 
CCD IS EARI
  • Context: Project context is central to successful performance testing.
  • Criteria: Business, project, system, & user success criteria.
  • Design: Identify system usage, and key metrics; plan and design tests.
  • Install: Install and prepare environment, tools, & resource monitors.
  • Script: Implement test design using tools.
  • Execute: Run and monitor tests. Validate tests, test data, and results.
  • Analyze: Analyze the data individually and as a cross-functional team.
  • Report: Consolidate and share results, customized by audience.
  • Iterate: "Lather, rinse, repeat" as necessary.
 
For more see Developing an approach to performance testing -- CCD IS EARI.
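The nine principles above can be sketched as an ordered, iterative checklist. This is purely my own illustration of the mnemonic as data; nothing here comes from the original post.

```python
# Illustrative only: CCD IS EARI as an ordered checklist.
PRINCIPLES = ["Context", "Criteria", "Design", "Install",
              "Script", "Execute", "Analyze", "Report", "Iterate"]

def remaining(done=()):
    """Principles still to address this pass, preserving order.

    Each pass through the list ends at Iterate, which sends you back
    through the checklist ("lather, rinse, repeat") as necessary.
    """
    done = set(done)
    return [p for p in PRINCIPLES if p not in done]
```

The only real point of the sketch is that the order matters and the last step loops: a "finished" pass just seeds the next one.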

Friday, March 30, 2007

Five Questions with Jon Bach, by the Braidy Tester

I met Jon about 3 years ago. It was a funny story, actually. I was at STAREast talking with a bunch of folks at the bar after the last presentation of the day. Some guy came over and introduced himself to the person sitting next to me.

I heard his name and I stopped, mid-word, stood up excitedly, started shaking his hand and talking a mile-a-minute...

(Scott) "OhMyGod! Jon Bach! I'mSoExcitedToMeetYou! IReadYourBookAnd... I'm sorry, my name is Scott Barber, I've done some work with your brother..."
(Jon) "Wait! JamesToldMeAboutYou! You'reThePerformanceGuy! IRunATestingLabInSeattleAnd... How about we sit at the bar, I'll buy you a beer."

Jon has been one of my best friends ever since. Oh yeah, he's also one of the best testers and teachers of testers I've ever met. If you don't know Jon -- or even if you do -- Michael Hunter posted Five Questions With Jon Bach today. Take a look, it's a good read. 

Thursday, March 22, 2007

Custom Performance Testing Search Engine

About 24 hours ago, Google Co-op publicly released the ability for folks to make their own CSEs (Custom Search Engines). From the site:

Harness the power of Google search
Create a highly specialized Custom Search Engine that reflects your knowledge and interests. Place it on your website and, using our AdSense for Search program, make money from the resulting traffic.

See examples of how a Custom Search Engine works.

What you can do with a Custom Search Engine
  • Place a search box and search results on your website.
  • Specify or prioritize the sites you want to include in searches.
  • Customize the look and feel to match your website.
  • Invite your community to contribute to the search engine.

Sound cool? I thought so. So cool, in fact, that it threw me into a fit of ADD obsession. It took me about 21 of the last 24 hours, but now there is a CSE just for performance testers. Just think about it: no more results for tuning sports cars, training for a marathon, or measuring employee productivity when searching for material related to software performance testing.

Check it out and let me know what you think!
 

Wednesday, January 31, 2007

Resumes in Context

On a forum related to James Bach's Rapid Software Testing On-line (Beta) class (which I highly recommend! A few more technical issues to work out and it should be ready for prime time), another student (Anne Marie Martin, from Atlanta) posted the following (lightly edited):
Here's something I struggle with though, and would love to hear thoughts on. I have about 11 years experience in testing, and try to invest time in learning more about testing, and learning more in general that can help me with testing - such as the things we've all been discussing about philosophy and learning and Weinberg and a hundred other things that have tickled my brain during our discussions and threads that made my 'to do' list of things to read or explore or learn from.

Monday, November 20, 2006

What Best Practices really are. -- CIO Article

Of all the places I expected to find an article supporting the idea that "Best Practices" is nothing more than a square on someone's buzzword bingo card, CIO wasn't it. The highlights are these...
Using celebs for endorsements has become such best practice that everyone does it. So what is best practice about it? Nothing. The phrase is simply a demonstration of how cliched business language dresses up the concept of copying something someone else has done. And when lots of companies copy the copier, it becomes dull, intellectually stagnant and offers no competitive advantage. It's just a me-too strategy executed by the cynical, the lazy, or the lazy cynics.

Friday, November 17, 2006

Happy About Global Software Test Automation

I just posted this review for Hung Nguyen's new book on Amazon. All you testers and test managers out there, slip this book under your boss's door when they aren't looking and watch how quickly the company starts embracing and respecting software testing!

***

Happy About Global Software Test Automation: A Discussion of Software Testing for Executives is an absolute must read for any executive in a company that develops, customizes or implements software.

Sunday, November 12, 2006

Modeling Application Usage Visually, Google Tech Talk

Some folks have said that I should get this on my blog, so here it is. If you like it, rate it... if you don't... umm... well... let your conscience be your guide. ;)

Modeling Application Usage Visually

Play Video

Description: Google TechTalks, April 24, 2006. Scott Barber is the CTO of PerfTestPlus, Inc. and Co-Founder of the Workshop on Performance and Reliability (WOPR). Scott's particular specialties are testing and analyzing performance for complex systems, developing customized testing methodologies, group facilitation, and authoring instructional materials.

Abstract: Modeling application usage is more than just parsing log files and calculating page frequencies. Whether we are analyzing navigation path effectiveness, planning for scenario testing, documenting performance test workload models, or mapping services or objects to user activity, having a single, intuitive picture to reference makes the job easier. In this session, we'll explore a highly adaptable method for visualizing application usage and how to use this model to improve cross-functional team communication without requiring team members to invest time learning some new fad of a modeling language that they'll probably never use again. This method references UCML™, which has been described as "what collaboration diagrams should have been."

Wednesday, November 8, 2006

4-Second Rule?

It looks like JupiterResearch has finally done away with the 8-second rule in favor of a 4-second rule. I want to point something out right up front... This new "rule" is based on a survey that asks the following question...
Question: Typically, how long are you willing to wait for a single Web page to load before leaving the Web site? (Select one.)
A. More than 6 seconds.
B. 5-6 seconds.
C. 3-4 seconds.
D. 1-2 seconds.
E. Less than 1 second.
Sorry -- I promise that if we sat down with your respondents and asked them to identify how many seconds various pages took to load, MOST of them would get it wrong, and MOST of those would *think* a page takes longer to load than it actually does. Review the report for yourself here: http://www.akamai.com/html/about/press/releases/2006/press_110606.html
 

Wednesday, November 1, 2006

How to Ask (and Not Ask) for Free Consulting

James Bach has posted a great blog about how to and how not to ask industry leaders for assistance.

http://www.satisfice.com/blog/archives/70

This rang true with me and my experiences, but some folks seemed to find his perspective to be arrogant or rude. Below I've copied a representative quote and my response.

But the way he handled it, and because I know that James Bach is a very experienced person in answering forum like questions, it looks as if Bach planed it all and maneuvered the poor guy to this corner, maybe to show him how he should behave. The way Bach handled it is IMHO was one of the worse that I have seen. Instead of getting healthy results (the guy understands his mistake, apologizes and learns from it) it looks like Bach did what ever he could to insult the guy in order to get that kind of reaction. I can learn a lot from James Bach but I am not going to take this approach as a good example to learn from. As Linda said, it doe’s him no credit. 

I have to disagree. I admit that I consider Jim to be a close personal friend. I further admit that my first impression of James Bach was that he was a pompous ass. It was only after meeting him that I came to absolutely adore conversing with him for all the reasons that can be taken as "pompous ass" to anyone who approaches him with defensiveness and self-righteousness.

Friday, October 6, 2006

Thoughts on Certification

 I got an email asking a question about certification that I thought others might find interesting.

Hello,

I'm new to the QA arena, and haven't found a mentor yet, beyond the publications of those like yourselves. So far, I don't see that there is one internationally accepted certification for QA in general. I know there is the CSTE http://www.softwarecertifications.org/, and the ISTQB http://www.sqe.com/certification.asp?f=dis&ci=stf , which at least one of you worked on. My perception is that CSTE is a bit more accepted (when I search dice.com for both acronyms, I get a few more for CSTE, but still not many), but otherwise it's certification specific to tools like WinRunner or languages.


I have found also that CompTIA ( www.comptia.org) suggests these:

CompTIA A+
CompTIA e-Biz+
CompTIA i-Net+
CompTIA Server+
Certiport's Internet & Computing Core Certification - IC³

But no one else inside QA seems to have heard of them. Are they helpful, or is CompTIA just trying to earn money? Are there general certs that help QA?

Thanks for any insight you can offer! Keep up the great QA work!

and my response...

Wednesday, July 26, 2006

HP to buy Mercury Interactive

On Tuesday, 7/25/2006, CNNMoney.com (along with *many* others) broke the news that the rumored HP/Mercury deal is really happening. A summary and my reaction are below. See the entire release here and draw your own conclusions.
July 26 2006: 9:22 AM EDT
NEW YORK (Reuters) -- Hewlett-Packard agreed on Tuesday to buy Mercury Interactive for about $4.5 billion in stock, or $52 per share, in a bid to expand the computer maker's business software operations.
The deal, which sent shares of the No. 2 personal computer maker down 4 percent, should help boost sales of HP's (Charts) OpenView systems management software, which makes it easier for far-flung businesses to monitor the hardware, software and networks running throughout their organizations.
The purchase of the former star Israeli technology company also puts HP in closer competition with other major systems management software providers, including IBM's Tivoli unit, CA Inc.'s UniCenter and BMC Software.
Since last year, a number of top Mercury executives have left amid a regulatory probe into its stock option granting practices. The financial scandal drove Mercury, once a top performing stock, to delist from the Nasdaq market.
Folks, you may not realize it, but this is major. From the beginning of the "Dot-Com Era" until about a year ago, over 75% (up to 90%, depending on which year and which report you read) of the total revenue in the test automation and test management tools market went to Mercury, Rational, and Segue. Over the last 13 months this seemingly stable market has been turned on its head:

Friday, July 14, 2006

Choosing Performance Testing with Scott Barber (Stickyminds interview reprint)

A Word with the Wise:
Choosing Performance Testing with Scott Barber
by Joseph McAllister

Every kid eventually puts some thought into the question "What do you want to be when you grow up?" For PerfTestPlus CTO Scott Barber, who specializes in context-driven performance testing and analysis for distributed multi-user systems, the answer was not "performance tester." He planned to follow in the footsteps of his father, an industrial arts teacher, and sought an ROTC-scholarship-funded degree in civil engineering. In his junior year of college, though, Scott learned that his first years with the Army Corps of Engineers would involve digging foxholes for infantry rather than building bridges with the Seabees.

"I decided that if I was going to be crossing the front lines, I'd much rather be carrying heavy weaponry than heavy shovels," he says.

Thursday, June 29, 2006

Paint the room heuristic

The other day, my wife asked me if I could finish painting the bedroom before my conference call in 90 minutes. Naturally, I said that I could and like a good husband, I immediately got started. It wasn't until my phone rang that I realized that I hadn't made it in time. Luckily enough, it was no problem to delay the call by 30 minutes.

While I was finishing up, I realized what had happened. When my wife asked me if I could accomplish the painting in a certain amount of time, my thought process was...
  1. If I do it now, it will make her happy.
  2. If it takes a little too long, the worst that will happen is that she'll be a little grumpy until I finish, but once I'm done she'll be happy.
  3. Once I start, no one is actually going to make me stop before I'm finished... I mean, who wants a mostly painted room?!?
And I completely overlooked the fact that delaying the phone call could be problematic.

Sunday, April 9, 2006

Tester thinking...

Say you were given the following requirements...

  •   Users shall be able to enter any of nine predefined data objects
  •   User interface shall consist of nine blocks of three rows and three columns
  •   Each row, column and/or block shall accept only one member of each data object

What am I describing?

Sunday, April 2, 2006

Why all the hype about SOA & Testing?

I've been working on a Webinar and article about testing SOA... because I've been asked to... because SOA is all the rage or something. So what's the big deal?!? An object based on a business process is called a Service... Ok. There are competing "standards" for communication protocols for services... Ok. There are SOA Management Software packages that do what middleware has always done... Ok. Services are assumed to be remote and developed by someone else... Ok. And?

What's the new part? What *haven't* we had to deal with before? What *haven't* we had to deal with in combinations before?

Am I just WAAAAAAAAAY out of the loop, or is this 90% hype and 10% problems we've been dealing with for *at least* 6 years working their way into new places?

Oh well, back to the article... maybe I'll come up with something more useful to say in it.
