Showing posts with label Heuristics.

Thursday, October 13, 2011

Top 10 Automation Tips from STP Online Summit

I had the pleasure of hosting the second Online Summit delivered by Software Test Professionals: Achieving Business Value With Test Automation. The online summit format consists of three sessions on each of three consecutive days. The sessions for this summit were:
One of my duties as host was to try to summarize the most valuable nuggets of information from across all of the presentations into a "top tips" list.  This is what I came up with:

Scott's Top 10 Automation Tips from:


Monday, August 1, 2011

Performance Testing Practice Named During Online Summit

Last week, I hosted STP's Online Performance Summit: a live, interactive webinar of nine sessions spread across three half-days. As far as I know, this was the first multi-presenter, multi-day, live webinar by testers for testers. The feedback I have seen from attendees and presenters has all been very positive, and personally, I think it went very well. On top of that, I had a whole lot of fun playing "radio talk show host".

The event sold out early at 100 attendees, and more folks wanted to attend but were unable to. Since this was an experiment of sorts in terms of format and delivery, we committed to the smallest and least expensive level of service from the webinar technology provider, and by the time we realized we had more interest than "seats", it was simply too late to make the service changes needed to accommodate more folks. We won't be making that mistake again for our next online summit, to be held October 11-13 on the topic of "Achieving Business Value with Test Automation". Keep your eyes on the STP website for more information about that and other future summits.

With all of that context, now to the point of this post. During Eric Proegler's session (Strategies for Performance Testing Integrated Sub-Systems), a conversation emerged in which it became apparent that many performance testers conduct some kind of testing that involves real users interacting with the system under test while a performance/load/stress test is running, for the purposes of:
  • Linking the numbers generated through performance tests to the degree of satisfaction of actual human users.
  • Identifying items that human users classify as performance issues that do not appear to be issues based on the numbers alone.
  • Convincing stakeholders that the only metric we can collect that is conclusively linked to user satisfaction with production performance is the percentage of users who are satisfied with performance under production conditions.
The next thing that became apparent was that everyone who engaged in the conversation called this something different. So we didn't do what one would justifiably expect a bunch of testers to do (i.e., have an ugly argument about whose term came first and which is more correct, continuing until no decision is made and all goodwill is lost). Instead, we held a contest to name the practice. We invited the speakers and attendees to submit their ideas, from which we'd select a name for the practice. The stakes were that the submitter of the winning entry would receive a signed copy of Jerry Weinberg's book Perfect Software, and that the speakers and attendees would use and promote the term.

The speakers and attendees submitted nearly 50 ideas. The speakers voted that list down to their top 4, and then the attendees voted for their favorite. In a very close vote, the winning submission from Philip Nguyen was User Experience Under Load (congratulations Philip!).

Saturday, January 3, 2009

A misleading benchmark...

No further commentary needed.

[Dilbert comic, via Dilbert.com]
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Saturday, August 25, 2007

Model Workloads for Performance Testing: FIBLOTS

This is the third installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
 
Other posts about performance testing heuristics and mnemonics are:
 
 
For years, I have championed the use of production logs to create workload models for performance testing. During the same period, I've been researching and experimenting with methods to quickly create "good enough" workload models, without empirical data, that still increase the value of the performance tests. I recently realized that these two ideas are actually complementary, not mutually exclusive, and that with or without empirical usage data from production logs, I do the same thing. I:
 
FIBLOTS.
While the play on words makes this mnemonic particularly memorable, I'm not saying that I just make the workload up. Rather, the acronym represents the following guideword heuristics, which have served me well over the years in deciding what to include in my workload models (a rough sketch of putting them to use follows the list).
 
  • Frequent: Common application usage.
  • Intensive: Resource-hogging activities.
  • Business Critical: Include these even if they are both rare and otherwise low-risk.
  • Legal: Stuff that will get you sued or not paid.
  • Obvious: Stuff that is likely to earn you bad press.
  • Technically Risky: New technologies, old technologies, places where it has failed before, previously under-tested areas.
  • Stakeholder Mandated: Don't argue with the boss (too much).
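 
For those who prefer to see it in code, here's a minimal, purely illustrative sketch of treating the guidewords as a checklist when deciding which candidate scenarios earn a place in a workload model. The scenario names and tags below are hypothetical examples, nothing more:

    from dataclasses import dataclass, field

    # Hypothetical guideword tags; purely illustrative, not from any real tool.
    FIBLOTS_GUIDEWORDS = {
        "frequent",              # common application usage
        "intensive",             # resource-hogging activities
        "business_critical",     # must be modeled even if rare and low-risk
        "legal",                 # could get you sued or not paid
        "obvious",               # likely to earn you bad press
        "technically_risky",     # new/old tech, past failures, under-tested areas
        "stakeholder_mandated",  # the boss said so
    }

    @dataclass
    class Scenario:
        name: str
        reasons: set = field(default_factory=set)  # which guidewords apply

        def include_in_workload(self) -> bool:
            # A scenario earns a place in the model if any guideword applies.
            return bool(self.reasons & FIBLOTS_GUIDEWORDS)

    candidates = [
        Scenario("search catalog", {"frequent"}),
        Scenario("month-end billing run", {"intensive", "business_critical"}),
        Scenario("export audit report", {"legal"}),
        Scenario("change avatar color", set()),  # nothing applies; leave it out
    ]

    workload = [s.name for s in candidates if s.include_in_workload()]
    print(workload)  # ['search catalog', 'month-end billing run', 'export audit report']

In practice, the "frequent" and "intensive" tags are exactly where production logs and resource monitoring come in when they are available, which is why the two approaches end up being complementary rather than exclusive.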

Friday, August 3, 2007

Classify Performance Tests: IVECTRAS

This is the second installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
 
Other posts about performance testing heuristics and mnemonics are:
 
 
I have struggled for over 7 years now with first figuring out and then trying to explain all the different "types" of performance tests. You know the ones:
 
  • Performance Test
  • Load Test
  • Stress Test
  • Spike Test
  • Endurance Test
  • Reliability Test
  • Component Test
  • Configuration Test
  • {insert your favorite word} Test
 
Well, I finally have an alternative.
 
IVECTRAS
IVECTRAS is valuable for classifying performance tests (or test cases, if you like that term better) and performance test objectives. Better still, it is easy to map to Criteria, Requirements, Goals, Targets, Thresholds, Milestones, Phases, Project Goals, Risks, Business Requirements, Scripts, Suites, Test Data, etc. Better yet, you can use it as a heuristic to help determine performance testing objectives and performance test designs. So what is it?
 
To determine, design, or classify a performance test or test objective, ask: is this an
 
INVESTIGATION or VALIDATION
of END-TO-END or COMPONENT
response TIMES and/or RESOURCE consumption
under ANTICIPATED or STRESSFUL conditions
 
For me (and for my clients, since I came up with this) there is a lot less confusion in saying "We need to INVESTIGATE COMPONENT level RESOURCE consumption for the application server under STRESSFUL conditions" than in saying "We need to do a unit stress test against the application server." Even if there are still questions to be answered after applying IVECTRAS, at least those questions should be more obvious -- and if nothing else, *that* adds value for me.
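 
If it helps to see the axes laid out in code, here's a minimal sketch representing each IVECTRAS axis as an enumeration, with a small helper that phrases an objective the way I do above. The class and function names are just my illustration, not part of the heuristic:

    from enum import Enum

    class Purpose(Enum):
        INVESTIGATION = "investigate"
        VALIDATION = "validate"

    class Scope(Enum):
        END_TO_END = "end-to-end"
        COMPONENT = "component-level"

    class Measure(Enum):
        TIMES = "response times"
        RESOURCES = "resource consumption"

    class Conditions(Enum):
        ANTICIPATED = "anticipated"
        STRESSFUL = "stressful"

    def describe(purpose: Purpose, scope: Scope, measure: Measure,
                 conditions: Conditions, target: str) -> str:
        # Compose the objective as a plain sentence, one choice per axis.
        return (f"We need to {purpose.value} {scope.value} {measure.value} "
                f"for {target} under {conditions.value} conditions.")

    # The example from above, restated via the heuristic:
    print(describe(Purpose.INVESTIGATION, Scope.COMPONENT, Measure.RESOURCES,
                   Conditions.STRESSFUL, "the application server"))

Note that the third axis is really "and/or": a single objective may cover response times, resource consumption, or both, so a fuller sketch would carry a set of measures rather than a single value.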

Monday, May 21, 2007

Performance Testing Core Principles: CCD IS EARI

This is the first installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
 
Other posts about performance testing heuristics and mnemonics are:
 
 
There is no "one-size-fits-most" approach to performance testing, but I have become rather convinced that there are nine principles that are (almost always) applied, or at least actively considered, in successful performance testing projects. I remember those principles with the mnemonic:
 
CCD IS EARI
  • Context: Project context is central to successful performance testing.
  • Criteria: Business, project, system, & user success criteria.
  • Design: Identify system usage, and key metrics; plan and design tests.
  • Install: Install and prepare environment, tools, & resource monitors.
  • Script: Implement test design using tools.
  • Execute: Run and monitor tests. Validate tests, test data, and results.
  • Analyze: Analyze the data individually and as a cross-functional team.
  • Report: Consolidate and share results, customized by audience.
  • Iterate: "Lather, rinse, repeat" as necessary.
 
For more see Developing an approach to performance testing -- CCD IS EARI.
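 
As a small, hypothetical illustration (the names below are mine, not from that article), the nine principles can double as a review checklist, flagging which of them a draft test plan has not yet actively considered:

    # Hypothetical sketch: CCD IS EARI as a review checklist for a test plan.
    CCD_IS_EARI = ("Context", "Criteria", "Design", "Install", "Script",
                   "Execute", "Analyze", "Report", "Iterate")

    def gaps(plan_covers: set) -> list:
        # Return the principles the plan has not (yet) actively considered.
        return [p for p in CCD_IS_EARI if p not in plan_covers]

    # A plan that jumps straight to scripting and running tests:
    print(gaps({"Script", "Execute", "Report"}))
    # ['Context', 'Criteria', 'Design', 'Install', 'Analyze', 'Iterate']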

Thursday, June 29, 2006

Paint the room heuristic

The other day, my wife asked me if I could finish painting the bedroom before my conference call in 90 minutes. Naturally, I said that I could, and, like a good husband, I immediately got started. It wasn't until my phone rang that I realized I hadn't made it in time. Luckily, it was no problem to delay the call by 30 minutes.

While I was finishing up, I realized what had happened. When my wife asked me if I could accomplish the painting in a certain amount of time, my thought process was...
  1. If I do it now, it will make her happy.
  2. If it takes a little too long, the worst that will happen is that she'll be a little grumpy until I finish, but once I'm done she'll be happy.
  3. Once I start, no one is actually going to make me stop before I'm finished... I mean, who wants a mostly painted room?!?
  4. And, of course, I completely overlooked the fact that delaying the phone call could be problematic.