Monday, December 5, 2011

10 Must-Know Tips for Performance Test Automation

More than other kinds of test automation, bad performance test automation leads to:
  • Undetectably incorrect results
  • Seemingly good release decisions based on bad data
  • Surprising, catastrophic failures in production
  • Incorrect hardware purchases
  • Extended down-time
  • Significant media coverage and brand erosion
More than other kinds of test automation, performance test automation demands:
  • Clear objectives (not pass/fail requirements)
  • Valid application usage models
  • Detailed knowledge of the system and the business
  • External test monitoring
  • Cross-team collaboration
Unfortunately, bad performance test automation is:
  • Very easy to create,
  • Difficult to detect, and
  • More difficult to correct.
The following 10 tips, based on my own experiences, will help you avoid creating bad performance test automation in the first place.

Tip #10: Data Design
  • *Lots* of test data is essential (at least 3 sets per user to be simulated – 10 is not uncommon)
  • Test data should be unique and minimally overlapping (updating the same row in a database 1,000 times has a different performance profile than updating 1,000 different rows)
  • Consider changed/consumed data (without careful planning, a search will return different results, and an item to be purchased may be out of stock)
  • Don’t share your data environment (see above)
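To make the data-design tip concrete, here's a minimal sketch (my own illustration, not from the original post; the file layout and field names are hypothetical) of generating unique, partitioned test data so that simulated users never collide on the same records:

```python
import csv
import itertools

def generate_test_data(num_users, sets_per_user, path="test_data.csv"):
    """Write unique, non-overlapping data rows: sets_per_user rows per
    simulated user, so no two virtual users ever touch the same record."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["vuser_id", "account", "search_term"])
        seq = itertools.count(1)
        for vuser in range(num_users):
            for _ in range(sets_per_user):
                n = next(seq)
                # Every row is globally unique; partitioning rows by
                # vuser_id keeps overlap between concurrent users minimal.
                writer.writerow([vuser, f"acct{n:06d}", f"widget-{n}"])

# 100 simulated users x 3 data sets each = 300 unique rows
generate_test_data(num_users=100, sets_per_user=3)
```

Most load tools can feed a file like this back in per virtual user, which avoids the "1,000 updates to the same row" distortion described above.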

Tuesday, November 29, 2011

10 Things About Testing That Should Die

I've taken some heat for discussing the whole "is test dead" concept due to a feeling that I was validating the concept of testing being unnecessary. Allow me to clarify my position. I do not believe, for one heartbeat, that testing as an activity is in any way unnecessary. I do believe that there are things related to the current state of and common beliefs about testing that should die. With that said...

Scott Barber's Top 10 Things About Testing That Should Die: 

10. Egocentricity

Face reality, testers: neither the product nor the business revolves around you. Think about it. No business, no product, no developers => no need for testers. In fact, if developers wrote perfect code, there'd be no need for you. You are a service provider, and your primary clients are the managers, developers, and/or executives. Your secondary clients are product users and investors. So stop whining and stomping your feet when your clients don't make the decisions you'd like based on the information you provide. It's not your call. If you want it to be your call, get on track to become a project manager, product manager, or executive; otherwise, get right with the fact that you provide a service (hopefully a valuable one) and get back to providing it.

Tuesday, November 8, 2011

On the Alleged Death of Testing

Out of respect for your time, I'll give you the bottom line up front as a simulated interview that I privately hoped for, but never came. After the mock interview is a supporting narrative for those of you more interested in my thinking on the matter.

Q: There's been a lot of talk recently about testing being dead, so my first question is: is testing dead?
A: No.

Q: Some of those talking about the alleged death of testing are saying that it's not that testing as a whole is dead, but that testing as it is commonly understood today is dead. Is it?
A: No.

Q: Ok, so is testing as it is commonly understood today dying?
A: Not that I can see.

Q: Then why all the talk about testing "as we know it" being dead?
A: IMHO? Wishful thinking.

Thursday, October 27, 2011

WOPR 17, my takeaways


The Workshop On Performance and Reliability (WOPR) 17 was held Oct 20-22, 2011 on the theme of “Finding Bottlenecks”. This was a historic event in the sense that no other peer workshop inspired by LAWST has convened this many times. As a co-founder of WOPR, I’m (somewhat unreasonably) proud of this accomplishment. But the real achievement is that over the last 9 years so many folks have been so inspired by the community and value of WOPR that they have volunteered their time to plan and organize these events, their companies have donated meeting space (and often food & goodies), and participants have frequently paid their own way (sometimes taking vacation time) to attend. Seventeen events, one every 6 months since WOPR 1, is a significant achievement – whether or not my “founder’s pride” is justified.  :)

As is the tradition of WOPR, 20-25 folks, selected or invited by the “content owner” (a.k.a. the person or team who chose the theme to be explored this time), brought their personal experiences related to “Finding Bottlenecks” to share and explore with one another.  Also as is the tradition, certain patterns and commonalities emerged as these experiences were described and discussed. Everyone has their own take, there are no official findings, and I’m not even going to pretend that I can attribute all the contributing experiences and/or conversations to my takeaways below.

  • Finding bottlenecks can be technically challenging; examples include: 
    •  Analyzing the test & the data is far from straightforward 
    •  The “most useful” tools to narrow down the bottleneck may not be available – forcing us to be technically “creative” to work around those roadblocks.   
  • Finding bottlenecks can be *very* socio-politically challenging; examples include:
    • Lack of Trust (e.g. “That’s not a bottleneck, that’s the tool!”)
    • Denial (e.g. “It’s not possible that’s related to my code!”)
    • Lack of cross-team collaboration (e.g. “No, you can’t install that monitor on *our* system!”)
  • Sometimes human bottlenecks need to be resolved before technical bottlenecks can be found. (e.g. Perf Team being redirected, resources being re-allocated, excessive micromanagement, etc.)
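As one illustration of the kind of technical "creativity" the first takeaway refers to (this sketch and its data are hypothetical, not from any WOPR participant's report): when you aren't allowed to install monitors on the servers, per-tier timings pulled from whatever logs already exist can at least tell you where to look first.

```python
from statistics import median

def slowest_tier(samples):
    """Return (tier_name, per-tier medians) for per-request timing
    breakdowns, e.g. {"web": [...], "app": [...], "db": [...]} in
    seconds. A crude first pass at narrowing down a bottleneck when
    all you have is end-to-end log data."""
    medians = {tier: median(times) for tier, times in samples.items()}
    return max(medians, key=medians.get), medians

tier, medians = slowest_tier({
    "web": [0.02, 0.03, 0.02],
    "app": [0.10, 0.12, 0.11],
    "db":  [0.90, 1.40, 0.85],  # db dominates -> look here first
})
print(tier)  # db
```

It won't tell you *what* the bottleneck is, but it focuses the conversation on one tier, which also helps with the socio-political challenges in the second takeaway.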
Some other relevant and interesting topics came up (such as the frequent discrepancy between tester/technical goals & business goals), but since these weren’t “on theme” we didn’t discuss them deeply enough for me to draw any conclusions other than “the points and positions that did come up were consistent with what I would have anticipated if I’d thought about it in advance”, which, for me, is a nice confirmation.

My point in sharing these thoughts on finding bottlenecks is to let all the folks out there who feel like theirs is the only organization thwarted by socio-political challenges even more than technical ones know that they really aren’t alone.

The findings of WOPR17 are the result of the collective effort of the workshop participants: AJ Alhait, Scott Barber, Goranka Bjedov, Jeremy Brown, Dan Downing, Craig Fuget, Dawn Haynes, Doug Hoffman, Paul Holland, Pam Holt, Ed King, Richard Leeke, Yury Makedonov, Emily Maslyn, Greg McNelly, John Meza, Blaine Morgan, Mimi Niemiller, Eric Proegler, Raymond Rivest, Bob Sklar, Roland Stens, and Nishi Uppal.
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Monday, October 24, 2011

Best Ice Cream Practice

A twitter conversation from Friday, Oct 21...

@TesterAB Anna Baik: What's best practice for icecream? I don't know what flavour of icecream I should eat, and I'm afraid to get it wrong.
  • @skillinen Sylvia Killinen: @TesterAB Test ALL the ice cream, that way you'll know which one best satisfies conformance. :)
  • @adampknight Adam Knight: @TesterAB it's vanilla, if you're not eating vanilla you are doing it wrong. I'd suggest getting yourself CVM certified as soon as you can
    • @adampknight  Adam Knight: @sbarber @TesterAB we should be specific. I'll clarify in my "10 ways to check if you are truly vanilla" blog post #BestIceCreamPractice
    • @sbarber Scott Barber: @adampknight @TesterAB Certified Valuation Manager ™? No, no, no, that's only appropriate for *children's* icecream! #BestIceCreamPractice
  • @testingqa Guy Mason: @TesterAB Best to go for that which you most prefer at that point in time?
    • @TesterAB Anna Baik: @testingqa No no no. There must be one flavour of icecream that is best for everybody to eat at all points in time.
    • @TesterAB Anna Baik: @sbarber @testingqa Yes! None of this wishy-washy nonsense, I only want to eat the BEST flavour of icecream. #BestIceCreamPractice
    • @sbarber Scott Barber: @TesterAB @testingqa So chocolate, pistachio, lemon sorbet, raspberry swirl, topped with caramel & orange soda, right? #BestIceCreamPractice
    • @TesterAB Anna Baik: @sbarber @testingqa #BestIceCreamPractice Finally, someone who'll give me an answer! ...wait. How do I know you're qualified?
    • @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice I founded a non-profit to establish BICP qualification stds and issued myself a certification.
    • @TesterAB Anna Baik: @sbarber @testingqa #BestIceCreamPractice Sounds reassuring, I knew there'd be an Official Body somewhere to tell me what icecream to eat
    • @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice The invoice for my services are in the mail. $400/hr + $25,000 for the BICP flavor report.
    • @TesterAB Anna Baik: @sbarber @testingqa Eeek! Don't I even get something to show to people to prove I now know #BestIceCreamPractice?
    • @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice when check clears we mail you a Certified BICP Practitioner Certificate (suitable for framing)
Questions?  No? Didn't think so.  :)

 

Tuesday, October 18, 2011

Please, no new "certifications"

I just saw an advertisement on LinkedIn for Building a Certification Testing Program - Cutting through the hype to see how it really works, and I couldn't stop myself from adding the following comment:
Please make it stop. We don't need more "certification" programs -- not unless you are going to be the first organization that allows itself to be held legally and financially accountable when people you "certify" can't do what you "certified" they can.

Otherwise, conduct all the training you want. Assess student performance if you want. Only "pass" students who "pass" the assessment if you want.

Just do us all a favor and *STOP* calling it certification until you are willing to do things like:
  • reimburse hiring expenses to employers who hire folks you certified as being able to X who can't X
  • implement periodic re-assessment to enforce some bar of continued knowledge/skill/ability over time
  • implement some way to revoke certifications of folks who fail to demonstrate knowledge/skill/ability in the workforce
The list goes on, but I know it's pointless. The certification machine will continue no matter how loudly or how frequently I point out the ways in which it is (at least arguably) unethical and fraudulent - at least in "testerland."
Seriously, this drives me insane.  Others can make stands about content, assessment methods, etc. -- I have my opinions on those things, but honestly that part of the topic bores me.  People decide what university to attend, what to major in, what electives to take, etc., for their degree programs ... they can decide whether or not the content of some professional development program (with or without a "certification" attached) is worth their effort.  What I want to see is the "certifying bodies" being held accountable for complying with the claims they make about the individuals they "certify."

I mean, seriously, have any of you seen any data that you'd consider either statistically significant, empirical (vs. anecdotal), or free enough from obvious experimental design flaws to support the claims we see from "certifying bodies"?  If you have, please share the data with me and I'll list it here -- unless, of course, it's flawed, in which case I'd be happy to point out how and why the data doesn't support the conclusion.

Otherwise, please, please, please don't engage in creating more of these things.  Please.


Monday, October 17, 2011

Having lunch with a giant...

I "officially" started my career in software performance in Feb of 2000, only much later realizing I'd started down that path years prior.  In the fall of 2001 (10 years ago), I felt I was stagnating in my self-guided education and went on a hunt for books, articles, training, and/or people to learn from.  I found some peers (and eventually co-founded WOPR with Ross Collard to maximize peer learning), and I found 3 "giants" on whose shoulders I've stood since then (meaning all of my work was and has remained consistent with, complementary to, and/or extended from their work in the field).  Those "giants" are Connie Smith, Ph.D. (Software Performance Engineering), Daniel Menasce, Ph.D. (Capacity and Scalability Planning), and Alberto Savoia (Performance Testing).

Last fall, I had the honor of being on a panel with Connie and spending some time talking to Daniel during the CMG conference in Orlando.  I'd never spoken or corresponded with them before that, but it was nice to meet them and we had some great conversations.

Over the years, however, I have corresponded with Alberto Savoia.  As it turns out, he was moving on from software performance to what he would now call his next "it" just as I was becoming known in the industry, so we didn't converse often, but we did follow each other's careers.  During that time, I drew a lot of inspiration from Alberto: not just from the work he'd done in the software performance space, but also from his other accomplishments in technology, from the kind and complimentary recommendations he gave me, and from his graciously agreeing to write a foreword for Performance Testing Guidance for Web Applications when I asked.

So earlier this year when I had the chance, I dropped everything to review and comment on his new "it", Pretotyping. He said the review was helpful and that some of what I'd commented on would be included in the next version.

Today, I finally met Alberto face to face.  We had lunch.  We talked about projects & passions old and new, we recalled history and speculated about the future.  He gave me a signed copy of Pretotype It, and I gave him a signed copy of Web Load Testing for Dummies, both of which had been prepared in advance.  And while Alberto has accomplished far more in his technology career than I have, somehow I didn't feel like I was having lunch with the "giant" on whose shoulders most of the work I am known for stands; I felt like I was having lunch with an old friend I hadn't seen in too long.

To some of you, I suspect this seems a silly thing to make a big deal about.  But for a guy who left a small town twenty-some-odd years ago never imagining he'd meet anyone "famous", let alone become a "celebrity" of sorts in my (admittedly very small) field, it means a lot that someone I've often credited as a luminary would not only take the time to have lunch with me, but share thoughts and ideas with me like friends do.

So, thanks Alberto.  Thanks for the years of inspiration & thanks for the confirmation of friendship.  It means a lot to me, and know that you've provided me with an example I intend to follow with anyone I may inspire during my career and later have the opportunity to meet.



Thursday, October 13, 2011

Top 10 Automation Tips from STP Online Summit

I had the pleasure of hosting the second Online Summit, delivered by Software Test Professionals: Achieving Business Value With Test Automation.  The online summit format consists of 3 sessions each for 3 consecutive days.  The sessions for this summit were:
One of my duties as host was to try to summarize the most valuable nuggets of information from across all of the presentations into a "top tips" list.  This is what I came up with:

Scott's Top 10 Automation Tips from:


Wednesday, October 5, 2011

Web Load Testing for Dummies: Book Announcement



"More so now than ever before, your company’s website and web applications are critical to the success of your business initiatives. Think of all the business generated or sustained via the World Wide Web today compared to any other time in history — in today’s digital culture, a business with any sort of crucial web presence needs to make sure that its website is working hard for the business and not against it. That’s what web load testing is all about.

"Key to success on the web is customer experience, which means that web application performance is a priority. Not convinced? Spend a few moments thinking about the impact to your business (in other words, think about how angry the CEO and/or investors would be) if:
  ✓ Your new application launch is delayed due to performance problems
  ✓ Your site breaks under the load of your successful marketing promotion
  ✓ High-traffic volume causes such poor web performance on your busiest online shopping day that abandonment skyrockets and conversions plummet
  ✓ Your new infrastructure is configured improperly, grinding the website to a crawl

"Managers and executives of organizations that derive significant portions of their revenue from web applications realize that they need to focus more on protecting revenue, reducing risk, and ensuring that customers have great experiences. They see how web applications that perform well on release day and throughout their production lives strengthen the company’s brand and reputation, creating customer loyalty. In other words, web load testing is a critical component to any risk management plan for web applications."

Get the eBook version free here.


Tuesday, October 4, 2011

Is Junosphere the world's first cloud testing environment? Not really - IT in Context

I don't get too irked by companies coining new phrases to make subtle marketing distinctions in services, but when they do it so they can make first/best claims, it flips my bozo bit. Seriously, if your service is so bland or weak that you need to invent a new term so you can claim it's the "first/best thing called blah" without being called out for fraud, maybe you should just improve your service.


Monday, October 3, 2011

An overview of Performance Testing for Agile/Lean teams

I'm going to be giving a short webinar on Oct 20 titled "An overview of Performance Testing for Agile/Lean teams" as part of a really cool recurring online mini-conference/webinar series called "Bathtub Conferences".  Check out the website for more information.

http://bit.ly/nWIvzk
 

Stop Cheating and Start Running Realistic Tests

I did a webinar with SOASTA on 9/29/2011.  In case you missed it, I've copied the description and links from SOASTA's Info Center so you can have a look.  If the twitter-verse is to be believed, it didn't suck.  :)

--

Stop Cheating and Start Running Realistic Tests

Constrained by inflexible test hardware, poor tool scalability, exorbitant pricing models, and lack of real time performance information, performance testers have been forced to cheat for too long! Cloud Testing opens up elastic, full-scale load generation from global locations at affordable cost, rapid and accurate test building, and real time views of internal and external performance metrics.
  • Stop removing “think times” to work around technical or license issues
  • Build tests using real business workflow, not just a flood of page hits
  • Run tests that preserve session states and accurate timings, end-to-end
  • Inspect every component as tests run, not just from the outside-in
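Why stripping "think times" counts as cheating can be sketched with a little closed-workload arithmetic (a simplified model of my own, not SOASTA's methodology; the numbers are illustrative):

```python
import random

def think_time(mean=8.0, jitter=0.5):
    """Return a randomized pause (seconds) between user actions.
    Real users pause to read and type; randomizing the pause avoids
    artificial lock-step request patterns."""
    low, high = mean * (1 - jitter), mean * (1 + jitter)
    return random.uniform(low, high)

def effective_request_rate(vusers, mean_response=0.5, mean_think=8.0):
    """Approximate steady-state requests/sec for a closed workload:
    each virtual user cycles through (response time + think time)."""
    return vusers / (mean_response + mean_think)

# 100 virtual users with a realistic 8 s think time generate roughly
# 12 req/s; the same 100 users with think time stripped to 0 generate
# 200 req/s -- a load no real population of 100 users would produce.
print(effective_request_rate(100, mean_think=8.0))
print(effective_request_rate(100, mean_think=0.0))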
Watch the Webinar | Download the Webinar
 

Friday, September 30, 2011

Agile backlash series...

From SearchSoftwareQuality.com:

Agile backlash series: Exploring Agile development problems and solutions


 I think Jan Stafford did a great job on this series.  I don't agree with every opinion from everyone interviewed, but I wouldn't expect to.  I think it's fair, honest, insightful, and (best of all) focuses on experiences, challenges, and ideas about overcoming challenges instead of theory, marketing fluff, and excessive exaggeration.  :)

Of course, I'm always happy when someone is willing to publish quotes of mine like the following excerpts from Why Agile should not marginalize software testers:

"SSQ: You come in frequently to integrate testing into Agile development. What kind of problems do you see organizations having when integrating testing?

Scott Barber: The first thing that I hear about is, ‘What do we need testers for if we’re doing Agile? Isn’t everyone in Agile a generalist?’

Thursday, September 29, 2011

Making Every Test Count

This is from a while back, but I wouldn't call it dated.  It's a webinar, it runs for 48 min.  I like it, for whatever that's worth.  ;)

Abstract:

Do you ever find yourself wondering what the point is of executing this test... again!?!  Have you ever felt like the purpose of a test is to ensure there is a check mark in a particular check box?  Are you ever asked to get *more* information in even less time with even fewer resources than the last test project you worked on?

In this presentation, Scott Barber will introduce you to a variety of tips and techniques you can apply to virtually any testing you do as you strive to make every test you execute add value to the project.



Tuesday, September 20, 2011

Candidate Statement for CMG Director

I've been nominated as a director candidate for the CMG. My candidate statement is posted below because my views related to CMG mirror my views for application performance in organizations and the industry as a whole and I believe that is (or, at least, I hope it is) interesting to anyone involved or concerned with challenges related to application performance now and in the future.

If you are a CMG member, I encourage you to review all of the candidate statements and to vote your conscience here.
Remember, if you don't vote, you have no right to complain. ;)

Statement of Willingness to Serve:
I am willing and would consider it an honor to serve as a director for CMG if elected.

Professional Work Experience:
In my nearly 20 years of experience working in software and technology, I have performed the duties associated with virtually all of the commonly thought of roles; from analyst to project management, configuration management to IT support, and developer to CIO. These many experiences coalesced shortly after Y2K into a career focused on helping organizations improve software system performance to enhance user experience and enable smooth growth while avoiding speed, stability, and scalability catastrophes in a fiscally responsible manner.

Friday, September 2, 2011

Thoughts on Agile & Agile Testing

This past weekend, I finally made time to start reading Agile Testing: A Practical Guide For Testers And Agile Teams, Lisa Crispin & Janet Gregory, Addison-Wesley (2009).  I made it through the first two chapters before life called me away.  After I put the book down and started going about a mundane series of errands, I realized that I was feeling disappointed, and that the disappointment had started growing just a few pages into the book.  Not because of what the book had to say – what it said was pretty good.  Not exactly how I would have expressed a few things, but such is the plight of a writer reading what someone else has written on a topic they also care and write about.  What was disappointing me was the fact that the stuff in those chapters needed to be said at all.

You see, as Lisa and Janet were describing what Agile Testing and Testing on Agile Teams was all about, and explaining how it is “different” than “traditional testing”, my first thought was:

Tuesday, August 23, 2011

STP Online Summit: Achieving Business Value with Test Automation

Due to the overwhelming success and positive reviews of the last STP Online Summit: Business Value of Performance Testing, we've decided to do it again -- only this time, we're going to explore Achieving Business Value with Test Automation.

Join me (while I continue practicing my radio host skills for my emergency back-up career as a sportscaster) and 7 other presenters that I consider to be elite practitioners, teachers, and thinkers in their test automation areas of specialization for 3 half days online to learn their tips and methods for achieving business value with test automation. If you or your organization are using, or thinking about using, automation to enhance or improve your testing, you're not going to want to miss this online summit. I honestly can't think of anywhere else you can get this concentration of relevant and thematically targeted information at a better price, but you be the judge:

When: Tuesday October 11 10:00AM - Thursday October 13 1:30PM PST

Cost: $195 USD before 9/26/11 $245 USD after 9/26/11

Theme: For more than 15 years organizations have been investing in the promise of better, cheaper, and faster testing through automation. While some companies have achieved demonstrable business value from their forays into test automation, many others have experienced questionable to negative returns on their investments. Join your host, Scott Barber, for this three day online summit, to hear how seven recognized leaders in test automation have achieved real business value by implementing a variety of automation flavors and styles for their employers and clients. Learn how to answer the ROI question by focusing on business value instead of testing tasks, and how to implement automation in ways that deliver that value to the business, not just to the development and/or test team.



Thursday, August 4, 2011

Scott Barber Interviewed by Matt Heusser; Podcast

Two part podcast on the STP site. I say some interesting stuff... or at least I say some stuff that's interesting to me. :)

Twist #52 - With Scott Barber

Twist #53 - The Return of the Barber
 

Monday, August 1, 2011

Performance Testing Practice Named During Online Summit

Last week, I hosted STP's Online Performance Summit, a 3 half-day, 9 session, live, interactive webinar. As far as I know, this was the first multi-presenter, multi-day, live webinar by testers for testers. The feedback from attendees and presenters that I have seen has all been very positive, and personally, I think it went very well. On top of that, I had a whole lot of fun playing "radio talk show host".

The event sold out early at 100 attendees, leaving more folks wanting to attend than we could seat. Since this was an experiment of sorts in terms of format and delivery, we committed to the smallest and least expensive level of service from the webinar technology provider, and by the time we realized we had more interest than "seats", it was simply too late to make the necessary service changes to accommodate more folks. We won't make that mistake again for our next online summit, to be held October 11-13 on the topic of "Achieving Business Value with Test Automation". Keep your eyes on the STP website for more information about that and other future summits.

With all of that context, now to the point of this post. During Eric Proegler's session (Strategies for Performance Testing Integrated Sub-Systems), a conversation emerged in which it became apparent that many performance testers conduct some kind of testing that involves real users interacting with the system under test while a performance/load/stress test was running for the purposes of:
  • Linking the numbers generated through performance tests to the degree of satisfaction of actual human users.
  • Identifying items that human users classify as performance issues that do not appear to be issues based on the numbers alone.
  • Convincing stakeholders that the only metric we can collect that can be conclusively linked to user satisfaction with production performance is the percent of users satisfied with performance during production conditions.
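One common way to formalize that "percent of users satisfied" figure is an Apdex-style score (my illustration; neither the summit conversation nor the named practice prescribes a particular formula, and the 2-second threshold is an assumption):

```python
def satisfaction_score(response_times, threshold=2.0):
    """Apdex-style score from raw response times (seconds).
    Satisfied: <= threshold (full credit); tolerating: <= 4*threshold
    (half credit); frustrated: the rest (no credit). One common way to
    turn timings into a 'percent of users satisfied' figure."""
    satisfied = sum(1 for t in response_times if t <= threshold)
    tolerating = sum(1 for t in response_times if threshold < t <= 4 * threshold)
    return (satisfied + tolerating / 2) / len(response_times)

# 0.8 and 1.5 satisfy, 3.0 tolerates, 9.0 frustrates -> 0.625
print(satisfaction_score([0.8, 1.5, 3.0, 9.0], threshold=2.0))
```

The value of pairing this with real users in the loop, as described above, is that the threshold itself can be calibrated against what actual humans report rather than guessed.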
The next thing that became apparent was that everyone who engaged in the conversation called this something different. So we didn't do what one would justifiably expect a bunch of testers to do (i.e., have an ugly argument about whose term came first and which is more correct that continues until no decision is made and all goodwill is lost). Instead, we held a contest to name the practice. We invited the speakers and attendees to submit their ideas, from which we'd select a name for the practice. The stakes were that the submitter of the winning entry would receive a signed copy of Jerry Weinberg's book Perfect Software, and that the speakers and attendees would use and promote the term.

The speakers and attendees submitted nearly 50 ideas. The speakers voted that list down to their top 4, and then the attendees voted for their favorite. In a very close vote, the winning submission from Philip Nguyen was User Experience Under Load (congratulations Philip!).

Friday, July 29, 2011

Google Page Speed Service – The death of the Web Performance Optimization consultant?

Fred Beringer of SOASTA posed that question on his blog yesterday.

An interesting question, so being a tester, what did I do? Right, I tested it. It took all of one test for me to come to my conclusion...

NOT WITH RESULTS LIKE THIS!!

Google Page Speed Service Test 