Showing posts with label Best Practices.

Tuesday, November 29, 2011

10 Things About Testing That Should Die

I've taken some heat for discussing the whole "is testing dead" concept, due to a feeling that I was validating the idea that testing is unnecessary. Allow me to clarify my position. I do not believe, for one heartbeat, that testing as an activity is in any way unnecessary. I do believe that there are things related to the current state of, and common beliefs about, testing that should die. With that said...

Scott Barber's Top 10 Things About Testing That Should Die: 

10. Egocentricity

Face reality, testers: neither the product nor the business revolves around you. Think about it. No business, no product, no developers => no need for testers. In fact, if developers wrote perfect code, there'd be no need for you. You are a service provider, and your primary clients are the managers, developers, and/or executives. Your secondary clients are product users and investors. So stop whining and stomping your feet when your clients use the information you provide to make decisions you don't like. It's not your call. If you want it to be your call, get on track to become a project manager, product manager, or executive; otherwise, get right with the fact that you provide a service (hopefully a valuable one) and get back to providing it.

Tuesday, November 8, 2011

On the Alleged Death of Testing

Out of respect for your time, I'll give you the bottom line up front, in the form of a simulated version of the interview I privately hoped for but that never came. After the mock interview is a supporting narrative for those of you more interested in my thinking on the matter.

Q: There's been a lot of talk recently about testing being dead, so my first question is: is testing dead?
A: No.

Q: Some of those talking about the alleged death of testing are saying that it's not that testing as a whole is dead, but that testing as it is commonly understood today is dead. Is it?
A: No.

Q: Ok, so is testing as it is commonly understood today dying?
A: Not that I can see.

Q: Then why all the talk about testing "as we know it" being dead?
A: IMHO? Wishful thinking.

Monday, October 24, 2011

Best Ice Cream Practice

A Twitter conversation from Friday, Oct 21...

@TesterAB Anna Baik: What's best practice for icecream? I don't know what flavour of icecream I should eat, and I'm afraid to get it wrong.
  • @skillinen Sylvia Killinen: @TesterAB Test ALL the ice cream, that way you'll know which one best satisfies conformance. :)
  • @adampknight Adam Knight: @TesterAB it's vanilla, if you're not eating vanilla you are doing it wrong. I'd suggest getting yourself CVM certified as soon as you can
    • @adampknight Adam Knight: @sbarber @TesterAB we should be specific. I'll clarify in my "10 ways to check if you are truly vanilla" blog post #BestIceCreamPractice
    • @sbarber Scott Barber: @adampknight @TesterAB Certified Valuation Manager ™? No, no, no, that's only appropriate for *children's* icecream! #BestIceCreamPractice
  • @testingqa Guy Mason: @TesterAB Best to go for that which you most prefer at that point in time?
    • @TesterAB Anna Baik: @testingqa No no no. There must be one flavour of icecream that is best for everybody to eat at all points in time.
    • @TesterAB Anna Baik: @sbarber @testingqa Yes! None of this wishy-washy nonsense, I only want to eat the BEST flavour of icecream. #BestIceCreamPractice
    • @sbarber Scott Barber: @TesterAB @testingqa So chocolate, pistachio, lemon sorbet, raspberry swirl, topped with caramel & orange soda, right? #BestIceCreamPractice
    • @TesterAB Anna Baik: @sbarber @testingqa #BestIceCreamPractice Finally, someone who'll give me an answer! ...wait. How do I know you're qualified?
    • @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice I founded a non-profit to establish BICP qualification stds and issued myself a certification.
    • @TesterAB Anna Baik: @sbarber @testingqa #BestIceCreamPractice Sounds reassuring, I knew there'd be an Official Body somewhere to tell me what icecream to eat
    • @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice The invoice for my services is in the mail. $400/hr + $25,000 for the BICP flavor report.
    • @TesterAB Anna Baik: @sbarber @testingqa Eeek! Don't I even get something to show to people to prove I now know #BestIceCreamPractice?
    • @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice when check clears we mail you a Certified BICP Practitioner Certificate (suitable for framing)
Questions?  No? Didn't think so.  :)

 

Monday, August 1, 2011

Performance Testing Practice Named During Online Summit

Last week, I hosted STP's Online Performance Summit: three half-days, nine sessions, all live, interactive webinars. As far as I know, this was the first multi-presenter, multi-day, live webinar event by testers, for testers. The feedback from attendees and presenters that I have seen has all been very positive, and personally, I think it went very well. On top of that, I had a whole lot of fun playing "radio talk show host".

The event sold out early at 100 attendees, and more folks wanted to attend but were unable to. Since this was an experiment of sorts in terms of format and delivery, we committed to the smallest and least expensive level of service from the webinar technology provider, and by the time we realized we had more interest than "seats", it was simply too late to make the necessary service changes to accommodate more folks. We won't be making that mistake again for our next online summit, to be held October 11-13 on the topic of "Achieving Business Value with Test Automation". Keep your eyes on the STP website for more information about that and other future summits.

With all of that context, now to the point of this post. During Eric Proegler's session (Strategies for Performance Testing Integrated Sub-Systems), a conversation emerged in which it became apparent that many performance testers conduct some kind of testing that involves real users interacting with the system under test while a performance/load/stress test is running (a rough sketch of such a combined run appears after the list below), for the purposes of:
  • Linking the numbers generated through performance tests to the degree of satisfaction of actual human users.
  • Identifying items that human users classify as performance issues that do not appear to be issues based on the numbers alone.
  • Convincing stakeholders that the only metric we can collect that can be conclusively linked to user satisfaction with production performance is the percent of users satisfied with performance during production conditions.
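As an illustration only (nothing below comes from Eric's session; the URL, load level, run length, and the console prompt standing in for a real satisfaction survey are all hypothetical placeholders), here is a minimal Python sketch of how such a combined run might be orchestrated: synthetic load runs in background threads while real users work with the application and record whether each task's performance felt acceptable, and at the end the percent-satisfied figure is reported next to the measured response times so the two can be compared.

import threading
import time
import urllib.request
from statistics import mean

TARGET_URL = "http://system-under-test.example.com/"  # hypothetical placeholder
LOAD_USERS = 25      # synthetic virtual users (assumption; tune per test)
RUN_SECONDS = 300    # length of the combined run (assumption)

response_times = []  # seconds, from the synthetic load
lock = threading.Lock()
stop_flag = threading.Event()

def synthetic_user():
    """One virtual user: request the target URL in a loop, recording response times."""
    while not stop_flag.is_set():
        start = time.time()
        try:
            urllib.request.urlopen(TARGET_URL, timeout=30).read()
        except Exception:
            pass  # a real harness would count errors separately
        with lock:
            response_times.append(time.time() - start)

def run_load_in_background():
    """Start the synthetic load so real users can work while it runs."""
    for _ in range(LOAD_USERS):
        threading.Thread(target=synthetic_user, daemon=True).start()

def collect_human_ratings():
    """Record satisfied / not-satisfied verdicts from real users during the run.
    In practice this would be a survey tool or an observation checklist; a
    console prompt stands in for it here."""
    ratings = []
    end = time.time() + RUN_SECONDS
    while time.time() < end:
        answer = input("Was that task's performance acceptable? [y/n/q] ").strip().lower()
        if answer == "q":
            break
        ratings.append(answer == "y")
    return ratings

if __name__ == "__main__":
    run_load_in_background()
    ratings = collect_human_ratings()
    stop_flag.set()
    satisfied = 100.0 * sum(ratings) / len(ratings) if ratings else 0.0
    print(f"Percent of users satisfied under load: {satisfied:.0f}%")
    if response_times:
        print(f"Synthetic mean response time: {mean(response_times):.2f}s "
              f"over {len(response_times)} requests")

In a real engagement the ratings would come from a survey or a facilitator's checklist rather than a console prompt, but the shape is the same: one clock, two data streams (human verdicts and machine-collected numbers), compared after the run.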
The next thing that became apparent was that everyone who engaged in the conversation called this something different. So we didn't do what one might justifiably expect of a bunch of testers (i.e., have an ugly argument about whose term came first and which is more correct, one that continues until no decision is made and all goodwill is lost). Instead, we held a contest to name the practice. We invited the speakers and attendees to submit their ideas, from which we'd select a name for the practice. The stakes: the submitter of the winning entry would receive a signed copy of Jerry Weinberg's book Perfect Software, and the speakers and attendees would use and promote the term.

The speakers and attendees submitted nearly 50 ideas. The speakers voted that list down to their top 4, and then the attendees voted for their favorite. In a very close vote, the winning submission from Philip Nguyen was User Experience Under Load (congratulations Philip!).

Wednesday, September 23, 2009

Testing vs. Checking ... my 2 cents.

I was pleased to see Michael Bolton's series on Testing vs. Checking. If you haven't been following, here is what I consider to be the central thread of the topic (and the unfortunately inevitable fallout that seems to happen in "testerland" almost any time someone says something that makes sense):
From Michael:
From James Bach:
From Scott Barber:

Saturday, January 3, 2009

A misleading benchmark...

No further commentary needed.

[Dilbert comic: Dilbert.com]

Saturday, June 28, 2008

Testing Lessons From Civil Engineering

Below is the paper I submitted as a prologue to an experience report, discussion, and (hopefully) additional research that I'm presenting for the first time during CAST08:

Engineers don't look at the world the same way that testers do. Engineers look at the world with an eye to solving problems. Testers look at the world with an eye toward finding problems to solve. This seems logical. What is less logical is the fact that engineers, and I'm talking about the kind of engineers who deal with physical objects, seem to be much more sophisticated in their testing than testers are. In fact, most of what I know about testing, I learned as a civil engineering student. We didn't call most of it testing. We didn't even identify it as anything other than "You really want to get this right." Maybe Civil Engineers test better than software testers because of the motivation to "get it right". Consider what happens when a piece of Civil Engineering, like a bridge, fails:

Monday, November 20, 2006

What Best Practices really are. -- CIO Article

Of all the places I might have expected to find an article supporting the idea that Best Practices is nothing more than a square on someone's buzzword-bingo card, CIO wasn't one of them. The highlights are these...
Using celebs for endorsements has become such best practice that everyone does it. So what is best practice about it? Nothing. The phrase is simply a demonstration of how cliched business language dresses up the concept of copying something someone else has done. And when lots of companies copy the copier, it becomes dull, intellectually stagnant and offers no competitive advantage. It's just a me-too strategy executed by the cynical, the lazy, or the lazy cynics.