This is where Scott Barber shares his thoughts, opinions, ideas and endorsements related to software testing in general, performance testing in particular, and improving the alignment of software development projects with business goals and risks.
Tuesday, November 29, 2011
Scott Barber's Top 10 Things About Testing That Should Die
I've taken some heat for discussing the whole "is testing dead" concept due to a feeling that I was validating the notion that testing is unnecessary. Allow me to clarify my position. I do not believe, for one heartbeat, that testing as an activity is in any way unnecessary. I do believe that there are things related to the current state of, and common beliefs about, testing that should die. With that said...
10. Egocentricity
Face reality, testers: neither the product nor the business revolves around you. Think about it. No business, no product, no developers => no need for testers. In fact, if developers wrote perfect code, there'd be no need for you. You are a service provider, and your primary clients are the managers, developers, and/or executives. Your secondary clients are product users and investors. So stop whining and stomping your feet when your clients don't make decisions you like with the information you provide. It's not your call. If you want it to be your call, get on track to become a project manager, product manager, or executive; otherwise, get right with the fact that you provide a service (hopefully a valuable one) and get back to providing it.
Tuesday, November 8, 2011
On the Alleged Death of Testing
Out of respect for your time, I'll give you the bottom line up front, as the simulated interview I privately hoped for but that never came. After the mock interview is a supporting narrative for those of you more interested in my thinking on the matter.
Q: There's been a lot of talk recently about testing being dead, so my first question: is testing dead?
A: No.
Q: Some of those talking about the alleged death of testing are saying that it's not that testing as a whole is dead, but that testing as it is commonly understood today is dead. Is it?
A: No.
Q: Ok, so is testing as it is commonly understood today dying?
A: Not that I can see.
Q: Then why all the talk about testing "as we know it" being dead?
A: IMHO? Wishful thinking.
Thursday, October 27, 2011
WOPR 17, my takeaways
The Workshop
On Performance and Reliability (WOPR) 17 was held Oct 20-22, 2011 on the theme of "Finding Bottlenecks". This was a historic event in the sense that no other peer workshop inspired by LAWST has convened this many times. Of course, as a co-founder of WOPR, I'm (somewhat unreasonably) proud of this accomplishment, but the fact that over the last 9 years so many folks have been so inspired by the community and value of WOPR that they have been willing to volunteer their time to plan and organize these events, that their companies have been willing to donate meeting space (and often food & goodies), and that participants have frequently been willing to pay their own way (sometimes taking vacation time) to attend makes 17 events, one every 6 months, since WOPR 1 a significant achievement – whether or not my "founder's pride" is justified. :)
As is the tradition of WOPR, 20-25 folks, selected or invited by the "content owner" (a.k.a. the person or team who chose the theme to be explored this time), brought their personal experiences related to "Finding Bottlenecks" to share and explore with one another. Also as is the tradition, certain patterns and commonalities emerged as these experiences were described and discussed. Everyone has their own take, there are no official findings, and I'm not even going to pretend that I can attribute all of the contributing experiences and/or conversations to my takeaways below.
- Finding bottlenecks can be technically challenging; examples include:
- Analyzing the test & the data is far from straightforward
- The "most useful" tools to narrow down the bottleneck may not be available – forcing us to be technically "creative" to work around those roadblocks.
- Finding bottlenecks can be *very* socio-politically challenging; examples include:
- Lack of Trust (e.g. “That’s not a bottleneck, that’s the tool!”)
- Denial (e.g. “It’s not possible that’s related to my code!”)
- Lack of cross-team collaboration (e.g. “No, you can’t install that monitor on *our* system!”)
- Sometimes human bottlenecks need to be resolved before technical bottlenecks can be found. (e.g. Perf Team being redirected, resources being re-allocated, excessive micromanagement, etc.)
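The point above about test data being far from straightforward to analyze can be illustrated with a minimal sketch. One common first step is to compare the median against the 95th percentile of response times per tier: a tier whose p95 dwarfs its median is suffering intermittent stalls and is a good place to start the bottleneck hunt. The tier names and numbers below are invented for illustration; real data would come from your load tool's logs or APM traces.

```python
from statistics import median, quantiles

# Hypothetical per-tier response-time samples (seconds) from a load test run.
samples = {
    "web": [0.02, 0.03, 0.02, 0.04, 0.03, 0.05, 0.02, 0.03],
    "app": [0.10, 0.12, 0.11, 0.95, 0.10, 0.13, 1.10, 0.12],
    "db":  [0.05, 0.06, 0.05, 0.07, 0.06, 0.05, 0.06, 0.07],
}

def summarize(times):
    """Return (median, p95); a large gap between them hints at intermittent stalls."""
    p95 = quantiles(times, n=20)[-1]  # last of 19 cut points = 95th percentile
    return round(median(times), 3), round(p95, 3)

for tier, times in samples.items():
    med, p95 = summarize(times)
    print(f"{tier}: median={med}s p95={p95}s")
```

With these made-up numbers, the "app" tier's p95 is many times its median while "web" and "db" are flat, so that's where the investigation would start – though, as the workshop experiences showed, convincing the app team of that is often the harder problem.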
Some other relevant and interesting topics came up (such as the frequent discrepancy between tester/technical goals & business goals), but since these weren't "on theme" we didn't discuss them deeply enough for me to draw any conclusions other than "the points and positions that did come up were consistent with what I would have anticipated if I'd thought about it in advance", which, for me, is a nice confirmation.
My point in sharing these thoughts on finding bottlenecks is so that all the folks out there who feel like theirs is the only organization that is thwarted by socio-political challenges even more than technical ones can realize that they really aren't alone.
The findings of WOPR17 are the result of the collective effort of the workshop participants: AJ Alhait, Scott Barber, Goranka Bjedov, Jeremy Brown, Dan Downing, Craig Fuget, Dawn Haynes, Doug Hoffman, Paul Holland, Pam Holt, Ed King, Richard Leeke, Yury Makedonov, Emily Maslyn, Greg McNelly, John Meza, Blaine Morgan, Mimi Niemiller, Eric Proegler, Raymond Rivest, Bob Sklar, Roland Stens, and Nishi Uppal.
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me
Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing
"If you can see it in your mind...
you will find it in your life."
Monday, October 24, 2011
Best Ice Cream Practice
A Twitter conversation from Friday, Oct 21...
@TesterAB Anna Baik: What's best practice for icecream? I don't know what flavour of icecream I should eat, and I'm afraid to get it wrong.
- @skillinen Sylvia Killinen: @TesterAB Test ALL the ice cream, that way you'll know which one best satisfies conformance. :)
- @TesterAB Anna Baik: @skillinen I like this idea very much :)
- @adampknight Adam Knight: @TesterAB it's vanilla, if you're not eating vanilla you are doing it wrong. I'd suggest getting yourself CVM certified as soon as you can
- @adampknight Adam Knight: @sbarber @TesterAB we should be specific. I'll clarify in my "10 ways to check if you are truly vanilla" blog post #BestIceCreamPractice
- @sbarber Scott Barber: @adampknight @TesterAB Certified Valuation Manager ™? No, no, no, that's only appropriate for *children's* icecream! #BestIceCreamPractice
- @testingqa Guy Mason: @TesterAB Best to go for that which you most prefer at that point in time?
- @TesterAB Anna Baik: @testingqa No no no. There must be one flavour of icecream that is best for everybody to eat at all points in time.
- @sbarber Scott Barber: @TesterAB @testingqa ROTFL!! That would be the #BestIceCreamPractice, right?
- @TesterAB Anna Baik: @sbarber @testingqa Yes! None of this wishy-washy nonsense, I only want to eat the BEST flavour of icecream. #BestIceCreamPractice
- @sbarber Scott Barber: @TesterAB @testingqa So chocolate, pistachio, lemon sorbet, raspberry swirl, topped with caramel & orange soda, right? #BestIceCreamPractice
- @TesterAB Anna Baik: @sbarber @testingqa #BestIceCreamPractice Finally, someone who'll give me an answer! ...wait. How do I know you're qualified?
- @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice I founded a non-profit to establish BICP qualification stds and issued myself a certification.
- @TesterAB Anna Baik: @sbarber @testingqa #BestIceCreamPractice Sounds reassuring, I knew there'd be an Official Body somewhere to tell me what icecream to eat
- @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice The invoice for my services are in the mail. $400/hr + $25,000 for the BICP flavor report.
- @can_test Paul Carvalho: @sbarber @TesterAB @testingqa do you have a template for that report? ;)
- @sbarber Scott Barber: @can_test @TesterAB @testingqa Oh yes, but it's only for members. Membership is a modest $2,500/year #BestIceCreamPractice
- @TesterAB Anna Baik: @sbarber @testingqa Eeek! Don't I even get something to show to people to prove I now know #BestIceCreamPractice?
- @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice when check clears we mail you a Certified BICP Practitioner Certificate (suitable for framing)
- @testalways Eusebiu Blindu: @sbarber @adampknight @TesterAB I am thinking to create some "opensource" certificate, free to use in CV, with listings in some website
- @sbarber Scott Barber: @testalways @adampknight @TesterAB #BestIceCreamPractice no worries, I have mtgs w/ governments to mandate BICP certs for icecream buyers
Tuesday, October 18, 2011
Please, no new "certifications"
I just saw an advertisement on LinkedIn for "Building a Certification Testing Program – Cutting through the hype to see how it really works", and I couldn't stop myself from adding the following comment:
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me
Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing
"If you can see it in your mind...
you will find it in your life."
Please make it stop. We don't need more "certification" programs -- not unless you are going to be the first organization that allows itself to be held legally and financially accountable when people you "certify" can't do what you "certified" they can.
Seriously, this drives me insane. Others can take stands about content, assessment methods, etc. -- I have my opinions on those things, but honestly that part of the topic bores me. People decide what university to attend, what to major in, what electives to take, etc. for their degree programs ... they can decide whether or not the content of some professional development program (with or without a "certification" component) is worth their effort. What I want to see is "certifying bodies" being held accountable for complying with the claims they make about the individuals they "certify."
Otherwise, conduct all the training you want. Assess student performance if you want. Only "pass" students who "pass" the assessment if you want.
Just do us all a favor and *STOP* calling it certification until you are willing to do things like:
- reimburse hiring expenses to employers who hire folks you certified as being able to X who can't X
- implement periodic re-assessment to enforce some bar of continued knowledge/skill/ability over time
- implement some way to revoke the certifications of folks who fail to demonstrate that knowledge/skill/ability in the workforce
The list goes on, but I know it's pointless. The certification machine will continue no matter how loudly or how frequently I point out the ways in which it is frequently (at least arguably) unethical and fraudulent -- at least in "testerland."
I mean, seriously, have any of you seen any data that you'd consider either statistically significant, empirical (vs. anecdotal), or free enough from obvious experimental design flaws to support the claims we see from "certifying bodies"? If you have, please share the data with me and I'll list it in line -- unless of course, it's flawed, in which case, I'd be happy to point out how and why the data doesn't support the conclusion.
Otherwise, please, please, please don't engage in creating more of these things. Please.