
Tuesday, October 18, 2011

Please, no new "certifications"

I just saw an advertisement on LinkedIn for "Building a Certification Testing Program - Cutting through the hype to see how it really works," and I couldn't stop myself from adding the following comment:
Please make it stop. We don't need more "certification" programs -- not unless you are going to be the first organization that allows itself to be held legally and financially accountable when people you "certify" can't do what you "certified" they can.

Otherwise, conduct all the training you want. Assess student performance if you want. Only "pass" students who "pass" the assessment if you want.

Just do us all a favor and *STOP* calling it certification until you are willing to do things like:
  • reimburse hiring expenses to employers who hire folks you certified as being able to X who can't X
  • implement periodic re-assessment to enforce some bar of continued knowledge/skill/ability over time
  • implement some way to revoke certifications of folks who fail to demonstrate knowledge/skill/ability in the workforce
The list goes on, but I know it's pointless. The certification machine will continue no matter how loudly or how frequently I point out the ways in which it is often (at least arguably) unethical and fraudulent -- at least in "testerland."
Seriously, this drives me insane.  Others can take stands about content, assessment methods, etc. -- I have my opinions on those things, but honestly that part of the topic bores me.  People decide what university to attend, what to major in, what electives to take, etc. for their degree programs ... they can decide whether or not the content of some professional development program (with or without a "certification") is worth their effort.  What I want to see is "certifying bodies" being held accountable for the claims they make about the individuals they "certify."

I mean, seriously, have any of you seen any data that you'd consider either statistically significant, empirical (vs. anecdotal), or free enough from obvious experimental design flaws to support the claims we see from "certifying bodies"?  If you have, please share the data with me and I'll list it here -- unless, of course, it's flawed, in which case I'd be happy to point out how and why the data doesn't support the conclusion.

Otherwise, please, please, please don't engage in creating more of these things.  Please.

--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Thursday, September 29, 2011

Making Every Test Count

This is from a while back, but I wouldn't call it dated.  It's a webinar; it runs for 48 minutes.  I like it, for whatever that's worth.  ;)

Abstract:

Do you ever find yourself wondering what the point is of executing this test... again!?!  Have you ever felt like the purpose of a test is to ensure there is a check mark in a particular check box?  Are you ever asked to get *more* information in even less time with even fewer resources than the last test project you worked on?

In this presentation, Scott Barber will introduce you to a variety of tips and techniques you can apply to virtually any testing you do as you strive to make every test you execute add value to the project.


--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Wednesday, September 23, 2009

Testing vs. Checking ... my 2 cents.

I was pleased to see Michael Bolton's series on Testing vs. Checking. If you haven't been following, here is what I consider to be the central thread of the topic (and the unfortunately inevitable fallout that seems to happen in "testerland" almost any time someone says something that makes sense):
From Michael:
From James Bach:
From Scott Barber:

Saturday, June 28, 2008

Testing Lessons From Civil Engineering

Below is the paper I submitted as a prologue to an experience report, discussion, and (hopefully) additional research that I'm presenting for the first time during CAST08:

Engineers don’t look at the world the same way that testers do.  Engineers look at the world with an eye to solving problems.  Testers look at the world with an eye toward finding problems to solve.  This seems logical.  What is less logical is the fact that engineers, and I’m talking about the kind of engineers who deal with physical objects, seem to be much more sophisticated in their testing than testers.  In fact, most of what I know about testing, I learned as a civil engineering student.  We didn’t call most of it testing.  We didn’t even identify it as anything other than “You really want to get this right.”  Maybe civil engineers test better than software testers because of the motivation to “get it right.”  Consider what happens when a piece of civil engineering, like a bridge, fails: