Tuesday, October 18, 2011

Please, no new "certifications"

I just saw an advertisement on LinkedIn for "Building a Certification Testing Program - Cutting through the hype to see how it really works," and I couldn't stop myself from adding the following comment:
Please make it stop. We don't need more "certification" programs -- not unless you are going to be the first organization that allows itself to be held legally and financially accountable when people you "certify" can't do what you "certified" they can.

Otherwise, conduct all the training you want. Assess student performance if you want. Only "pass" students who "pass" the assessment if you want.

Just do us all a favor and *STOP* calling it certification until you are willing to do things like:
  • reimburse hiring expenses to employers who hire folks you certified as being able to X who can't X
  • implement periodic re-assessment to enforce some bar of continued knowledge/skill/ability over time
  • implement some way to revoke certifications of folks who fail to demonstrate knowledge/skill/ability in the workforce
The list goes on, but I know it's pointless. The certification machine will continue no matter how loudly or how often I point out the ways in which it is frequently (at least arguably) unethical and fraudulent -- at least in "testerland."
Seriously, this drives me insane.  Others can take stands about content, assessment methods, etc. -- I have my opinions on those things, but honestly that part of the topic bores me.  People decide what university to attend, what to major in, what electives to take, etc. for their degree programs; they can likewise decide whether or not the content of some professional development program (with or without a "certification") is worth their effort.  What I want to see is "certifying bodies" being held accountable for complying with the claims they make about the individuals they "certify."

I mean, seriously, have any of you seen any data that you'd consider statistically significant, empirical (vs. anecdotal), and free enough from obvious experimental design flaws to support the claims we see from "certifying bodies"?  If you have, please share the data with me and I'll list it here -- unless, of course, it's flawed, in which case I'd be happy to point out how and why the data doesn't support the conclusion.

Otherwise, please, please, please don't engage in creating more of these things.  Please.

--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Monday, October 17, 2011

Having lunch with a giant...

I "officially" started my career in software performance in February of 2000, only to realize much later that I'd started down that path years prior.  In the fall of 2001 (10 years ago), I felt I was stagnating in my self-guided education and went on a hunt for books, articles, training, and/or people to learn from.  I found some peers (and eventually co-founded WOPR with Ross Collard to maximize peer learning), and I found three "giants" on whose shoulders I've stood ever since (meaning all of my work was and has remained consistent with, complementary to, and/or an extension of their work in the field).  Those "giants" are Connie Smith, Ph.D. (Software Performance Engineering), Daniel Menasce, Ph.D. (Capacity and Scalability Planning), and Alberto Savoia (Performance Testing).

Last fall, I had the honor of being on a panel with Connie and spending some time talking to Daniel during the CMG conference in Orlando.  I'd never spoken or corresponded with them before that, but it was nice to meet them and we had some great conversations.

Over the years, however, I have corresponded with Alberto Savoia.  As it turns out, he was moving on from software performance to what he would now call his next "it" just as I was becoming known in the industry, so we didn't converse regularly, but we did follow each other's careers.  During that time, I drew a lot of inspiration from Alberto -- not just from the work he'd done in the software performance space, but also from his other accomplishments in technology, from the kind and complimentary recommendations he gave me, and from his graciously agreeing to write the foreword for Performance Testing Guidance for Web Applications when I asked.

So earlier this year when I had the chance, I dropped everything to review and comment on his new "it", Pretotyping. He said the review was helpful and that some of what I'd commented on would be included in the next version.

Today, I finally met Alberto face to face.  We had lunch.  We talked about projects & passions old and new; we recalled history and speculated about the future.  He gave me a signed copy of Pretotype It, and I gave him a signed copy of Web Load Testing for Dummies, both of which had been prepared in advance.  And while Alberto has accomplished far more in his technology career than I have, somehow I didn't feel like I was having lunch with the "giant" on whose shoulders most of the work I'm known for stands; I felt like I was having lunch with an old friend whom I hadn't seen in too long.

To some of you, I suspect this seems a silly thing to make a big deal about.  But for a guy who left a small town twenty-some-odd years ago, never imagining he'd meet anyone "famous" -- let alone become a "celebrity" of sorts in my (admittedly very small) field -- it means a lot to me that someone I've often credited as a luminary would not only take the time to have lunch with me, but also share thoughts and ideas with me like friends do.

So, thanks Alberto.  Thanks for the years of inspiration & thanks for the confirmation of friendship.  It means a lot to me, and know that you've provided me with another example I intend to follow with anyone I may inspire during my career and later have the opportunity to meet.



Thursday, October 13, 2011

Top 10 Automation Tips from STP Online Summit

I had the pleasure of hosting the second Online Summit delivered by Software Test Professionals: Achieving Business Value With Test Automation.  The online summit format consists of three sessions on each of three consecutive days.  The sessions for this summit were:
One of my duties as host was to try to summarize the most valuable nuggets of information from across all of the presentations into a "top tips" list.  This is what I came up with:

Scott's Top 10 Automation Tips from:


Wednesday, October 5, 2011

Web Load Testing for Dummies: Book Announcement



"More so now than ever before, your company’s website and web applications are critical to the success of your business initiatives. Think of all the business generated or sustained via the World Wide Web today compared to any other time in history — in today’s digital culture, a business with any sort of crucial web presence needs to make sure that its website is working hard for the business and not against it. That’s what web load testing is all about.

"Key to success on the web is customer experience, which means that web application performance is a priority. Not convinced? Spend a few moments thinking about the impact to your business (in other words, think about how angry the CEO and/or investors would be) if:
  ✓ Your new application launch is delayed due to performance problems
  ✓ Your site breaks under the load of your successful marketing promotion
  ✓ High-traffic volume causes such poor web performance on your busiest online shopping day that abandonment skyrockets and conversions plummet
  ✓ Your new infrastructure is configured improperly, grinding the website to a crawl

"Managers and executives of organizations that derive significant portions of their revenue from web applications realize that they need to focus more on protecting revenue, reducing risk, and ensuring that customers have great experiences. They see how web applications that perform well on release day and throughout their production lives strengthen the company’s brand and reputation, creating customer loyalty. In other words, web load testing is a critical component to any risk management plan for web applications."

Get the eBook version free here.


Tuesday, October 4, 2011

Is Junosphere the world's first cloud testing environment? Not really - IT in Context

I don't get too irked by companies coining new phrases to make subtle marketing distinctions in their services, but when they do it so they can make first/best claims, it flips my bozo bit. Seriously, if your service is so bland or weak that you need to invent a new term so you can claim it's "the first/best thing called blah" without being called out for fraud, maybe you should just improve your service.
