Tuesday, September 20, 2011

Candidate Statement for CMG Director

I've been nominated as a director candidate for CMG. My candidate statement is posted below because my views on CMG mirror my views on application performance in organizations and across the industry as a whole, and I believe (or at least hope) that makes it interesting to anyone involved with or concerned about challenges related to application performance now and in the future.

If you are a CMG member, I encourage you to review all of the candidate statements and to vote your conscience here.
Remember, if you don't vote, you have no right to complain. ;)

Statement of Willingness to Serve:
I am willing and would consider it an honor to serve as a director for CMG if elected.

Professional Work Experience:
In my nearly 20 years of experience working in software and technology, I have performed the duties associated with virtually all of the commonly recognized roles: from analyst to project manager, configuration manager to IT support, and developer to CIO. These many experiences coalesced shortly after Y2K into a career focused on helping organizations improve software system performance to enhance user experience and enable smooth growth, while avoiding speed, stability, and scalability catastrophes in a fiscally responsible manner.

Friday, September 2, 2011

Thoughts on Agile & Agile Testing

This past weekend, I finally made time to start reading Agile Testing: A Practical Guide For Testers And Agile Teams, Lisa Crispin & Janet Gregory, Addison-Wesley (2009). I made it through the first two chapters before life called me away. After I put the book down and started going about a mundane series of errands, I realized that I was feeling disappointed, and that the disappointment had started growing just a few pages into the book. Not because of what the book had to say; what it said was pretty good. Not exactly how I would have expressed a few things, but such is the plight of a writer reading what someone else has written on a topic they also care and write about. What was disappointing me was the fact that the stuff in those chapters needed to be said at all.

You see, as Lisa and Janet were describing what Agile Testing and testing on Agile teams is all about, and explaining how it is “different” from “traditional testing”, my first thought was:

Tuesday, August 23, 2011

STP Online Summit: Achieving Business Value with Test Automation

Due to the overwhelming success and positive reviews of the last STP Online Summit: Business Value of Performance Testing, we've decided to do it again -- only this time, we're going to explore Achieving Business Value with Test Automation.

Join me (while I continue practicing my radio host skills for my emergency back-up career as a sportscaster) and 7 other presenters that I consider to be elite practitioners, teachers, and thinkers in their test automation areas of specialization for 3 half days online to learn their tips and methods for achieving business value with test automation. If you or your organization are using, or thinking about using, automation to enhance or improve your testing, you're not going to want to miss this online summit. I honestly can't think of anywhere else you can get this concentration of relevant and thematically targeted information at a better price, but you be the judge:

When: Tuesday, October 11, 10:00 AM - Thursday, October 13, 1:30 PM PST

Cost: $195 USD before 9/26/11; $245 USD after 9/26/11

Theme: For more than 15 years, organizations have been investing in the promise of better, cheaper, and faster testing through automation. While some companies have achieved demonstrable business value from their forays into test automation, many others have experienced questionable to negative returns on their investments. Join your host, Scott Barber, for this three-day online summit to hear how seven recognized leaders in test automation have achieved real business value by implementing a variety of automation flavors and styles for their employers and clients. Learn how to answer the ROI question by focusing on business value instead of testing tasks, and how to implement automation in ways that deliver that value to the business, not just to the development and/or test team.



Thursday, August 4, 2011

Scott Barber Interviewed by Matt Heusser; Podcast

Two part podcast on the STP site. I say some interesting stuff... or at least I say some stuff that's interesting to me. :)

Twist #52 - With Scott Barber

Twist #53 - The Return of the Barber
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Monday, August 1, 2011

Performance Testing Practice Named During Online Summit

Last week, I hosted STP's Online Performance Summit, a 3 half-day, 9-session, live, interactive webinar. As far as I know, this was the first multi-presenter, multi-day, live webinar by testers for testers. The feedback from attendees and presenters that I have seen has all been very positive, and personally, I think it went very well. On top of that, I had a whole lot of fun playing "radio talk show host".

The event sold out early at 100 attendees, with more folks wanting to attend but unable to do so. Since this was an experiment of sorts in terms of format and delivery, we committed to the smallest and least expensive level of service from the webinar technology provider, and by the time we realized we had more interest than "seats", it was simply too late to make the necessary service changes to accommodate more folks. We won't make that mistake again for our next online summit, to be held October 11-13 on the topic of "Achieving Business Value with Test Automation". Keep your eyes on the STP website for more information about that and other future summits.

With all of that context, now to the point of this post. During Eric Proegler's session (Strategies for Performance Testing Integrated Sub-Systems), a conversation emerged in which it became apparent that many performance testers conduct some kind of testing that involves real users interacting with the system under test while a performance/load/stress test is running, for the purposes of:
  • Linking the numbers generated through performance tests to the degree of satisfaction of actual human users.
  • Identifying items that human users classify as performance issues that do not appear to be issues based on the numbers alone.
  • Convincing stakeholders that the only metric we can collect that can be conclusively linked to user satisfaction with production performance is the percent of users satisfied with performance during production conditions.
The next thing that became apparent was that everyone who engaged in the conversation called this something different. So we didn't do what one would justifiably expect a bunch of testers to do (i.e., have an ugly argument about whose term came first or is more correct, continuing until no decision is made and all goodwill is lost). Instead, we held a contest to name the practice. We invited the speakers and attendees to submit their ideas, from which we'd select a name for the practice. The stakes were that the submitter of the winning entry would receive a signed copy of Jerry Weinberg's book Perfect Software, and that the speakers and attendees would use and promote the term.

The speakers and attendees submitted nearly 50 ideas. The speakers voted that list down to their top 4, and then the attendees voted for their favorite. In a very close vote, the winning submission from Philip Nguyen was User Experience Under Load (congratulations Philip!).