Showing posts with label Lessons. Show all posts

Monday, December 5, 2011

10 Must-Know Tips for Performance Test Automation

More than other types of automation, bad performance test automation leads to:
  • Undetectably incorrect results
  • "Good" release decisions based on bad data
  • Surprising, catastrophic failures in production
  • Incorrect hardware purchases
  • Extended down-time
  • Significant media coverage and brand erosion
More than other automation, performance test automation demands:
  • Clear objectives (not pass/fail requirements)
  • Valid application usage models
  • Detailed knowledge of the system and the business
  • External test monitoring
  • Cross-team collaboration
Unfortunately, bad performance test automation is:
  • Very easy to create,
  • Difficult to detect, and
  • More difficult to correct.
The following 10 tips, based on my own experiences, will help you avoid creating bad performance test automation in the first place.

Tip #10: Data Design
  • *Lots* of test data is essential (at least 3 sets per simulated user – 10 is not uncommon)
  • Test data should be unique and minimally overlapping (updating the same database row 1,000 times has a very different performance profile than updating 1,000 different rows)
  • Consider changed/consumed data (without careful planning, a search will return different results, and an item to be purchased may be out of stock)
  • Don’t share your data environment (see above)
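The data-design points above can be sketched concretely. Below is a minimal Python sketch (the field names, record counts, and CSV layout are illustrative assumptions, not from the post) that generates unique, non-overlapping records for each simulated user and verifies that no two virtual users would ever touch the same row:

```python
import csv
import itertools

def generate_user_datasets(num_users, sets_per_user=3):
    """Generate unique, non-overlapping test data records.

    Each simulated user gets `sets_per_user` records; a global counter
    guarantees no record is shared between users, so no two virtual
    users ever update the same row during a test run.
    """
    counter = itertools.count(1)
    datasets = {}
    for user in range(1, num_users + 1):
        records = []
        for _ in range(sets_per_user):
            n = next(counter)
            records.append({
                "username": f"loadtest_user_{n:06d}",
                "email": f"loadtest_user_{n:06d}@example.test",
                "order_id": 1_000_000 + n,  # distinct row per record
            })
        datasets[user] = records
    return datasets

def write_datasets(datasets, path):
    """Write all records to a CSV a load tool can parameterize from."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["username", "email", "order_id"])
        writer.writeheader()
        for records in datasets.values():
            writer.writerows(records)

if __name__ == "__main__":
    data = generate_user_datasets(num_users=1000, sets_per_user=3)
    # Guard against accidental overlap before the test run starts.
    all_ids = [r["order_id"] for recs in data.values() for r in recs]
    assert len(all_ids) == len(set(all_ids)), "overlapping test data!"
    write_datasets(data, "perf_test_data.csv")
```

The uniqueness assertion is the important part: it catches the "1,000 updates to the same row" mistake at data-generation time, before it silently skews a test result.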

Tuesday, November 29, 2011

10 Things About Testing That Should Die

I've taken some heat for discussing the whole "is test dead" concept due to a feeling that I was validating the concept of testing being unnecessary. Allow me to clarify my position. I do not believe, for one heartbeat, that testing as an activity is in any way unnecessary. I do believe that there are things related to the current state of and common beliefs about testing that should die. With that said...

Scott Barber's Top 10 Things About Testing That Should Die: 

10. Egocentricity

Face reality, testers: neither the product nor the business revolves around you. Think about it. No business, no product, no developers => no need for testers. In fact, if developers wrote perfect code, there'd be no need for you. You are a service provider, and your primary clients are the managers, developers, and/or executives. Your secondary clients are product users and investors. So stop whining and stomping your feet when your clients don't make decisions you like with the information you provide. It's not your call. If you want it to be your call, get on track to become a project manager, product manager, or executive; otherwise, get right with the fact that you provide a service (hopefully a valuable one) and get back to providing it.

Tuesday, November 8, 2011

On the Alleged Death of Testing

Out of respect for your time, I'll give you the bottom line up front as a simulated interview that I privately hoped for, but never came. After the mock interview is a supporting narrative for those of you more interested in my thinking on the matter.

Q: There's been a lot of talk recently about testing being dead, so my first question is: is testing dead?
A: No.

Q: Some of those talking about the alleged death of testing are saying that it's not that testing as a whole is dead, but that testing as it is commonly understood today is dead. Is it?
A: No.

Q: Ok, so is testing as it is commonly understood today dying?
A: Not that I can see.

Q: Then why all the talk about testing "as we know it" being dead?
A: IMHO? Wishful thinking.

Thursday, October 27, 2011

WOPR 17, my takeaways


The Workshop On Performance and Reliability (WOPR) 17 was held Oct 20-22, 2011 on the theme of “Finding Bottlenecks”.  This was a historic event in the sense that no other peer workshop inspired by LAWST has convened this many times.  Of course, as a co-founder of WOPR, I’m (somewhat unreasonably) proud of this accomplishment.  But the fact that over the last 9 years so many folks have been so inspired by the community and value of WOPR that they have been willing to volunteer their time to plan and organize these events, that their companies have been willing to donate meeting space (and often food & goodies), and that participants have frequently been willing to pay their own way (sometimes taking vacation time) to attend makes 17 events, one every 6 months since WOPR 1, a significant achievement – whether or not my “founder’s pride” is justified.  :)

As is the tradition of WOPR, 20-25 folks, selected or invited by the “content owner” (a.k.a. the person or team who chose the theme to be explored this time), brought their personal experiences related to “Finding Bottlenecks” to share and explore with one another.  Also as is the tradition, certain patterns and commonalities emerged as these experiences were described and discussed. Everyone has their own take, there are no official findings, and I’m not even going to pretend that I can attribute all the contributing experiences and/or conversations to my takeaways below.

  • Finding bottlenecks can be technically challenging, examples include: 
    •  Analyzing the test & the data is far from straightforward
    •  The “most useful” tools to narrow down the bottleneck may not be available – forcing us to be technically “creative” to work around those roadblocks.   
  • Finding bottlenecks can be *very* socio-politically challenging, examples include:
    • Lack of Trust (e.g. “That’s not a bottleneck, that’s the tool!”)
    • Denial (e.g. “It’s not possible that’s related to my code!”)
    • Lack of cross-team collaboration (e.g. “No, you can’t install that monitor on *our* system!”)
  • Sometimes human bottlenecks need to be resolved before technical bottlenecks can be found. (e.g. Perf Team being redirected, resources being re-allocated, excessive micromanagement, etc.)
Some other relevant and interesting topics came up (such as the frequent discrepancy between tester/technical goals & business goals), but since these weren’t “on theme” we didn’t discuss them deeply enough for me to draw any conclusions other than “the points and positions that did come up were consistent with what I would have anticipated if I’d thought about it in advance”, which, for me, is a nice confirmation.

My point in sharing these thoughts on finding bottlenecks is so that all the folks out there who feel like theirs is the only organization that is thwarted by socio-political challenges even more than technical ones can realize that they really aren’t alone.

The findings of WOPR17 are the result of the collective effort of the workshop participants: AJ Alhait, Scott Barber, Goranka Bjedov, Jeremy Brown, Dan Downing, Craig Fuget, Dawn Haynes, Doug Hoffman, Paul Holland, Pam Holt, Ed King, Richard Leeke, Yury Makedonov, Emily Maslyn, Greg McNelly, John Meza, Blaine Morgan, Mimi Niemiller, Eric Proegler, Raymond Rivest, Bob Sklar, Roland Stens, and Nishi Uppal.
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Monday, October 17, 2011

Having lunch with a giant...

I "officially" started my career in software performance in Feb of 2000, only to realize much later that I'd started down that path years prior.  In the fall of 2001 (10 years ago), I felt I was stagnating in my self-guided education and went on a hunt for books, articles, training, and/or people to learn from.  I found some peers (and eventually co-founded WOPR with Ross Collard to maximize peer learning) and I found 3 "giants" on whose shoulders I've stood since then (meaning all of my work was and has remained consistent with, complementary to, and/or an extension of their work in the field).  Those "giants" are Connie Smith, Ph.D. (Software Performance Engineering), Daniel Menasce, Ph.D. (Capacity and Scalability Planning), and Alberto Savoia (Performance Testing).

Last fall, I had the honor of being on a panel with Connie and spending some time talking to Daniel during the CMG conference in Orlando.  I'd never spoken or corresponded with them before that, but it was nice to meet them and we had some great conversations.

Over the years, however, I have corresponded regularly with Alberto Savoia.  As it turns out, he was moving on to what he would now call his next "it" from software performance just as I was becoming known in the industry, so we didn't converse regularly at first, but we did follow each other's careers.  During that time, I drew a lot of inspiration from Alberto – not just from the work he'd done in the software performance space, but also from his other accomplishments in technology, from the kind and complimentary recommendations he gave me, and from his graciously agreeing to write a foreword for Performance Testing Guidance for Web Applications when I asked.

So earlier this year when I had the chance, I dropped everything to review and comment on his new "it", Pretotyping. He said the review was helpful and that some of what I'd commented on would be included in the next version.

Today, I finally met Alberto face to face.  We had lunch.  We talked about projects & passions old and new, we recalled history and speculated about the future.  He gave me a signed copy of Pretotype It, and I gave him a signed copy of Web Load Testing for Dummies, both of which had been prepared in advance.  And while Alberto has accomplished far more in his technology career than I have, somehow I didn't feel like I was having lunch with the "giant" on whose shoulders most of the work I am known for stands; I felt like I was having lunch with an old friend I hadn't seen in too long.

To some of you, I suspect this seems a silly thing for me to be making a big deal about, but for a guy who left a small town twenty-some-odd years ago, never imagining that I'd meet anyone "famous", let alone become a "celebrity" of sorts in my (admittedly very small) field, it means a lot to me that someone who I've often credited as being a luminary to me, would not only take the time to have lunch with me, but to share thoughts and ideas with me like friends do.

So, thanks Alberto.  Thanks for the years of inspiration & thanks for the confirmation of friendship.  It means a lot to me, and know that you've provided me with another lead I intend to follow with anyone I may inspire during my career and later have the opportunity to meet.



Thursday, October 13, 2011

Top 10 Automation Tips from STP Online Summit

I had the pleasure of hosting the second Online Summit, delivered by Software Test Professionals: Achieving Business Value With Test Automation.  The online summit format consists of 3 sessions each for 3 consecutive days.  The sessions for this summit were:
One of my duties as host was to try to summarize the most valuable nuggets of information from across all of the presentations into a "top tips" list.  This is what I came up with:

Scott's Top 10 Automation Tips from:


Thursday, September 29, 2011

Making Every Test Count

This is from a while back, but I wouldn't call it dated.  It's a webinar, it runs for 48 min.  I like it, for whatever that's worth.  ;)

Abstract:

Do you ever find yourself wondering what the point is of executing this test... again!?!  Have you ever felt like the purpose of a test is to ensure there is a check mark in a particular check box?  Are you ever asked to get *more* information in even less time with even fewer resources than the last test project you worked on?

In this presentation, Scott Barber will introduce you to a variety of tips and techniques you can apply to virtually any testing you do as you strive to make every test you execute add value to the project.



Friday, September 2, 2011

Thoughts on Agile & Agile Testing

This past weekend, I finally made time to start reading Agile Testing: A Practical Guide For Testers And Agile Teams, Lisa Crispin & Janet Gregory, Addison-Wesley (2009).  I made it through the first two chapters before life called me away.  After I put the book down and started going about a mundane series of errands, I realized that I was feeling disappointed, and that the disappointment had started growing just a few pages into the book.  Not because of what the book had to say – what it said was pretty good; not exactly how I would have expressed a few things, but such is the plight of a writer reading what someone else has written on a topic they also care and write about.  What was disappointing me was the fact that the stuff in those chapters needed to be said at all.

You see, as Lisa and Janet were describing what Agile Testing and Testing on Agile Teams was all about, and explaining how it is “different” than “traditional testing”, my first thought was:

Thursday, August 4, 2011

Scott Barber Interviewed by Matt Heusser; Podcast

Two part podcast on the STP site. I say some interesting stuff... or at least I say some stuff that's interesting to me. :)

Twist #52 - With Scott Barber

Twist #53 - The Return of the Barber
 

Monday, August 1, 2011

Performance Testing Practice Named During Online Summit

Last week, I hosted STP's Online Performance Summit, a 3 half-day, 9 session, live, interactive webinar. As far as I know, this was the first multi-presenter, multi-day, live webinar by testers for testers. The feedback from attendees and presenters that I have seen has all been very positive, and personally, I think it went very well. On top of that, I had a whole lot of fun playing "radio talk show host".

The event sold out early at 100 attendees, with more folks wanting to attend but unable to. Since this was an experiment of sorts in terms of format and delivery, we had committed to the smallest and least expensive level of service from the webinar technology provider, and by the time we realized we had more interest than "seats", it was simply too late to make the necessary service changes to accommodate more folks. We won't be making that mistake again for our next online summit, to be held October 11-13 on the topic of "Achieving Business Value with Test Automation". Keep your eyes on the STP website for more information about that and other future summits.

With all of that context, now to the point of this post. During Eric Proegler's session (Strategies for Performance Testing Integrated Sub-Systems), a conversation emerged in which it became apparent that many performance testers conduct some kind of testing that involves real users interacting with the system under test while a performance/load/stress test is running, for the purposes of:
  • Linking the numbers generated through performance tests to the degree of satisfaction of actual human users.
  • Identifying items that human users classify as performance issues that do not appear to be issues based on the numbers alone.
  • Convincing stakeholders that the only metric we can collect that can be conclusively linked to user satisfaction with production performance is the percent of users satisfied with performance during production conditions.
The next thing that became apparent was that everyone who engaged in the conversation called this something different. So we didn't do what one would justifiably expect a bunch of testers to do (i.e. have an ugly argument about whose term came first and which is more correct, continuing until no decision is made and all goodwill is lost). Instead, we held a contest to name the practice. We invited the speakers and attendees to submit their ideas, from which we'd select a name for the practice. The stakes were that the submitter of the winning submission would receive a signed copy of Jerry Weinberg's book Perfect Software, and that the speakers and attendees would use and promote the term.

The speakers and attendees submitted nearly 50 ideas. The speakers voted that list down to their top 4, and then the attendees voted for their favorite. In a very close vote, the winning submission from Philip Nguyen was User Experience Under Load (congratulations Philip!).

Monday, April 11, 2011

What being a Context-Driven Tester means to me

I guess it’s that time again.  What time is that, you ask?  It’s the time when discussion/debate flares up over Context-Driven. I’m not going to weigh in on the whole discussion of pros/cons, value/distraction, etc.  I am a consultant.  I am Context-Driven (and not just as a tester, it's simply the way I have operated since long before I was a tester and long before I became aware someone had coined a term and composed a set of principles around how I already operated).  The license plate on my car says “CONTEXT”. It works for me.  But my point isn’t to convince you that it’s right for you.  My point is to address a comment that I frequently hear that *feels* very sad to me.

"Where I work, I don’t have the freedom or authority to implement all this Context-Driven stuff, so I guess I don’t get to be part of the club."
I find this sad, because I don’t agree.  It is my opinion that “Where I work, I don’t have the freedom or authority…” *is* a "driving context".  Making smart decisions about what you are empowered to choose, and appropriately trying to inform/educate those who are "driving your context" that there are other options, qualifies as being Context-Driven... at least to me.

What follows is something I drafted for an org that had recently decided that it wanted to adopt the principles of being Context-Driven, but didn’t want to inadvertently offend members whose context was largely dictated by decisions outside of their sphere of influence.  Due to a wide variety of unrelated circumstances, what I wrote never got presented to the org & got lost and forgotten on my hard drive.  I recently found it and wanted to share it with everyone because I think it’s valuable.

Monday, September 21, 2009

Thorkil Sonne: Recruit Autistics

Wired.com ran their smart list today. If you aren't familiar with it or don't care, at least check out the great press fellow software tester, entrepreneur, and social innovator Thorkil Sonne is getting for Specialisterne here:

http://www.wired.com/techbiz/people/magazine/17-10/ff_smartlist_sonne

While you're at it, why not digg it!

I know there is a lot that we testers disagree about, but if there is one thing we should be able to agree upon, it's that Thorkil, Specialisterne, and the very special people they serve deserve our support and best wishes. I can only hope that this is the spark that gets this (and other such responsible programs) moving globally. While that would certainly make me happy for Thorkil, the real winners when this takes off will be those who would finally find themselves filling jobs well suited to their skills, those who are reluctantly (and often poorly) doing those jobs now, and their employers, who can reassign those reluctant folks to something *they* are better suited for and will complain about less (we all hope).

Congratulations Thorkil & Specialisterne!!!
 

Saturday, January 3, 2009

A misleading benchmark...

No further commentary needed.

Dilbert.com
 

Friday, December 5, 2008

Latest Column -- The controversy surrounding the schools of software testing

My latest column...

Periodically, discussions break out in various software testing communities around the Web regarding the schools of software testing.

As I write this, there are discussions going on in SQAForums, on the Software-Testing Yahoo! group, and on various blogs that (at least up to the time I started writing this piece) reside on or are fed to Testing Reflections. In principle, I'm always pleased when these discussions break out. The point of identifying the schools in the first place was to increase the overall awareness of the diversity in ideologies, practices, and values (i.e. schools of thought) in our field and to stimulate discussion about the situational pros and cons of each. That said, the discussions that actually take place tend to drift off in one or more directions that end up being disappointing, unnecessarily confrontational, and generally not useful.

After witnessing this pattern, participating in these recent discussions, and listening to comments from those who followed the discussions for several years, I've identified several areas in which these discussions go awry. Below, I call those out and share my thoughts about each.

Read the rest of the column.
 

Friday, November 7, 2008

Latest Column -- Testing training: Disturbing behaviors of students

My latest column...

Drive-by training. Never heard of it? It is exactly what it sounds like. You drive to a training facility (or an instructor drives to you), for a day or three the instructor delivers the pre-packaged training class, then everyone drives back home. It's not the best training model ever invented. There is generally no student assessment, and the only instructor/course provider accountability is reputation. Even so, many good ideas can be shared and lots of students come away feeling that it was well worth "the drive."

As it turns out, I've been delivering a lot of drive-by training to software testers this fall. That in itself isn't particularly noteworthy -- end-of-the-budget year is a popular time for drive-by training -- but something that is noteworthy is that I have noticed a rise in some disturbing behaviors among the individuals and organizations that select and attend drive-by training.
At first, I thought it was just me. But after an informal poll (and some lively discussions) with my employees and trainer friends in the testing realm, I became increasingly convinced that the behaviors I'm noticing are not exclusive to me and that I'm not the only one who thinks they are on the rise.

Read the rest of the column.
 

Sunday, October 5, 2008

Latest Column -- Software Testers are not helpless

My latest column...

During a coffee break at a class the other week, I overheard the following comment from one student to another:

Tester: "This stinks! All of my automated test scripts are broken and I can't seem to get the tool to work now that the developers have enabled Secure Sockets Layer. I'm going to have to work through the weekend."

I know that it's generally considered rude to eavesdrop, and ruder still to comment on a conversation you weren't invited to, but I figured that since I was teaching the class I'd be forgiven. Besides, I simply couldn't help myself.

Read the rest of the column.
 

Tuesday, August 19, 2008

Latest Column -- Avoid "Center of the Universe Syndrome"

My latest column, cautioning testers not to think they are the center of the development team's universe: http://searchsoftwarequality.techtarget.com/tip/0,289483,sid92_gci1325828,00.html

Saturday, June 28, 2008

Testing Lessons From Civil Engineering

Below is the paper I submitted as a prologue to an experience report, discussion, and (hopefully) additional research that I'm presenting for the first time during CAST08:

Engineers don’t look at the world the same way that testers do.  Engineers look at the world with an eye to solving problems.  Testers look at the world with an eye toward finding problems to solve.  This seems logical.  What is less logical is the fact that engineers – and I’m talking about the kind of engineers who deal with physical objects – seem to be much more sophisticated in their testing than software testers.  In fact, most of what I know about testing, I learned as a civil engineering student.  We didn’t call most of it testing.  We didn’t even identify it as anything other than “You really want to get this right.”  Maybe civil engineers test better than software testers because of the motivation to “get it right”.  Consider what happens when a piece of civil engineering, like a bridge, fails:

Monday, June 18, 2007

Software Testing Lessons from my Children

My most recent column has just been posted on TechTarget, in which I discuss some of the lessons I’ve learned from my children about software testing.
I had planned an entirely different topic for this month, but I’m sitting down to write this on Father’s Day while my sons (Nicholas, age 8, and Taylor, age 4) are napping, and realizing that I’ve never written about what I have learned about testing from my boys.
Before I share some of these lessons, let me first share a little about me and fatherhood. For all of the dedication, time, and passion I give to my career, it is not even comparable to the dedication, time, and passion I give to my boys. For example, I stopped consulting for a while so I could see my boys every day when they were young, because I couldn’t stand the thought of being on the road for their first steps, new words, and all of the other developmental wonders that occur on almost a daily basis during the first several years of life. When I went back to consulting, I started my own company—not because I wanted to run a company, but because I didn’t want to have to answer to anyone else when I chose not to travel during baseball season so I could coach my son’s team. In the same spirit, when I work from home, I frequently do so in a room with my boys, who are naturally curious about what I’m doing. Over the past few years of this, I’ve learned a lot of things about being a good tester from them. Some of the most significant are these:
    • Don‘t be afraid to ask "Why?"
    • Exploratory play is learning
    • Recording your testing is invaluable
    • "Intuitive" means different things to different people
    • Fast enough depends on the user
    • You can never tell what a user may try to do with your software
    • Sometimes the most valuable thing you can do is take a break
 
See the column for more behind the lessons.
 