Showing posts with label Testing. Show all posts

Monday, April 2, 2012

Let's Test 2012

The first (as far as anyone I know is aware) Context-Driven conference in Europe is quickly approaching. On May 7-9, 2012 in Stockholm, Sweden, Let's Test "A European conference on context-driven testing - for testers, by testers" will take place.

This is a CAST inspired conference, meaning that it focuses on in-depth exploration of topics, includes facilitated discussion as part of every talk (i.e. speakers don't get to "run out of time" as soon as they hear that "hard question") and conferring only increases between and after sessions. It's a fabulous format! If you haven't experienced it, and you are passionate about testing, you really want to -- it will change your perspective on conferences forever.

I am proud to say that I will not only be attending Let's Test 2012, but that I am honored to be on the program with some first-run content that I'm very excited about:

A Full Day Tutorial: Context Appropriate Performance Testing, from Simple to Rocket Science
A Keynote: Testing Missions in Context: From Checking to Assessment
 

Tuesday, March 27, 2012

Software Quality Assurance Engineer... Happiest job?!?

If you haven't seen this article, you want to read it:

http://finance.yahoo.com/blogs/secrets-toyour-success/happiest-jobs-america-173044519.html

About halfway down, it says:
The happiest job of all isn't kindergarten teacher or dentist. It's software quality assurance engineer. Professionals with this job title are typically involved in the entire software development process to ensure the quality of the final product. This can include processes such as requirements gathering and documentation, source code control, code review, change management, configuration management, release management, and the actual testing of the software, explains Matt Miller, chief technology officer at CareerBliss.
With an index score of 4.24, software quality assurance engineers said they are more than satisfied with the people they work with and the company they work for. They're also fairly content with their daily tasks and bosses.

These professionals "typically make between $85,000 and $100,000 a year in salary and are the gatekeepers for releasing high quality software products," Miller says. Organizations generally will not allow software to be released until it has been fully tested and approved by their software quality assurance group, he adds.
So I have a bunch of comments:
  1. I guess I don't know what a "Software Quality Assurance Engineer" is -- or this Matt Miller guy doesn't. 
  2. *If* anyone "ensures the quality of the final product" in software, it's a PM or higher.
  3. I don't think I've met anyone with that title who smiled and told me how much they love their job.
  4. I'm certain I've never met someone with that title who makes that much money. 
  5. I think I'd rather shoot myself in the head than have those tasks... even at such a generous salary.
I could go on, but I'll stop.  I want to see the survey questions, & I want to know the demographics of the people surveyed, & I want to see the titles actually reported by respondents that got rolled up under "Software Quality Assurance Engineer." I'd also like to have a word or 73 with this Matt Miller dude... CTO to CTO, 'cause let's face it, we all know that testers wouldn't be caught dead bragging about how *happy* their job makes them, or how *satisfying* it is. Testers tend to love the act of testing, but not their jobs, or their bosses, or their companies -- and if this ain't referring to testers, I wanna know why these process people are apparently so happy about being forced to do the actual testing on top of their "real" jobs.


Feel free to share your thoughts, but this strikes me as "not *even* wrong" to a degree that I can't seem to even reverse-engineer a single measurement dysfunction that could account for all the ways in which this article strikes me as "just not right".

 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
Director, Computer Measurement Group
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Friday, March 23, 2012

Trust is a Cornerstone to Delivering Business Value

In my last post about Metrics I introduced the notion of trust as it relates to Business Value by stating:
"Failing to trust 'the Business' does NOT add Business Value"
I'd like to generalize that statement further to say "A lack of trust that individuals or groups involved in the project are primarily focused on helping the business succeed undermines business value".

Now, I can only imagine the reaction many testers are having while reading this. For instance "If I trust the developer when they say 'This is fine, you don't need to test it', we'll have major bugs make it to production." And anyone thinking that would be absolutely right -- because that is not the *kind* of trust I'm talking about.

When I say trust, I don't mean "Trust others to tell you how to do your job" or "Trust others to do what you believe is correct/best" or even "Trust others to be successful in accomplishing what they have been assigned to accomplish on time, on mission, on quality, and on budget"

When I say trust, I mean "Trust others to approach their role with integrity" and "Trust that others are doing the best they can to make the decisions or take the actions appropriate to their role and responsibilities based on the information they have" and "Trust that if you haven't been assigned to do or to be the decision maker about something, that task or decision is better handled by someone else -- whether or not *you* have the information necessary to make sense out of why."

Monday, March 19, 2012

10 Take Aways from STP Summit on Agile Transitions

I had the pleasure of hosting the fourth Online Summit, delivered by Software Test Professionals: Agile Transitions.  The online summit format consists of 3 sessions each for 3 consecutive days.  The sessions for this summit were:
One of my duties as host was to try to summarize the most valuable nuggets of information from across all of the presentations into a "top take aways" list.  This is what I came up with:

Scott's Top 10 Take Aways from:

Friday, March 16, 2012

Business Value of Testing: Find Bugs ≠ Mission

My introduction to software testing was as a performance tester. Before the completion of my first project, I had a firm understanding of the primary mission of performance testing. That understanding has changed very little to this day, though I have improved how I communicate that mission. Currently, I phrase the (generic) primary mission of performance testing as:
"To find how many, how fast, how much and for how long; to assist business w/ related technical and/or business decisions; and/or to support related improvement activities as prioritized by the business."
That certainly doesn't mean that all performance testing efforts include every aspect of that mission, but I'm hard pressed to imagine something that I'd call performance testing that includes none of the aspects of that mission. It is also true that I have experienced all manner of secondary missions as part of a performance testing effort (missions related to fail-over, and denial of service attack security, for example). Those combinations, permutations, variations and additions are all where context comes into play.

However, when I was first asked to help out with some functional testing, I quickly realized that I didn't really know what was expected of me, so I asked. As you might imagine, folks looked at me like I'd grown a second head... a green one... with scales and horns. After about the 4th time I asked the question, got a funny look, watched as it became clear I was serious and received some variant of the answer:
"Find bugs"

Friday, March 9, 2012

Context-Driven Testing Crossroads: Addendum

I guess I wasn't as done talking about this as I thought. Earlier today, I posted the following comment (except with a few extra typos that I chose to fix below) on Tim Western's blog in response to his post Is the Context Driven School of Testing - Dead?:
"A point that I think many miss is that this is not just about individual testers.

50 years ago (more or less) testING began fighting a rather arduous battle to establish an identity separate from developMENT. This, eventually, led to testERS establishing an identity separate from developERS.

Saturday, March 3, 2012

Context-Driven School (of thought): "I'm not dead yet... I feel happy!"

This is Part III in a series of entries related to the following quote from the "about page" of context-driven-testing.com hosted by Cem Kaner:
"...However, over the past 11 years, the founders have gone our separate ways. We have developed distinctly different visions. If there ever was one context-driven school, there is not one now..."
If you haven't done so already, I recommend starting with:


Ok, so maybe not "happy" but I couldn't resist the Monty Python reference.

James Bach stated on his latest blog update (Context-Driven Testing at a Crossroads):
"I’m the last of the founders of the Context-Driven School, as such, who remain true to the original vision. I will bear its torch along with any fellow travelers who wish to pursue a similar program."

Thursday, March 1, 2012

With the Context-Driven School "closed" what's next?

This is Part II in a series of entries related to the following quote from the "about page" of context-driven-testing.com hosted by Cem Kaner:

"...However, over the past 11 years, the founders have gone our separate ways. We have developed distinctly different visions. If there ever was one context-driven school, there is not one now..."
If you haven't done so already, I recommend starting with Part I: Is Testing Dead? Dunno, but the Context-Driven School Is


Much like when one completes an educational program at one institution and ponders whether or not to enroll in another program (and if so, which one), or to enter the workforce and continue their learning along the professional development or self-education path, I think it's fair for those who have come to self-identify as members of the Context-Driven School to be asking themselves similar questions.

And much like completing an educational program does not equate to losing the lessons learned (as opposed to the lessons taught) in the program, the Context-Driven Principles and the lessons many of us have learned by studying in (or, for that matter, rebelling against) the Context-Driven School remain despite Cem's announcement that (in my words) the school is now closed.

Sunday, December 25, 2011

Curse of the Performance Tester?

Seriously?!? After wrapping gifts until nearly 5am (I was behind even by my standards, due mostly to work travel, client commitments & preparing to close the corporate books for 2011), and getting up before 8am to celebrate Christmas with my boys, I finally stole a few minutes -- when I noticed they'd both fallen asleep on the couch -- to play with *my* new toy (i.e. install Skyrim), only to be foiled by...

Monday, December 5, 2011

10 Must-Know Tips for Performance Test Automation

More than other automation, bad performance test automation leads to:
  • Undetectably incorrect results
  • "Good" release decisions, based on bad data
  • Surprising, catastrophic failures in production
  • Incorrect hardware purchases
  • Extended down-time
  • Significant media coverage and brand erosion
More than other automation, performance test automation demands:
  • Clear objectives (not pass/fail requirements)
  • Valid application usage models
  • Detailed knowledge of the system and the business
  • External test monitoring
  • Cross-team collaboration
Unfortunately, bad performance test automation is:
  • Very easy to create,
  • Difficult to detect, and
  • More difficult to correct.
The following 10 tips, based on my own experiences, will help you avoid creating bad performance test automation in the first place.

Tip #10: Data Design
  • *Lots* of test data is essential (at least 3 sets per user to be simulated – 10 is not uncommon)
  • Test data should be unique and minimally overlapping (updating the same row in a database 1000 times has a different performance profile than updating 1000 different rows)
  • Consider changed/consumed data (a search may return different results, and an item to be purchased may be out of stock, without careful planning)
  • Don’t share your data environment (see above)
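As a minimal sketch of the data-design points above: the post doesn't include code, so the file name, column names, and helper function here are all illustrative assumptions. Only the "at least 3 sets per simulated user" figure and the unique/minimally-overlapping goal come from the tip. The idea is to pre-generate a distinct block of rows for each simulated user so no two users ever update the same record:

```python
import csv

def build_test_data(num_users, sets_per_user=3, out_path="vuser_data.csv"):
    """Generate unique, minimally overlapping test data: one contiguous
    block of rows per simulated user, so no two users share a record."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["vuser_id", "account", "search_term"])
        for vuser in range(num_users):
            for i in range(sets_per_user):
                # seq is unique across all (user, iteration) pairs,
                # so every account and search term appears exactly once
                seq = vuser * sets_per_user + i
                writer.writerow([vuser, f"acct{seq:06d}", f"term-{seq}"])
    return num_users * sets_per_user

rows = build_test_data(num_users=100, sets_per_user=3)
print(rows)  # 300 unique rows written
```

Partitioning by simulated-user id up front is one way to preserve the "minimally overlapping" property without any run-time coordination between load generators; each script instance simply reads its own slice of the file.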

Tuesday, November 29, 2011

10 Things About Testing That Should Die

I've taken some heat for discussing the whole "is test dead" concept due to a feeling that I was validating the concept of testing being unnecessary. Allow me to clarify my position. I do not believe, for one heartbeat, that testing as an activity is in any way unnecessary. I do believe that there are things related to the current state of and common beliefs about testing that should die. With that said...

Scott Barber's Top 10 Things About Testing That Should Die: 

10. Egocentricity

Face reality, testers: neither the product nor the business revolves around you. Think about it. No business, no product, no developers => no need for testers. In fact, if developers wrote perfect code, there'd be no need for you. You are a service provider, and your primary clients are the managers, developers, and/or executives. Your secondary clients are product users and investors. So stop whining and stomping your feet when your clients don't make decisions you like with the information you provide. It's not your call. If you want it to be your call, get on track to become a project manager, product manager, or executive; otherwise, get right with the fact that you provide a service (hopefully a valuable one) and get back to providing it.

Tuesday, November 8, 2011

On the Alleged Death of Testing

Out of respect for your time, I'll give you the bottom line up front as a simulated interview that I privately hoped for, but never came. After the mock interview is a supporting narrative for those of you more interested in my thinking on the matter.

Q: There's been a lot of talk recently about testing being dead, so my first question is testing dead?
A: No.

Q: Some of those talking about the alleged death of testing are saying that it's not that testing as a whole is dead, but that testing as it is commonly understood today is dead. Is it?
A: No.

Q: Ok, so is testing as it is commonly understood today dying?
A: Not that I can see.

Q: Then why all the talk about testing "as we know it" being dead?
A: IMHO? Wishful thinking.

Thursday, October 13, 2011

Top 10 Automation Tips from STP Online Summit

I had the pleasure of hosting the second Online Summit, delivered by Software Test Professionals: Achieving Business Value With Test Automation.  The online summit format consists of 3 sessions each for 3 consecutive days.  The sessions for this summit were:
One of my duties as host was to try to summarize the most valuable nuggets of information from across all of the presentations into a "top tips" list.  This is what I came up with:

Scott's Top 10 Automation Tips from:


Tuesday, October 4, 2011

Is Junosphere the world's first cloud testing environment? Not really - IT in Context


I don't get too irked by companies coining new phrases to make subtle marketing distinctions in services, but when they do it just so they can make first/best claims, it flips my bozo bit. Seriously, if your service is so bland or weak that you need to invent a new term so you can claim it's the "best thing called blah" without being called out for fraud, maybe you should just improve your service.


Tuesday, August 23, 2011

STP Online Summit: Achieving Business Value with Test Automation

Due to the overwhelming success and positive reviews of the last STP Online Summit: Business Value of Performance Testing, we've decided to do it again -- only this time, we're going to explore Achieving Business Value with Test Automation.

Join me (while I continue practicing my radio host skills for my emergency back-up career as a sportscaster) and 7 other presenters that I consider to be elite practitioners, teachers, and thinkers in their test automation areas of specialization for 3 half days online to learn their tips and methods for achieving business value with test automation. If you or your organization are using, or thinking about using, automation to enhance or improve your testing, you're not going to want to miss this online summit. I honestly can't think of anywhere else you can get this concentration of relevant and thematically targeted information at a better price, but you be the judge:

When: Tuesday October 11 10:00AM - Thursday October 13 1:30PM PST

Cost: $195 USD before 9/26/11 $245 USD after 9/26/11

Theme: For more than 15 years organizations have been investing in the promise of better, cheaper, and faster testing through automation. While some companies have achieved demonstrable business value from their forays into test automation, many others have experienced questionable to negative returns on their investments. Join your host, Scott Barber, for this three day online summit, to hear how seven recognized leaders in test automation have achieved real business value by implementing a variety of automation flavors and styles for their employers and clients. Learn how to answer the ROI question by focusing on business value instead of testing tasks, and how to implement automation in ways that deliver that value to the business, not just to the development and/or test team.



Thursday, August 4, 2011

Scott Barber Interviewed by Matt Heusser; Podcast

Two part podcast on the STP site. I say some interesting stuff... or at least I say some stuff that's interesting to me. :)

Twist #52 - With Scott Barber

Twist #53 - The Return of the Barber
 

Tuesday, June 7, 2011

Uruguay surpasses the world with a professional development program for software testers.

The Centro de Ensayos de Software (CES), a non-profit software testing laboratory in Uruguay, has recently launched a program that is certain to become the new “gold standard” in professional development for software testers.  The program, endorsed by the Universidad de la Republica (Uruguay), the Universidad Castilla La Mancha (Spain), and sanctioned by the Uruguayan IT Chamber (CUTI), is the most comprehensive, affordable, and publicly available training program for software testers on the market.  Based on my market research and comprehensive review of the program, I have no reservation in rating it as market leading.

Software Testing, the software development activity responsible for identifying issues with software and providing a wide variety of quality-related information to stakeholders and decision-makers prior to release, is the primary job of many millions worldwide, yet the majority of software testers learn their craft entirely on the job.  Yes, there are various “take a class or two, pass an information-based (not skill-based) test, and receive a certification” programs – some more respectable than others and most far more expensive than the CES program.  There is even a new certificate coming to market that involves three one-month online courses where students are taught and assessed by experienced testers and university professors, but none of those rise to the level of the CES’s program.

Monday, April 11, 2011

What being a Context-Driven Tester means to me

I guess it’s that time again.  What time is that, you ask?  It’s the time when discussion/debate flares up over Context-Driven. I’m not going to weigh in on the whole discussion of pros/cons, value/distraction, etc.  I am a consultant.  I am Context-Driven (and not just as a tester, it's simply the way I have operated since long before I was a tester and long before I became aware someone had coined a term and composed a set of principles around how I already operated).  The license plate on my car says “CONTEXT”. It works for me.  But my point isn’t to convince you that it’s right for you.  My point is to address a comment that I frequently hear that *feels* very sad to me.

"Where I work, I don’t have the freedom or authority to implement all this Context-Driven stuff, so I guess I don’t get to be part of the club."
I find this sad, because I don’t agree.  It is my opinion that “Where I work, I don’t have the freedom or authority…” *is* a "driving context".  Making smart decisions about what you are empowered to choose, and appropriately trying to inform/educate those who are "driving your context" that there are other options, qualifies as being Context-Driven... at least to me.

What follows is something I drafted for an org that had recently decided that it wanted to adopt the principles of being Context-Driven, but didn’t want to inadvertently offend members whose context was largely dictated by decisions outside of their sphere of influence.  Due to a wide variety of unrelated circumstances, what I wrote never got presented to the org & got lost and forgotten on my hard drive.  I recently found it and wanted to share it with everyone because I think it’s valuable.

Monday, September 21, 2009

Thorkil Sonne: Recruit Autistics

Wired.com ran their smart list today. If you aren't familiar with it or don't care, at least check out the great press fellow software tester, entrepreneur, and social innovator Thorkil Sonne is getting for Specialisterne here:

http://www.wired.com/techbiz/people/magazine/17-10/ff_smartlist_sonne

While you're at it, why not digg it!

I know there is a lot that we testers disagree about, but if there is one thing we should be able to agree upon, it's that Thorkil, Specialisterne, and the very special people they serve deserve our support and best wishes. I can only hope that this is the spark that gets this (and other such responsible programs) moving globally. While that would certainly make me happy for Thorkil, the real winners when this takes off will be those who would finally find themselves filling jobs well suited to their skills, those who are reluctantly (and often poorly) doing those jobs now, and their employers who can reassign those reluctant folks to something *they* are better suited for and will complain about less (we all hope).

Congratulations Thorkil & Specialisterne!!!
 

Friday, December 5, 2008

Latest Column -- The controversy surrounding the schools of software testing

My latest column...

Periodically, discussions break out in various software testing communities around the Web regarding the schools of software testing.

As I write this, there are discussions going on in SQAForums, on the Software-Testing Yahoo! group, and on various blogs that (at least up to the time I started writing this piece) reside on or are fed to Testing Reflections. In principle, I'm always pleased when these discussions break out. The point of identifying the schools in the first place was to increase overall awareness of the diversity in ideologies, practices, and values (i.e. schools of thought) in our field and to stimulate discussion about the situational pros and cons of each. That said, the discussions that actually take place tend to drift off in one or more directions that end up being disappointing, unnecessarily confrontational, and generally not useful.

After witnessing this pattern, participating in these recent discussions, and listening to comments from those who followed the discussions for several years, I've identified several areas in which these discussions go awry. Below, I call those out and share my thoughts about each.

Read the rest of the column.
 