Showing posts with label Value. Show all posts

Monday, December 5, 2011

10 Must-Know Tips for Performance Test Automation

More than other kinds of automation, bad performance test automation leads to:
  • Undetectably incorrect results
  • Seemingly good release decisions that are actually based on bad data
  • Surprising, catastrophic failures in production
  • Incorrect hardware purchases
  • Extended down-time
  • Significant media coverage and brand erosion
More than other kinds of automation, performance test automation demands:
  • Clear objectives (not pass/fail requirements)
  • Valid application usage models
  • Detailed knowledge of the system and the business
  • External test monitoring
  • Cross-team collaboration
Unfortunately, bad performance test automation is:
  • Very easy to create,
  • Difficult to detect, and
  • More difficult to correct.
The following 10 tips, based on my own experiences, will help you avoid creating bad performance test automation in the first place.

Tip #10: Data Design
  • *Lots* of test data is essential (at least 3 sets per user to be simulated – 10 is not uncommon)
  • Test data should be unique and minimally overlapping (updating the same row in a database 1000 times has a different performance profile than updating 1000 different rows)
  • Consider changed/consumed data (a search will provide different results, and an item to be purchased may be out of stock, without careful planning)
  • Don’t share your data environment (see above)
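To make these points concrete, here is a minimal sketch of the data-design idea: generating enough unique, minimally overlapping data sets per simulated user so that each virtual user iteration touches different rows. The schema (username/email/search term) and file name are hypothetical, purely for illustration:

```python
import csv
import random

def make_user_datasets(num_users, sets_per_user=3, out_path="test_data.csv"):
    """Generate at least `sets_per_user` unique data sets per simulated user.

    Each row gets a unique username/email (hypothetical schema), so load-test
    iterations hit different database rows instead of hammering the same one.
    """
    rows = []
    for user_id in range(num_users):
        for set_id in range(sets_per_user):
            # A unique suffix per (user, set) pair keeps data from overlapping.
            suffix = f"{user_id}-{set_id}"
            rows.append({
                "username": f"vuser_{suffix}",
                "email": f"vuser_{suffix}@example.com",
                # Vary search terms so search-result caching doesn't skew timings.
                "search_term": random.choice(["widget", "gadget", "gizmo"]),
            })
    # Most load tools can parameterize scripts from a CSV like this one.
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["username", "email", "search_term"])
        writer.writeheader()
        writer.writerows(rows)
    return rows

# e.g. 100 virtual users x 3 data sets each = 300 unique rows
data = make_user_datasets(num_users=100, sets_per_user=3)
```

Scaling `sets_per_user` up to 10 (per the tip above) is just a parameter change; the point is that the data volume is planned from the user model, not bolted on later.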

Tuesday, November 29, 2011

10 Things About Testing That Should Die

I've taken some heat for discussing the whole "is test dead" concept due to a feeling that I was validating the concept of testing being unnecessary. Allow me to clarify my position. I do not believe, for one heartbeat, that testing as an activity is in any way unnecessary. I do believe that there are things related to the current state of and common beliefs about testing that should die. With that said...

Scott Barber's Top 10 Things About Testing That Should Die: 

10. Egocentricity

Face reality, testers: neither the product nor the business revolves around you. Think about it. No business, no product, no developers => no need for testers. In fact, if developers wrote perfect code, there'd be no need for you. You are a service provider, and your primary clients are the managers, developers, and/or executives. Your secondary clients are product users and investors. So stop whining and stomping your feet when your clients don't make decisions you like with the information you provide. It's not your call. If you want it to be your call, get on track to become a project manager, product manager, or executive; otherwise, get right with the fact that you provide a service (hopefully a valuable one) and get back to providing it.

Monday, October 24, 2011

Best Ice Cream Practice

A Twitter conversation from Friday, Oct 21...

@TesterAB Anna Baik: What's best practice for icecream? I don't know what flavour of icecream I should eat, and I'm afraid to get it wrong.
  • @skillinen Sylvia Killinen: @TesterAB Test ALL the ice cream, that way you'll know which one best satisfies conformance. :)
  • @adampknight Adam Knight: @TesterAB it's vanilla, if you're not eating vanilla you are doing it wrong. I'd suggest getting yourself CVM certified as soon as you can
    • @adampknight  Adam Knight: @sbarber @TesterAB we should be specific. I'll clarify in my "10 ways to check if you are truly vanilla" blog post #BestIceCreamPractice
    • @sbarber Scott Barber: @adampknight @TesterAB Certified Valuation Manager ™? No, no, no, that's only appropriate for *children's* icecream! #BestIceCreamPractice
  • @testingqa Guy Mason: @TesterAB Best to go for that which you most prefer at that point in time?
    • @TesterAB Anna Baik: @testingqa No no no. There must be one flavour of icecream that is best for everybody to eat at all points in time.
    • @TesterAB Anna Baik: @sbarber @testingqa Yes! None of this wishy-washy nonsense, I only want to eat the BEST flavour of icecream. #BestIceCreamPractice
    • @sbarber Scott Barber: @TesterAB @testingqa So chocolate, pistachio, lemon sorbet, raspberry swirl, topped with caramel & orange soda, right? #BestIceCreamPractice
    • @TesterAB Anna Baik: @sbarber @testingqa #BestIceCreamPractice Finally, someone who'll give me an answer! ...wait. How do I know you're qualified?
    • @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice I founded a non-profit to establish BICP qualification stds and issued myself a certification.
    • @TesterAB Anna Baik: @sbarber @testingqa #BestIceCreamPractice Sounds reassuring, I knew there'd be an Official Body somewhere to tell me what icecream to eat
    • @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice The invoice for my services are in the mail. $400/hr + $25,000 for the BICP flavor report.
    • @TesterAB Anna Baik: @sbarber @testingqa Eeek! Don't I even get something to show to people to prove I now know #BestIceCreamPractice?
    • @sbarber Scott Barber: @TesterAB @testingqa #BestIceCreamPractice when check clears we mail you a Certified BICP Practitioner Certificate (suitable for framing)
Questions?  No? Didn't think so.  :)

 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Thursday, October 13, 2011

Top 10 Automation Tips from STP Online Summit

I had the pleasure of hosting the second Online Summit delivered by Software Test Professionals: Achieving Business Value With Test Automation.  The online summit format consists of three sessions on each of three consecutive days.  The sessions for this summit were:
One of my duties as host was to try to summarize the most valuable nuggets of information from across all of the presentations into a "top tips" list.  This is what I came up with:

Scott's Top 10 Automation Tips from:


Friday, September 30, 2011

Agile backlash series...

From SearchSoftwareQuality.com:

Agile backlash series: Exploring Agile development problems and solutions


 I think Jan Stafford did a great job on this series.  I don't agree with every opinion from everyone interviewed, but I wouldn't expect to.  I think it's fair, honest, insightful, and (best of all) focuses on experiences, challenges, and ideas about overcoming challenges instead of theory, marketing fluff, and excessive exaggeration.  :)

Of course, I'm always happy when someone is willing to publish quotes of mine like the following excerpts from Why Agile should not marginalize software testers:

"SSQ: You come in frequently to integrate testing into Agile development. What kind of problems do you see organizations having when integrating testing?

Scott Barber: The first thing that I hear about is, ‘What do we need testers for if we’re doing Agile? Isn’t everyone in Agile a generalist?’

Thursday, September 29, 2011

Making Every Test Count

This is from a while back, but I wouldn't call it dated.  It's a webinar, it runs for 48 min.  I like it, for whatever that's worth.  ;)

Abstract:

Do you ever find yourself wondering what the point is to executing this test... again!?!  Have you ever felt like the purpose of a test is to ensure there is a check mark in a particular check box?  Are you ever asked to get *more* information in even less time with even fewer resources than the last test project you worked on?

In this presentation, Scott Barber will introduce you to a variety of tips and techniques you can apply to virtually any testing you do as you strive to make every test you execute add value to the project.



Wednesday, July 20, 2011

CloudTest Lite - A Game Changer in the Performance Tool Market

Yesterday, SOASTA announced their new product, CloudTest Lite (Press Release). It's not common that I get excited about a tool product release, but this is different. This product has the potential to change the market for the better.
Scratch that. I'll be shocked if it doesn't change the market for the better.

Why is that, you ask? Consider the following attributes of CloudTest Lite:
  • It's a fully featured, easy to learn and use, enterprise class, modern, performance testing tool for web & mobile applications
  • All you need to use it is a reasonably modern machine connected to the internet and a web browser.
    • You don't need to buy, install, configure or maintain load generation machines.
    • The "license" is tied to your personal credentials, so you can design, create, execute, and analyze your tests from any machine you want without needing to figure out how to point to the license server, or how to get onto the corporate network from your favorite internet cafe.
    • You can even do much of the design, test enhancement, and analysis entirely off-line.
  • You can simulate up to 100 virtual users any time you want. No more scheduling time on the controller days or weeks in advance guessing the app will be ready for your test. No more having to wait until your next scheduled time to re-run your test when you see something 'wonky' in your data.
  • It's free.
    • Yes, I said free.
    • As in, you never need to pay a dime. Not today, not when the trial expires, not a year from now to continue your maintenance contract.
    • That's right, it is free from now until the sun explodes (or at least until well beyond when anything we're building or planning to build today is long gone and forgotten)
Imagine the implications:

Wednesday, September 23, 2009

Testing vs. Checking ... my 2 cents.

I was pleased to see Michael Bolton's series on Testing vs. Checking. If you haven't been following, here is what I consider to be the central thread of the topic (and the unfortunately inevitable fallout that seems to happen in "testerland" almost any time someone says something that makes sense).
From Michael:
From James Bach:
From Scott Barber:

Tuesday, October 16, 2007

From the Mailbox: What makes software "good" or "bad"?

I was asked the question below (lightly edited for anonymity, clarity, and length) today and found it intriguing, so I thought I'd post it here.
 
The Question:
This is an attempt to understand how (and why) users, practitioners, and professionals perceive the difference between a good software product and a bad software product, specifically released software products.
My Response:

Monday, July 30, 2007

Hourly Rant...

I just finished answering a question Esther Schindler posted on LinkedIn while researching an article she is working on for CIO.com.

She asks (summarized):

"There's just one question to answer: If you could get the (client) boss(es) to understand JUST ONE THING about computer consulting and contracting, what would it be?

Or, to put the same question another way: If you were given a single wish of something to change (about a current or past client) what would it be?"

My response (lightly edited from the original):