Monday, December 17, 2007
Some time back, I blogged about a book I'd contributed to significantly being made available as a free PDF download. (see the entry here)
Well, the book quietly appeared in "dead tree format" (as Stuart Moncrieff put it in his blog post about the book) a couple of weeks ago, and I've been getting light-heartedly scolded by some of my friends and readers for not making a big announcement, so here's my "big announcement."
Performance Testing Guidance for Web Applications, by J.D. Meier, Scott Barber, Carlos Farre, Prashant Bansode, and Dennis Rea, is now available on Amazon.
Reviewed by: Alberto Savoia, Ben Simo, Cem Kaner, Chris Loosley, Corey Goldberg, Dawn Haynes, Derek Mead, Karen N. Johnson, Mike Bonar, Pradeep Soundararajan, Richard Leeke, Roland Stens, Ross Collard, Steven Woody, Alan Ridlehoover, Clint Huffman, Edmund Wong, Ken Perilman, Larry Brader, Mark Tomlinson, Paul Williams, Pete Coupland, and Rico Mariani.
The best part is that you can buy the book on Amazon, download the PDF, browse the HTML, or do any combination of the above.
Tuesday, October 23, 2007
From The Web: "Noncertified IT pros earn more..."
- Stop the presses! Can it be true? The industry wants effective, qualified, multi-dimensional people who are capable of understanding business drivers and risk mitigation and applying that understanding in a sapient way to their job, as opposed to folks who paid someone to teach them how to pass a multiple-choice exam?!? Amazing!
- Noncertified IT pros earn more than certified counterparts: survey
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me
Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing
"If you can see it in your mind...
you will find it in your life."
Thursday, October 18, 2007
From the Mailbox: Software Development: Art or Science?
- Here’s a question that I didn’t realize I had much to say about until I read my own response.
- The Question:
Software Development: Is it an art or a science? An age old question I know, but what do you think and why?
- My Response:
Tuesday, October 16, 2007
From the Mailbox: What makes software "good" or "bad"?
- I was asked the question below (lightly edited for anonymity, clarity, and length) today and found it intriguing, so I thought I'd post it here.
- The Question:
This is an attempt to understand how (and why) users, practitioners, and professionals perceive the difference between a good software product and a bad software product, specifically released software products.
- My Response:
Monday, September 17, 2007
Gentlemen, Start Your Engines!!!
- My most recent column, inspired by a surprise trip to the Brickyard 400, has just been posted on TechTarget; in it, I discuss the distinction between "delivery" and "done" when it comes to testing the performance of software systems.
Countless hours of development are now in the past. Testing indicates that everything is ready for the big day. The whole team is on hand, and the world is watching. It's the moment of truth; time to find out if all of the hard work is going to pay off. Anticipation builds until the command is given…
"Gentlemen, start your engines!"
The cars come to life. They take a few pace laps and at last, the green flag drops. In fewer than 90 seconds the cars are back on the front stretch approaching speeds of 200 mph -- the pinnacle of stock car performance.
This summer I worked on a project in Indianapolis. Usually when I travel to remote client sites I fly home on the weekends, but there was one weekend that I chose to stay. I chose to stay for two reasons. First, the flights for that weekend were insanely expensive and second, I have some friends in Indianapolis whom I'm always happy to have an excuse to visit. As luck would have it, the flights were expensive because that was the weekend of the Brickyard 400, and one of the friends I wanted to spend time with had a spare ticket, which I shamelessly accepted when he offered.
During the pomp and circumstance leading up to the start of the race I realized what a fabulous example the race was of one of my most-quoted sound bites related to performance testing: "Don't confuse delivery with done."
- See the column for the rest of the story.
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me
Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing
"If you can see it in your mind...
you will find it in your life."
Saturday, August 25, 2007
Model Workloads for Performance Testing: FIBLOTS
- This is the third installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
- Other posts about performance testing heuristics and mnemonics are:
- Installment 1 - Performance Testing Core Principles: CCD IS EARI
- Installment 2 - Classify Performance Tests: IVECTRAS
- For years, I have championed the use of production logs to create workload models for performance testing. During the same period, I've been researching and experimenting with methods to quickly create "good enough" workload models that increase the value of performance tests even when empirical data isn't available. I recently realized that these two ideas are actually complementary, not mutually exclusive, and that with or without empirical usage data from production logs, I do the same thing, I:
- FIBLOTS.
- While the play on words makes this mnemonic particularly memorable, I'm not saying that I just make it up. Rather, the acronym represents the following guideword heuristics that have served me well over the years in deciding what to include in my workload models (a minimal sketch of the "Frequent" guideword follows the list).
- Frequent: Common application usage.
- Intensive: Resource-hogging activities.
- Business Critical: Include these even if they are both rare and low-risk.
- Legal: Stuff that will get you sued or not paid.
- Obvious: Stuff that is likely to earn you bad press.
- Technically Risky: New technologies, old technologies, places where it's failed before, previously under-tested areas.
- Stakeholder Mandated: Don't argue with the boss (too much).
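
To make the "Frequent" guideword concrete, here is a minimal sketch, mine rather than anything from the original post, of tallying request paths from a production access log so the most common activities can be weighted into the workload model. It assumes a web server log in common log format; the file name access.log and the field position are assumptions about that layout:

# A minimal sketch (my illustration, not from the post): tally request paths
# from a production access log to support the "Frequent" guideword.
# Assumes common log format; "access.log" is a hypothetical file name.
from collections import Counter

def tally_transactions(log_path):
    """Count how often each requested path appears in the log."""
    counts = Counter()
    with open(log_path) as log:
        for line in log:
            parts = line.split()
            if len(parts) > 6:          # crude guard against malformed lines
                counts[parts[6]] += 1   # request path field in common log format
    return counts

if __name__ == "__main__":
    counts = tally_transactions("access.log")
    total = sum(counts.values()) or 1
    for path, hits in counts.most_common(10):
        print(f"{path}: {hits} hits ({hits / total:.1%} of traffic)")

The other guidewords still have to come from conversations with the business, developers, and stakeholders; a tally only tells you what is frequent, not what is critical, legal, obvious, risky, or mandated.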
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me
Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing
"If you can see it in your mind...
you will find it in your life."
Friday, August 3, 2007
Classify Performance Tests: IVECTRAS
- This is the second installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
- Other posts about performance testing heuristics and mnemonics are:
- Installment 1 - Performance Testing Core Principles: CCD IS EARI
- Installment 3 - Model Workloads for Performance Testing: FIBLOTS
- I have struggled for over 7 years now with first figuring out and then trying to explain all the different "types" of performance tests. You know the ones:
- Performance Test
- Load Test
- Stress Test
- Spike Test
- Endurance Test
- Reliability Test
- Component Test
- Configuration Test
- {insert your favorite word} Test
- Well, I finally have an alternative.
- IVECTRAS
- IVECTRAS is valuable for classifying performance tests (or test cases, if you prefer that term) and performance test objectives. Better still, it is easy to map to Criteria, Requirements, Goals, Targets, Thresholds, Milestones, Phases, Project Goals, Risks, Business Requirements, Scripts, Suites, Test Data, etc. Better yet, you can use it as a heuristic to assist with determining performance testing objectives and designing performance tests. So what is it?
- To determine, design, or classify a performance test or test objective, ask whether it is an:
- INVESTIGATION or VALIDATION
- of END-TO-END or COMPONENT response TIMES and/or RESOURCE consumption
- under ANTICIPATED or STRESSFUL conditions
- For me (and for my clients, since I came up with this) there is a lot less confusion when one says "We need to INVESTIGATE COMPONENT level RESOURCE consumption for the application server under STRESSFUL conditions" than when one says "We need to do a unit stress test against the application server." Even if there are still questions to be answered after applying IVECTRAS, at least the questions should be more obvious -- and if nothing else, *that* adds value for me.
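
To illustrate (this sketch is mine, not something from the original post), here is one way the three IVECTRAS axes could be captured as a tiny data structure that spells an objective out in full; the class and field names are my own assumptions:

# A minimal sketch (an illustration, not part of the post) of using IVECTRAS
# to label a performance test objective. Only the three axes come from the
# mnemonic itself; the class and field names are assumptions.
from dataclasses import dataclass

@dataclass
class PerformanceTestObjective:
    intent: str      # "investigate" or "validate"
    scope: str       # "end-to-end" or "component"
    measures: list   # any of "response times", "resource consumption"
    conditions: str  # "anticipated" or "stressful"
    subject: str     # what is being tested, e.g. "the application server"

    def describe(self):
        return (f"{self.intent.upper()} {self.scope.upper()} "
                f"{' and '.join(m.upper() for m in self.measures)} "
                f"for {self.subject} under {self.conditions.upper()} conditions")

# Example: the application-server objective discussed above.
objective = PerformanceTestObjective(
    intent="investigate",
    scope="component",
    measures=["resource consumption"],
    conditions="stressful",
    subject="the application server",
)
print(objective.describe())

Run as-is, it prints a sentence much like the one quoted above, which is the whole point of the mnemonic: the classification reads as plain English rather than as a label like "unit stress test."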
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me
Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing
"If you can see it in your mind...
you will find it in your life."
Monday, July 30, 2007
Hourly Rant...
I just finished answering a question Esther Schindler posted on LinkedIn while researching an article she is working on for CIO.com.
She asks (summarized):
"There's just one question to answer: If you could get the (client) boss(es) to understand JUST ONE THING about computer consulting and contracting, what would it be?
Or, to put the same question another way: If you were given a single wish of something to change (about a current or past client) what would it be?"
My response (lightly edited from the original):
Monday, June 18, 2007
Software Testing Lessons from my Children
- My most recent column has just been posted on TechTarget; in it, I discuss some of the lessons I've learned from my children about software testing.
I had planned an entirely different topic for this month, but I'm sitting down to write this on Father's Day while my sons (Nicholas, age 8, and Taylor, age 4) are napping, and realizing that I've never written about what I have learned about testing from my boys.
Before I share some of these lessons, let me first share a little about me and fatherhood. For all of the dedication, time, and passion I give to my career, it is not even comparable to the dedication, time, and passion I give to my boys. For example, I stopped consulting for a while so I could see my boys every day when they were young because I couldn't stand the thought of being on the road for their first steps, new words, and all of the other developmental wonders that occur on almost a daily basis during the first several years of life. When I went back to consulting, I started my own company, not because I wanted to run a company, but because I didn't want to have to answer to anyone else when I chose not to travel during baseball season so I could coach my son's team. In the same spirit, when I work from home, I frequently do so in a room with my boys, who are naturally curious about what I'm doing. Over the past few years of this, I've learned a lot of things about being a good tester from them. Some of the most significant are these:
Don't be afraid to ask "Why?"
Exploratory play is learning
Recording your testing is invaluable
"Intuitive" means different things to different people
Fast enough depends on the user
You can never tell what a user may try to do with your software
Sometimes the most valuable thing you can do is take a break
- See the column for more behind each of these lessons.
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me
Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing
"If you can see it in your mind...
you will find it in your life."
Monday, May 21, 2007
Performance Testing Core Principles: CCD IS EARI
- This is the first installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
- Other posts about performance testing heuristics and mnemonics are:
- Installment 2 - Classify Performance Tests: IVECTRAS
- Installment 3 - Model Workloads for Performance Testing: FIBLOTS
- There is not a "one-size-fits-most" approach to performance testing, but I have become rather convinced that there are nine principles that are (almost always) applied, or at least actively considered, in successful performance testing projects. I remember those principles with the mnemonic (a small checklist sketch follows the list below):
- CCD IS EARI
- Context: Project context is central to successful performance testing.
- Criteria: Business, project, system, & user success criteria.
- Design: Identify system usage and key metrics; plan and design tests.
- Install: Install and prepare environment, tools, & resource monitors.
- Script: Implement test design using tools.
- Execute: Run and monitor tests. Validate tests, test data, and results.
- Analyze: Analyze the data individually and as a cross-functional team.
- Report: Consolidate and share results, customized by audience.
- Iterate: "Lather, rinse, repeat" as necessary.
- For more see Developing an approach to performance testing -- CCD IS EARI.
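
As a small illustration (mine, not part of the column linked above), the nine principles can be treated as an ordered checklist that a team walks through and reports on; the function and names below are assumptions for the sketch:

# A minimal sketch (my illustration, not from the column) that treats the
# CCD IS EARI principles as an ordered checklist a team can report against.
CCD_IS_EARI = [
    ("Context",  "Project context is central to successful performance testing."),
    ("Criteria", "Business, project, system, and user success criteria."),
    ("Design",   "Identify system usage and key metrics; plan and design tests."),
    ("Install",  "Install and prepare environment, tools, and resource monitors."),
    ("Script",   "Implement test design using tools."),
    ("Execute",  "Run and monitor tests; validate tests, test data, and results."),
    ("Analyze",  "Analyze the data individually and as a cross-functional team."),
    ("Report",   "Consolidate and share results, customized by audience."),
    ("Iterate",  "'Lather, rinse, repeat' as necessary."),
]

def checklist_report(completed):
    """Print each principle with a check mark if the team has addressed it."""
    for name, summary in CCD_IS_EARI:
        mark = "x" if name in completed else " "
        print(f"[{mark}] {name}: {summary}")

checklist_report(completed={"Context", "Criteria", "Design"})

Marking "Iterate" complete is, of course, mostly a reminder to start the loop again.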
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me
Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing
"If you can see it in your mind...
you will find it in your life."
Friday, March 30, 2007
Five Questions with Jon Bach, by the Braidy Tester
I met Jon about 3 years ago. It was a funny story, actually. I was at
STAREast talking with a bunch of folks at the bar after the last
presentation of the day. Some guy came over and introduced himself to
the person sitting next to me.
I heard his name and I stopped, mid-word, stood up excitedly, started shaking his hand and talking a mile-a-minute...
(Scott) "OhMyGod! Jon Bach! I'mSoExcitedToMeetYou! IReadYourBookAnd... I'm sorry, my name is Scott Barber, I've done some work with your brother..."
(Jon) "Wait! JamesToldMeAboutYou! You'reThePerformanceGuy! IRunATestingLabInSeattleAnd... How about we sit at the bar, I'll buy you a beer."
Jon has been one of my best friends ever since. Oh yeah, he's also one of the best testers and teachers of testers I've ever met. If you don't know Jon -- or even if you do -- Michael Hunter posted Five Questions With Jon Bach today. Take a look, it's a good read.
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me
Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing
"If you can see it in your mind...
you will find it in your life."
Thursday, March 22, 2007
Custom Performance Testing Search Engine
About 24 hours ago, Google Co-op publicly released the ability for folks to make their own CSEs (Custom Search Engines). From the site:
Harness the power of Google search
Create a highly specialized Custom Search Engine that reflects your knowledge and interests. Place it on your website and, using our AdSense for Search program, make money from the resulting traffic.
See examples of how a Custom Search Engine works.
What you can do with a Custom Search Engine
- Place a search box and search results on your website.
- Specify or prioritize the sites you want to include in searches.
- Customize the look and feel to match your website.
- Invite your community to contribute to the search engine.
Sound cool? I thought so. So cool, in fact, that it threw me into a fit of ADD obsession. It took me about 21 of the last 24 hours to do, but now there is a CSE just for performance testers. Just think about it: no more results for tuning sports cars, training for a marathon, or measuring employee productivity when searching for material related to software performance testing.
Check it out and let me know what you think!
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me
Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing
"If you can see it in your mind...
you will find it in your life."
Wednesday, January 31, 2007
Resumes in Context
On a forum related to James Bach's Rapid Software Testing On-line (Beta) class (which I highly recommend; a few more technical issues to work out and it should be ready for prime time), another student, Anne Marie Martin from Atlanta, posted the following (lightly edited):
Here's something I struggle with though, and would love to hear thoughts on. I have about 11 years' experience in testing, and try to invest time in learning more about testing, and learning more in general that can help me with testing - such as the things we've all been discussing about philosophy and learning and Weinberg and a hundred other things that have tickled my brain during our discussions and threads that made my 'to do' list of things to read or explore or learn from.