Showing posts with label Scott Barber. Show all posts

Tuesday, October 23, 2007

From The Web: "Noncertified IT pros earn more..."

Stop the presses! Can it be true? The industry wants effective, qualified, multi-dimensional people who are capable of understanding business drivers & risk mitigation and applying that in a sapient way to their job as opposed to folks who paid someone to teach them how to pass a multiple-choice exam?!? Amazing!
 
Noncertified IT pros earn more than certified counterparts: survey
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Thursday, October 18, 2007

From the Mailbox: Software Development: Art or Science?

Here’s a question that I didn’t realize I had much to say about until I read my own response.
 
The Question:
Software Development: Is it an art or a science? An age old question I know, but what do you think and why?
My Response:

Tuesday, October 16, 2007

From the Mailbox: What makes software "good" or "bad"?

I was asked the question below (lightly edited for anonymity, clarity, and length) today and found it intriguing, so I thought I'd post it here.
 
The Question:
This is an attempt to understand how (and why) users, practitioners, and professionals perceive the difference between a good software product and a bad software product, specifically released software products.
My Response:

Friday, August 3, 2007

Classify Performance Tests: IVECTRAS

This is the second installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
 
Other posts about performance testing heuristics and mnemonics are:
 
 
For over seven years now I have struggled first to figure out, and then to explain, all the different "types" of performance tests. You know the ones:
 
  • Performance Test
  • Load Test
  • Stress Test
  • Spike Test
  • Endurance Test
  • Reliability Test
  • Component Test
  • Configuration Test
  • {insert your favorite word} Test
 
Well, I finally have an alternative.
 
IVECTRAS
IVECTRAS is valuable for classifying performance tests (or test cases, if you prefer that term) and performance test objectives. Better still, it maps easily to Criteria, Requirements, Goals, Targets, Thresholds, Milestones, Phases, Project Goals, Risks, Business Requirements, Scripts, Suites, Test Data, etc. Best of all, you can use it as a heuristic to help determine performance testing objectives and to shape performance test design. So what is it?
 
To determine, design, or classify a performance test objective or test, ask: is this an
 
INVESTIGATION or VALIDATION
of END-TO-END or COMPONENT
response TIMES and/or RESOURCE consumption
under ANTICIPATED or STRESSFUL conditions
 
For me (and my clients, since I came up with this) there is a lot less confusion in saying "We need to INVESTIGATE COMPONENT-level RESOURCE consumption for the application server under STRESSFUL conditions" than in saying "We need to do a unit stress test against the application server." Even if questions remain after applying IVECTRAS, at least those questions should be more obvious -- and if nothing else, *that* adds value for me.
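The three either/or choices (plus the and/or on what gets measured) lend themselves to a small data model. The sketch below is my own illustration of the idea, not part of Scott's formulation; the enum names and the `describe` helper are assumptions chosen to echo the IVECTRAS wording:

```python
from dataclasses import dataclass
from enum import Enum

class Purpose(Enum):
    INVESTIGATION = "investigate"
    VALIDATION = "validate"

class Scope(Enum):
    END_TO_END = "end-to-end"
    COMPONENT = "component-level"

class Measure(Enum):
    TIMES = "response times"
    RESOURCES = "resource consumption"

class Conditions(Enum):
    ANTICIPATED = "anticipated"
    STRESSFUL = "stressful"

@dataclass
class PerfTestObjective:
    purpose: Purpose
    scope: Scope
    measures: tuple      # one or both Measure values ("and/or")
    conditions: Conditions

    def describe(self) -> str:
        """Phrase the objective the way IVECTRAS reads aloud."""
        what = " and ".join(m.value for m in self.measures)
        return (f"{self.purpose.value.capitalize()} {self.scope.value} "
                f"{what} under {self.conditions.value} conditions")

# The example objective from the paragraph above:
obj = PerfTestObjective(Purpose.INVESTIGATION, Scope.COMPONENT,
                        (Measure.RESOURCES,), Conditions.STRESSFUL)
print(obj.describe())
```

Because each dimension is an explicit type, a suite of objectives can be filtered or grouped by any axis (all the STRESSFUL tests, all the COMPONENT tests, and so on) instead of by ambiguous labels like "spike test."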

Monday, July 30, 2007

Hourly Rant...

I just finished answering a question Esther Schindler posted on LinkedIn while researching an article she is working on for CIO.com.

She asks (summarized):

"There's just one question to answer: If you could get the (client) boss(es) to understand JUST ONE THING about computer consulting and contracting, what would it be?

Or, to put the same question another way: If you were given a single wish of something to change (about a current or past client) what would it be?"

My response (lightly edited from the original):

Monday, June 18, 2007

Software Testing Lessons from my Children

My most recent column has just been posted on TechTarget, in which I discuss some of the lessons I've learned from my children about software testing.
I had planned an entirely different topic for this month, but I'm sitting down to write this on Father's Day while my sons (Nicholas, age 8, and Taylor, age 4) are napping, and realizing that I've never written about what I have learned about testing from my boys.
Before I share some of these lessons, let me first share a little about me and fatherhood. For all of the dedication, time, and passion I give to my career, it is not even comparable to the dedication, time, and passion I give to my boys. For example, I stopped consulting for a while so I could see my boys every day when they were young, because I couldn't stand the thought of being on the road for their first steps, new words, and all of the other developmental wonders that occur almost daily during the first several years of life.

When I went back to consulting, I started my own company, not because I wanted to run a company, but because I didn't want to have to answer to anyone else when I chose not to travel during baseball season so I could coach my son's team. In the same spirit, when I work from home, I frequently do so in a room with my boys, who are naturally curious about what I'm doing. Over the past few years of this, I've learned a lot about being a good tester from them. Some of the most significant lessons are these:
    • Don't be afraid to ask "Why?"
    • Exploratory play is learning
    • Recording your testing is invaluable
    • "Intuitive" means different things to different people
    • Fast enough depends on the user
    • You can never tell what a user may try to do with your software
    • Sometimes the most valuable thing you can do is take a break
 
See the column for the stories behind these lessons.
 

Monday, May 21, 2007

Performance Testing Core Principles: CCD IS EARI

This is the first installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
 
Other posts about performance testing heuristics and mnemonics are:
 
 
There is no "one-size-fits-most" approach to performance testing, but I have become rather convinced that there are nine principles that are (almost always) applied, or at least actively considered, in successful performance testing projects. I remember those principles by remembering:
 
CCD IS EARI
  • Context: Project context is central to successful performance testing.
  • Criteria: Business, project, system, & user success criteria.
  • Design: Identify system usage, and key metrics; plan and design tests.
  • Install: Install and prepare environment, tools, & resource monitors.
  • Script: Implement test design using tools.
  • Execute: Run and monitor tests. Validate tests, test data, and results.
  • Analyze: Analyze the data individually and as a cross-functional team.
  • Report: Consolidate and share results, customized by audience.
  • Iterate: "Lather, rinse, repeat" as necessary.
 
For more see Developing an approach to performance testing -- CCD IS EARI.
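As a rough sketch, the nine principles can be carried around as an ordered checklist that a project walks through and loops over. The phase names come from the mnemonic above; the `next_principles` helper is hypothetical, my own illustration of treating the mnemonic as a working checklist:

```python
# The nine CCD IS EARI principles, in mnemonic order.
CCD_IS_EARI = [
    "Context", "Criteria", "Design", "Install",
    "Script", "Execute", "Analyze", "Report", "Iterate",
]

def next_principles(considered):
    """Return the principles not yet actively considered, in order.

    "Iterate" showing up at the end of every result is the built-in
    reminder to lather, rinse, repeat as necessary.
    """
    done = set(considered)
    return [p for p in CCD_IS_EARI if p not in done]

print(next_principles(["Context", "Criteria", "Design"]))
```

A checklist like this makes it easy to spot, in a project retrospective, which principles were skipped rather than consciously considered.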

Wednesday, January 31, 2007

Resumes in Context

On a forum related to James Bach's Rapid Software Testing On-line (Beta) class (which I highly recommend! A few more technical issues to work out and it should be ready for prime time), another student (Anne Marie Martin, from Atlanta) posted the following (lightly edited):
Here's something I struggle with, though, and would love to hear thoughts on. I have about 11 years of experience in testing, and I try to invest time in learning more about testing, and learning more in general that can help me with testing -- things like what we've all been discussing about philosophy and learning and Weinberg, and a hundred other things that have tickled my brain during our discussions and threads and made my 'to do' list of things to read, explore, or learn from.

Monday, November 20, 2006

What Best Practices really are. -- CIO Article

Of all the places I expected to find an article supporting the view that "Best Practices" is nothing more than a square on someone's buzzword-bingo card, CIO wasn't one of them. The highlights are these...
Using celebs for endorsements has become such best practice that everyone does it. So what is best practice about it? Nothing. The phrase is simply a demonstration of how cliched business language dresses up the concept of copying something someone else has done. And when lots of companies copy the copier, it becomes dull, intellectually stagnant and offers no competitive advantage. It's just a me-too strategy executed by the cynical, the lazy, or the lazy cynics.

Friday, November 17, 2006

Happy About Global Software Test Automation

I just posted this review for Hung Nguyen's new book on Amazon. All you testers and test managers out there, slip this book under your boss's door when they aren't looking and watch how quickly the company starts embracing and respecting software testing!

***

Happy About Global Software Test Automation: A Discussion of Software Testing for Executives is an absolute must-read for any executive in a company that develops, customizes, or implements software.

Sunday, November 12, 2006

Modeling Application Usage Visually, Google Tech Talk

Some folks have said that I should get this on my blog, so here it is. If you like it, rate it... if you don't... umm... well... let your conscience be your guide. ;)

Modeling Application Usage Visually


Description: Google TechTalks, April 24, 2006. Scott Barber is the CTO of PerfTestPlus, Inc. and Co-Founder of the Workshop on Performance and Reliability (WOPR). Scott's particular specialties are testing and analyzing performance for complex systems, developing customized testing methodologies, group facilitation, and authoring instructional materials.

Abstract: Modeling application usage is more than just parsing log files and calculating page frequencies. Whether we are analyzing navigation path effectiveness, planning for scenario testing, documenting performance test workload models, or mapping services or objects to user activity, having a single, intuitive picture to reference makes the job easier. In this session, we'll explore a highly adaptable method for visualizing application usage and how to use this model to improve cross-functional team communication without requiring team members to invest time learning some new fad of a modeling language that they'll probably never use again. This method references UCML™, which has been described as "what collaboration diagrams should have been."

Wednesday, November 8, 2006

4-Second Rule?

It looks like Juniper Research has finally done away with the 8-second rule in favor of a 4-second rule. I want to point something out right up front: this new "rule" is based on a survey that asks the question...
Question: Typically, how long are you willing to wait for a single Web page to load before leaving the Web site? (Select one.)
A. More than 6 seconds.
B. 5-6 seconds.
C. 3-4 seconds.
D. 1-2 seconds.
E. Less than 1 second.
Sorry, Juniper. I promise that if we sat down with your respondents and asked them how many seconds various pages actually took to load, MOST of them would get it wrong, and MOST of those who got it wrong would *think* a page takes longer to load than it actually does. Review the report for yourself here: http://www.akamai.com/html/about/press/releases/2006/press_110606.html
 

Friday, October 6, 2006

Thoughts on Certification

I got an email asking a question about certification that I thought others might find interesting.

Hello,

I'm new to the QA arena and haven't found a mentor yet, beyond the publications of those like yourselves. So far, I don't see that there is one internationally accepted certification for QA in general. I know there is the CSTE (http://www.softwarecertifications.org/) and the ISTQB (http://www.sqe.com/certification.asp?f=dis&ci=stf), which at least one of you worked on. My perception is that CSTE is a bit more accepted (when I search dice.com for both acronyms, I get a few more hits for CSTE, but still not many), but otherwise certification is specific to tools like WinRunner or to languages.


I have also found that CompTIA (www.comptia.org) suggests these:

CompTIA A+
CompTIA e-Biz+
CompTIA i-Net+
CompTIA Server+
Certiport's Internet & Computing Core Certification - IC³

But no one else inside QA seems to have heard of them. Are they helpful, or is CompTIA just trying to earn money? Are there general certs that help QA?

Thanks for any insight you can offer! Keep up the great QA work!

and my response...

Friday, July 14, 2006

Choosing Performance Testing with Scott Barber (Stickyminds interview reprint)

A Word with the Wise:
Choosing Performance Testing with Scott Barber
by Joseph McAllister

Every kid eventually puts some thought into the question "What do you want to be when you grow up?" For PerfTestPlus CTO Scott Barber, who specializes in context-driven performance testing and analysis for distributed multi-user systems, the answer was not "performance tester." He planned to follow in the footsteps of his father, an industrial arts teacher, and sought an ROTC-scholarship-funded degree in civil engineering. In his junior year of college, though, Scott learned that his first years with the Army Corps of Engineers would involve digging foxholes for infantry rather than building bridges with the Seabees.

"I decided that if I was going to be crossing the front lines, I'd much rather be carrying heavy weaponry than heavy shovels," he says.