Showing posts with label What to Test.

Tuesday, August 20, 2013

Any Given Monday – Google, Microsoft and Amazon All Experience Outages

It started out like any other Monday morning. I woke up, got dressed, put my contacts in, and made my way to the kitchen for coffee. Along the way, I launched a browser and the mail client on my laptop (as I always do on “home office” days) and checked to make sure my son was up. After making coffee, with a few minutes to spare before it was time to drive my 14-year-old to school, I scanned the headlines in my newsfeed.

The top two headlines read:
I only got to read the hover-over teaser paragraphs before:

   a) I realized it was no longer like any other Monday morning and
   b) my son informed me it was time to go.

I am a link to the rest (and best) of this post

Do you have additional insight into, or were you impacted by any of these outages? Comment below.

--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

I don't produce software systems.
I help those who do produce software systems do it better.
I am a tester.

Tuesday, August 7, 2012

Can your website handle "instant celebrity"?

OK, so I feel a touch voyeuristic even admitting this, but while I was checking on the latest from the Olympics, I followed a link under Latest News -> Michael Phelps with a tagline of "What do you do after becoming the most accomplished Olympian in history? Date a model."

It was a tasteful piece about Michael bringing his (until now) "under the radar" girlfriend, Megan Rossee, to some public event. Having (apparently, like a lot of people) never heard of her, I clicked the link to her website (www.meganrossee.com) in the article. What I got for my curiosity was *far* better than a bunch of portfolio photos of a model. I got the following:

[Screenshot, not preserved here: meganrossee.com failing to load under its sudden surge of traffic.]

Friday, April 6, 2012

Desperately Seeking "Performance Unit Testing" Examples

I've been talking about what I term "Performance Unit Testing" in classes and training courses for a long time. I've been teaching client development teams about it (more accurately, inspiring them with hints toward implementation) for almost as long. The problem is, all I've got is stories I can't attribute (NDAs and such), and that simply doesn't cut it when trying to make a point to someone who doesn't (or doesn't want to) get it.

So I'm looking for attributable samples, examples, stories, and/or case studies related to "Performance Unit Testing" that I can use (clearly, with attribution) in talks, training classes, maybe even blogs & articles. If you have something, please email me.
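
To make the idea a bit more concrete while I wait for your stories, here's a rough sketch of the general shape such a test often takes. To be clear, this is illustrative only: the function, the iteration count, and the 50 ms budget are all invented, not taken from any real project.

    # Hypothetical sketch of a "performance unit test" in Python.
    # parse_order and the 50 ms budget are invented for illustration.
    import statistics
    import time
    import unittest

    def parse_order(raw):
        # Stand-in for the unit under test.
        return dict(field.split("=", 1) for field in raw.split(";"))

    class TestParseOrderPerformance(unittest.TestCase):
        def test_parse_order_stays_within_latency_budget(self):
            raw = ";".join(f"key{i}=value{i}" for i in range(100))
            samples = []
            for _ in range(200):  # enough runs to smooth out noise
                start = time.perf_counter()
                parse_order(raw)
                samples.append(time.perf_counter() - start)
            # Assert on a high percentile, not the mean, so one slow
            # outlier is tolerated but consistent slowness fails the build.
            p95 = statistics.quantiles(samples, n=20)[-1]
            self.assertLess(p95, 0.050, f"p95 was {p95 * 1000:.1f} ms")

    if __name__ == "__main__":
        unittest.main()

The point isn't the specific numbers; it's that a latency assertion lives in the unit test suite, so a performance regression fails the build the same way a functional one would.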

If you're not sure whether you've got what I'm looking for, lemme share some desired attributes:

Monday, October 3, 2011

Stop Cheating and Start Running Realistic Tests

I did a webinar with SOASTA on 9/29/2011; in case you missed it, I've copied the description and links from SOASTA's Info Center so you can have a look. If the Twitter-verse is to be believed, it didn't suck.  :)

--

Stop Cheating and Start Running Realistic Tests

Constrained by inflexible test hardware, poor tool scalability, exorbitant pricing models, and lack of real time performance information, performance testers have been forced to cheat for too long! Cloud Testing opens up elastic, full-scale load generation from global locations at affordable cost, rapid and accurate test building, and real time views of internal and external performance metrics.
  • Stop removing “think times” to work around technical or license issues
  • Build tests using real business workflow, not just a flood of page hits
  • Run tests that preserve session states and accurate timings, end-to-end
  • Inspect every component as tests run, not just from the outside-in
Watch the Webinar | Download the Webinar
 
--
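
To put the "think times" and "real business workflow" bullets into code terms, here's a rough sketch using Locust; any load tool that supports think times, weighted tasks, and per-user sessions would do just as well. The endpoints, credentials, and weights are made up for illustration.

    # Illustrative Locust script; endpoints, credentials, and weights are invented.
    from locust import HttpUser, task, between

    class ShopperUser(HttpUser):
        # Think time: real users pause between actions. Deleting this
        # to squeeze more load out of a license is exactly the "cheat"
        # the first bullet above warns about.
        wait_time = between(3, 10)

        def on_start(self):
            # Log in once per virtual user; self.client keeps cookies,
            # so session state is preserved across the whole workflow.
            self.client.post("/login", json={"user": "demo", "password": "demo"})

        @task(8)
        def browse_catalog(self):
            # Frequent, low-value action: most of the traffic.
            self.client.get("/products")

        @task(1)
        def checkout(self):
            # Rare but business-critical flow, kept in realistic proportion
            # rather than drowned out by a flood of page hits.
            self.client.post("/cart", json={"sku": "ABC-123", "qty": 1})
            self.client.post("/checkout")

Run it with something like "locust -f realistic_load.py --host https://your-app.example" and the weighted mix, the pauses, and the session state come along for free.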

Thursday, September 29, 2011

Making Every Test Count

This is from a while back, but I wouldn't call it dated.  It's a webinar that runs for 48 minutes.  I like it, for whatever that's worth.  ;)

Abstract:

Do you ever find yourself wondering what the point is of executing this test... again!?!  Have you ever felt like the purpose of a test is to ensure there is a check mark in a particular check box?  Are you ever asked to get *more* information in even less time, with even fewer resources, than on the last test project you worked on?

In this presentation, Scott Barber will introduce you to a variety of tips and techniques you can apply to virtually any testing you do, as you strive to make every test you execute add value to the project.



Saturday, August 25, 2007

Model Workloads for Performance Testing: FIBLOTS

This is the third installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
 
Other posts about performance testing heuristics and mnemonics are:
 
 
For years, I have championed the use of production logs to create workload models for performance testing. Over the same period, I've been researching and experimenting with methods to quickly create "good enough" workload models, without empirical data, that still increase the value of the performance tests. I recently realized that these two ideas are actually complementary, not mutually exclusive, and that with or without empirical usage data from production logs, I do the same thing. I:
 
FIBLOTS.
While the play on words makes this mnemonic particularly memorable, I'm not saying that I just make the model up. Rather, the acronym represents the following guideword heuristics, which have served me well over the years in deciding what to include in my workload models; a sketch of how such a list becomes a runnable workload mix follows the bullets.
 
  • Frequent: Common application usage.
  • Intensive: Resource-hogging activities.
  • Business Critical: Include these even if they are both rare and otherwise low-risk.
  • Legal: Stuff that will get you sued, or not paid.
  • Obvious: Stuff that is likely to earn you bad press.
  • Technically Risky: New technologies, old technologies, places where it’s failed before, previously under-tested areas.
  • Stakeholder Mandated: Don’t argue with the boss (too much).
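
Because a list of guidewords can feel abstract, here's a made-up sketch of where FIBLOTS leaves me: a weighted activity mix that a load script can sample from. The activities, tags, and weights are all invented for illustration.

    # Hypothetical FIBLOTS-derived workload mix; everything here is invented.
    import random

    # Each activity records the FIBLOTS guidewords that earned it a slot,
    # plus a relative weight (roughly, its expected share of sessions).
    workload = {
        "search_catalog":  {"why": ["Frequent"],                   "weight": 50},
        "generate_report": {"why": ["Intensive"],                  "weight": 10},
        "month_end_close": {"why": ["Business Critical", "Legal"], "weight": 5},
        "home_page_visit": {"why": ["Obvious"],                    "weight": 25},
        "new_upload_api":  {"why": ["Technically Risky"],          "weight": 5},
        "exec_dashboard":  {"why": ["Stakeholder Mandated"],       "weight": 5},
    }

    def pick_activity():
        # Choose the next virtual-user activity in proportion to its weight.
        names = list(workload)
        weights = [workload[name]["weight"] for name in names]
        return random.choices(names, weights=weights, k=1)[0]

    # Sanity check: the mix should account for 100% of sessions.
    assert sum(a["weight"] for a in workload.values()) == 100
    print(pick_activity())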