Showing posts with label Peak Performance.

Friday, September 6, 2013

Just Another Manic Cyber Monday: Are you Ready?

Once September rolls around, it seems like everyone's preparing for something, be it returning to school, the fantasy football season, corporate budget planning, or hunting for deals on end-of-model-year vehicles. For me, it's the time of year when I help people prepare for Cyber Monday, which has become the biggest online shopping day of the year.

So, is your website really ready to capitalize on all that buying fervor? Think about it. By September, your company is surely finalizing new products and marketing campaigns for the holiday season. But all those preparations will be for naught if your website isn't up to the challenge of increased holiday traffic – especially if your ops group doesn't have a system in place to monitor and react to the impact of that traffic in real time.

The truth is, if your organization doesn't have a strategy in place by early September, you have a scant few weeks remaining to put one together. After that window closes, you're at serious risk of becoming 'that company' – you know, the one that makes headlines this holiday season for a massive site outage instead of record sales numbers – and the risk grows with every week you delay. If your company sells products that people want to give as gifts for the holidays, Cyber Monday is likely to be the busiest day of the year for your website.

Read the rest of this post here.
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

I don't produce software systems.
I help those who do produce software systems do it better.
I am a tester.

Saturday, August 24, 2013

Any Given Thursday – Digging into Nasdaq’s 3-Hour Outage

This has been an uncharacteristically bad week for web performance, with several major and historically reliable services reporting outages due to "network issues". In my (not always so humble) opinion:
"Insufficient available bandwidth causing an outage, however, bothers me. A lot. There is absolutely no good reason for insufficient bandwidth to cause an outage. Maybe a slowdown, but if a flood of network traffic (not a flood of traffic to your site, just a whole bunch of traffic on the same network as your site) leads to an outage, something is wrong, at least in my book."
Read the rest of Any Given Thursday
Read part 1 of my commentary in Any Given Monday


Tuesday, August 20, 2013

Any Given Monday – Google, Microsoft and Amazon All Experience Outages

It started out like any other Monday morning. I woke up, got dressed, put my contacts in, and started making my way to the kitchen for coffee. Along the way, I launched a browser and the mail client on my laptop (as I always do on “home office” days) and checked to make sure my son was up. After making coffee, I had a few minutes before it was time to drive my 14-year-old to school, so I scanned the headlines in my newsfeed.

The top two headlines read:
I only got to read the hover-over teaser paragraphs before:

   a) I realized it was no longer like any other Monday morning and
   b) my son informed me it was time to go.

Read the rest (and best) of this post here.

Do you have additional insight into, or were you impacted by any of these outages? Comment below.


Tuesday, August 7, 2012

Can your website handle "instant celebrity"?

Ok, so I feel a touch voyeuristic even admitting this, but while I was checking on the latest from the Olympics I followed a link under Latest News -> Michael Phelps with a tag line of "What do you do after becoming the most accomplished Olympian in history? Date a model."

It was a tasteful piece about Michael bringing his (until now) "under the radar" girlfriend, Megan Rossee, to some public event. Having (apparently like a lot of people) never heard of her, I clicked on the link for her website (www.meganrossee.com) in the article. What I got for my curiosity was *far* better than a bunch of portfolio photos of a model. I got the following:



Friday, April 6, 2012

Desperately Seeking "Performance Unit Testing" Examples

I've been talking about what I term "Performance Unit Testing" in classes and training courses for a long time. I've been teaching (more inspiring, with hints toward implementation) client development teams about it for almost as long. The problem is, all I've got are stories that I can't attribute (NDAs and such), and that simply doesn't cut it when trying to make a point to someone who doesn't (or doesn't want to) get it.

So I'm looking for attributable samples, examples, stories, and/or case studies related to "Performance Unit Testing" that I can use (clearly, with attribution) in talks, training classes, maybe even blogs & articles. If you have something, please email me.

If you're not sure whether you've got what I'm looking for, lemme share some of the attributes I'm hoping to find:

Monday, December 5, 2011

10 Must-Know Tips for Performance Test Automation

More than other automation, bad performance test automation leads to:
  • Undetectably incorrect results
  • “Good” release decisions, based on bad data
  • Surprising, catastrophic failures in production
  • Incorrect hardware purchases
  • Extended down-time
  • Significant media coverage and brand erosion
More than other automation, performance test automation demands:
  • Clear objectives (not pass/fail requirements)
  • Valid application usage models
  • Detailed knowledge of the system and the business
  • External test monitoring
  • Cross-team collaboration
Unfortunately, bad performance test automation is:
  • Very easy to create,
  • Difficult to detect, and
  • More difficult to correct.
The following 10 tips, based on my own experiences, will help you avoid creating bad performance test automation in the first place.

Tip #10: Data Design
  • *Lots* of test data is essential (at least 3 sets per user to be simulated – 10 is not uncommon)
  • Test data should be unique and minimally overlapping (updating the same row in a database 1000 times has a different performance profile than updating 1000 different rows)
  • Consider changed/consumed data (a search will return different results, and an item to be purchased may be out of stock without careful planning)
  • Don’t share your data environment (see above)
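To make the data-design tips concrete, here's a minimal sketch (not from the original post; the function name, field names, and file name are all illustrative) of generating unique, minimally overlapping test data for simulated users, written to a CSV of the sort most load tools can consume:

```python
import csv
import itertools

def generate_user_datasets(num_users, sets_per_user=3):
    """Generate unique account IDs and search terms for each simulated
    user, so no two virtual users update the same row or repeat the
    same (likely cached) search during a load test."""
    counter = itertools.count(1)
    rows = []
    for user in range(num_users):
        for _ in range(sets_per_user):
            n = next(counter)
            rows.append({
                "virtual_user": user,
                "account_id": f"ACCT-{n:06d}",  # unique row per iteration
                "search_term": f"widget {n}",   # avoids cache-hit skew
            })
    return rows

# 100 virtual users x 3 data sets each = 300 unique rows
rows = generate_user_datasets(num_users=100, sets_per_user=3)

with open("loadtest_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["virtual_user", "account_id", "search_term"])
    writer.writeheader()
    writer.writerows(rows)
```

In practice you'd generate one such file per load generator (and refresh consumed data between runs), which also keeps you from sharing a data environment across tests.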

Monday, May 5, 2008

Identity crisis or delusions of grandeur?

In this month's installment of "Peak Performance" I discuss the frequently erroneous and often grandiose titles software testers have on their business cards or in their e-mail SIGs. Identity crisis or delusions of grandeur? 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."