
Friday, September 6, 2013

Just Another Manic Cyber Monday: Are you Ready?

Once September starts to roll around it seems like everyone’s preparing for something, be it returning to school, the fantasy football season, corporate budget planning, or looking for deals on end of model year vehicles. For me, it’s the time of year when I help people prepare for Cyber Monday, which has become the biggest online shopping day of the year.

So, is your website really ready to capitalize on all that buying fervor? Think about it. By September, your company is surely finalizing new products and marketing campaigns for the holiday season. But all those preparations will be for naught if your website isn't up to the challenge of increased holiday traffic – especially if your ops group doesn't have a system in place to monitor and react to the impact of that traffic in real time. The truth is, if your organization doesn't have a strategy in place by early September, you have a scant few weeks remaining to put one together. Once that window closes, you’re at serious risk of becoming ‘that company’ – you know, the one that makes headlines this holiday season for a massive site outage instead of record sales numbers – and the risk increases with every week you delay. If your company sells products that people want to give as gifts for the holidays, Cyber Monday is likely to be the busiest day of the year for your website.

Read the rest of this post here.
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

I don't produce software systems.
I help those who do produce software systems do it better.
I am a tester.

Saturday, August 24, 2013

Any Given Thursday – Digging into Nasdaq’s 3-Hour Outage

This has been an uncharacteristically bad week for web performance, with several major and historically reliable services reporting outages due to "network issues". In my (not always so humble) opinion:
"Insufficient available bandwidth causing an outage, however, bothers me. A lot. There is absolutely no good reason for insufficient bandwidth to cause an outage. Maybe a slowdown, but if a flood of network traffic (not a flood of traffic to your site, just a whole bunch of traffic on the same network as your site) leads to an outage, something is wrong, at least in my book."
Read the rest of Any Given Thursday
Read part 1 of my commentary in Any Given Monday

--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

I don't produce software systems.
I help those who do produce software systems do it better.
I am a tester.

Tuesday, August 20, 2013

Any Given Monday – Google, Microsoft and Amazon All Experience Outages

It started out like any other Monday morning. I woke up, got dressed, put my contacts in, and started making my way to the kitchen for coffee. Along the way, I launched a browser and the mail client on my laptop (as I always do on “home office” days) and checked to make sure my son was up. After making coffee, I had a few minutes before it was time to drive my 14-year-old to school, so I scanned the headlines in my newsfeed.

The top two headlines read:
I only got to read the hover-over teaser paragraphs before:

   a) I realized it was no longer like any other Monday morning and
   b) my son informed me it was time to go.

I am a link to the rest (and best) of this post

Do you have additional insight into, or were you impacted by any of these outages? Comment below.

--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

I don't produce software systems.
I help those who do produce software systems do it better.
I am a tester.

Wednesday, July 10, 2013

Possible Discounts for Conference Adjacent Engagements!

I am booked to speak at several international conferences during the remainder of this year where I have availability immediately before and/or after to conduct consulting/training for companies or groups local to the conference. Any cost savings I realize by extending my trips (as opposed to making completely separate trips) will be passed on to paying clients. I'm looking forward to working with the folks who take advantage of this rare opportunity.

Specifically, I am available to the "first signed" clients for the following dates in the following locations:

  • September 9-13 and/or September 18-20; in/around Prague, Czech Republic (before/after Agile Prague)
  • October 7-11; in/around Sydney, Australia (before iqnite)
  • October 22-25; in/around Waterloo, Canada (after Targeting Quality)
  • October 28-November 1 and/or November 11-15; in/around Malmö, Sweden (before/after Øredev)
If you are even mildly interested in engaging me during one of these blocks, please email me immediately. I expect them to fill quickly.

If you are not sure of what services I offer, you can check out the PerfTestPlus website, or take a look at my most commonly requested and (reportedly) valuable offerings below. There are, of course, other services I'd be happy to provide. If you don't see what you are looking for, please contact me and ask.

Tuesday, December 11, 2012

Lessons from NEXT2012 in Romania


I often see folks blogging about what they learned at an event, what inspired them, or what impressed them. It is far less often that I see a headliner or promoted presenter blog about the lessons they learned, or what inspired or impressed them, after the event. I've often wondered why that is.

For me, it has a lot to do with needing to quickly shift gears upon completing an event: to catch up on all the things that I put off to prepare for the event, to figure out what immediate stuff landed in my inbox while I was ignoring it, and to follow up on leads, lessons, inspirations and curiosities from the event itself.

Well, I'm going to make a concerted effort to do better about posting my lessons from events, starting with NEXT2012, hosted by SoftVision and held in Cluj-Napoca, Romania, Oct. 26-27.
So, what were my take-aways from NEXT2012?

  • I'm *really* excited about how I'm now organizing and packaging my performance-related materials (more on that in a separate post).
  • SoftVision did a fantastic job organizing and handling logistics.
  • I am seriously impressed with the people I interacted with on both a professional and technical level.
  • Those same people are social, collaborative, friendly and are able to enjoy their work and create enjoyable work environments while being professionally and technically impressive.
  • Romania (as well as several surrounding areas not widely considered "software/technical powerhouses") is an emerging market worth watching.

Monday, October 8, 2012

Training Performance Testers in Romania

Later this month, I'm headed to Cluj, Romania to deliver 2 days of performance testing related training: day 1 is conference style; day 2 is workshop style. I'm kind of excited about this, not *just* because I've never been to Romania, but because of the interaction I've had up to this point with the group organizing the event. The English version of the event webpage is http://conferinta.softvision.ro/en/

Monday, August 27, 2012

Top 10 Tips for Performance Test Tool Evaluation from STP Online Summit

I had the pleasure of hosting another Online Summit delivered by Software Test Professionals: Survey of Performance Testing Tools. The online summit format consists of 11 sessions over 3 consecutive days. The sessions for this summit were:
One of my duties as host was to try to summarize the most valuable nuggets of information from across all of the presentations into a "top tips" list. This is what I came up with:

Scott's Top 10 Tips for Performance Testing Tool Evaluation from:

Tuesday, August 7, 2012

Can your website handle "instant celebrity"?

Ok, so I feel a touch voyeuristic even admitting this, but while I was checking on the latest from the Olympics, I followed a link under Latest News -> Michael Phelps with a tag line of "What do you do after becoming the most accomplished Olympian in history? Date a model."

It was a tasteful piece about Michael bringing his (until now) "under the radar" girlfriend, Megan Rossee, to some public event. Having (apparently like a lot of people) never heard of her, I clicked on the link for her website (www.meganrossee.com) in the article. What I got for my curiosity was *far* better than a bunch of portfolio photos of a model. I got the following:

Friday, April 6, 2012

Desperately Seeking "Performance Unit Testing" Examples

I've been talking about what I term "Performance Unit Testing" in classes and training courses for a long time. I've been teaching client development teams about it (more accurately, inspiring them with hints toward implementation) for almost as long. The problem is, all I've got are stories that I can't attribute (NDAs and such), and that simply doesn't cut it when trying to make a point to someone who doesn't (or doesn't want to) get it.

So I'm looking for attributable samples, examples, stories, and/or case studies related to "Performance Unit Testing" that I can use (clearly, with attribution) in talks, training classes, maybe even blogs & articles. If you have something, please email me.
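In the meantime, to make the idea concrete, here's a minimal sketch of the kind of thing I mean: a developer-level test, runnable under pytest or called directly, that fails when a single function blows its agreed timing budget. It's in Python, and the lookup_account function and its 10ms budget are hypothetical stand-ins, not from any real project.

    import time

    def lookup_account(account_id):
        """Stand-in for the real code under test."""
        time.sleep(0.002)  # pretend this does a cache lookup
        return {"id": account_id}

    def test_lookup_account_stays_under_budget():
        BUDGET_SECONDS = 0.010  # the agreed per-call budget
        REPS = 100
        lookup_account(0)  # warm up once, so one-off init costs aren't timed
        timings = []
        for i in range(REPS):
            start = time.perf_counter()
            lookup_account(i)
            timings.append(time.perf_counter() - start)
        # Assert on the median so a single GC pause can't fail the build.
        median = sorted(timings)[REPS // 2]
        assert median < BUDGET_SECONDS, f"median {median:.4f}s over budget"

The point is that performance feedback arrives per-function, per-commit, instead of waiting for a full load test at the end of the cycle.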

If you're not sure if you've got what I'm looking for, lemme share some desired attributes of what I'm looking for:

Monday, April 2, 2012

Let's Test 2012

The first (as far as anyone I know is aware) Context-Driven conference in Europe is quickly approaching. On May 7-9, 2012 in Stockholm, Sweden, Let's Test "A European conference on context-driven testing - for testers, by testers" will take place.

This is a CAST-inspired conference, meaning that it focuses on in-depth exploration of topics, includes facilitated discussion as part of every talk (i.e. speakers don't get to "run out of time" as soon as they hear that "hard question"), and the conferring only increases between and after sessions. It's a fabulous format! If you haven't experienced it, and you are passionate about testing, you really want to -- it will change your perspective on conferences forever.

I am proud to say that I will not only be attending Let's Test 2012, but that I am honored to be on the program with some first-run content that I'm very excited about:

A Full Day Tutorial: Context Appropriate Performance Testing, from Simple to Rocket Science
A Keynote: Testing Missions in Context: From Checking to Assessment
 

Friday, March 16, 2012

Business Value of Testing: Find Bugs ≠ Mission

My introduction to software testing was as a performance tester. Before the completion of my first project, I had a firm understanding of the primary mission of performance testing. That understanding has changed very little to this day, though I have improved how I communicate that mission. Currently, I phrase the (generic) primary mission of performance testing as:
"To find how many, how fast, how much and for how long; to assist business w/ related technical and/or business decisions and/or; to support related improvement activities as prioritized by the business."
That certainly doesn't mean that all performance testing efforts include every aspect of that mission, but I'm hard pressed to imagine something I'd call performance testing that includes none of them. It is also true that I have experienced all manner of secondary missions as part of a performance testing effort (missions related to fail-over and to denial-of-service resilience, for example). Those combinations, permutations, variations and additions are all where context comes into play.

However, when I was first asked to help out with some functional testing, I quickly realized that I didn't really know what was expected of me, so I asked. As you might imagine, folks looked at me like I'd grown a second head... a green one... with scales and horns. After about the 4th time I asked the question, got a funny look, watched as it became clear I was serious, and received some variant of the answer:
"Find bugs"

Monday, March 12, 2012

Processing may take up to 60 seconds?!?

Seriously?!? This was a simple credit card transaction for a storage unit! Admittedly, it only took about 20 seconds, but that was still long enough for me to push the button, read the text, and exclaim "You've *GOT* to be *KIDDING* me!!"; for my 12-year-old son to ask "What?"; for me to respond "60 seconds to process a payment online"; for him to reply "That's stupid"; and for me to launch the snipping tool and grab a capture before it processed my payment.

Grrr.... Hey, I've got an idea: why don't they give me a unit free for the next 10 years in exchange for 25 hours of performance testing/tuning (which would *still* be less than my typical bill rate), so that other folks don't have to deal with this crap?

Sunday, December 25, 2011

Curse of the Performance Tester?

Seriously?!? After wrapping gifts until nearly 5am (I was behind even by my standards, due mostly to work travel, client commitments & preparing to close the corporate books for 2011), and getting up before 8am to celebrate Christmas with my boys, I finally stole a few minutes, when I noticed they'd both fallen asleep on the couch, to play with *my* new toy (i.e. install Skyrim), only to be foiled by...

Monday, December 5, 2011

10 Must-Know Tips for Performance Test Automation

More than other automation, bad performance test automation leads to:
  • Undetectably incorrect results
  • "Good" release decisions, based on bad data
  • Surprising, catastrophic failures in production
  • Incorrect hardware purchases
  • Extended down-time
  • Significant media coverage and brand erosion
More than other automation, performance test automation demands:
  • Clear objectives (not pass/fail requirements)
  • Valid application usage models (see the sketch below)
  • Detailed knowledge of the system and the business
  • External test monitoring
  • Cross-team collaboration
Unfortunately, bad performance test automation is:
  • Very easy to create,
  • Difficult to detect, and
  • More difficult to correct.
The following 10 tips, based on my own experiences, will help you avoid creating bad performance test automation in the first place.
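Before diving in, here's what one of those demands – a valid application usage model – can look like when written down. This is my illustration, not a real project's model; it's a minimal Python sketch, and every volume and percentage in it is hypothetical.

    PEAK_USERS = 500               # concurrent simulated users at peak
    PEAK_SESSIONS_PER_HOUR = 6000  # from production logs or business forecast

    # Transaction mix: the share of sessions performing each activity.
    usage_model = {
        "browse_catalog": 0.60,
        "search":         0.25,
        "checkout":       0.10,
        "account_admin":  0.05,
    }
    assert abs(sum(usage_model.values()) - 1.0) < 1e-9, "mix must total 100%"

    # For PEAK_USERS users to generate PEAK_SESSIONS_PER_HOUR sessions,
    # each user must start a new session every `pacing` seconds.
    pacing = 3600 * PEAK_USERS / PEAK_SESSIONS_PER_HOUR  # 300s here
    print(f"session pacing: {pacing:.0f}s per simulated user")

    for activity, share in usage_model.items():
        print(f"{activity}: {share * PEAK_SESSIONS_PER_HOUR:.0f} sessions/hour")

If the model doesn't match what production actually sees, every number the tests produce inherits that error – which is exactly the "undetectably incorrect results" problem above.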

Tip #10: Data Design
  • *Lots* of test data is essential (at least 3 sets per user to be simulated – 10 is not uncommon)
  • Test data should be unique and minimally overlapping (updating the same row in a database 1,000 times has a different performance profile than updating 1,000 different rows) – see the sketch below
  • Consider changed/consumed data (a search will return different results, and an item to be purchased may go out of stock without careful planning)
  • Don’t share your data environment (see above)
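As a concrete illustration of the uniqueness point above (mine, not from the original tip; the file name and columns are made up), here's one way to generate unique, minimally overlapping data in Python, giving each simulated user its own block of rows so no two users hammer the same record:

    import csv
    import random
    import string

    def make_user_datasets(num_users, sets_per_user=3, path="test_data.csv"):
        """Write one block of rows per simulated user; no shared records."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["user_id", "set_id", "account", "search_term"])
            for user in range(num_users):
                for s in range(sets_per_user):
                    # Embed the user and set ids so every row is unique.
                    account = f"acct-{user:05d}-{s}"
                    # Vary search terms so cached results don't mask real load.
                    term = "".join(random.choices(string.ascii_lowercase, k=8))
                    writer.writerow([user, s, account, term])

    make_user_datasets(num_users=1000)  # 3 sets each = 3,000 unique rows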

Thursday, October 27, 2011

WOPR 17, my takeaways


The Workshop On Performance and Reliability (WOPR) 17 was held Oct 20-22, 2011 on the theme of “Finding Bottlenecks”. This was a historic event in the sense that no other peer workshop inspired by LAWST has convened this many times. Of course, as a co-founder of WOPR, I’m (somewhat unreasonably) proud of this accomplishment. But the fact that over the last 9 years so many folks have been so inspired by the community and value of WOPR that they have been willing to volunteer their time to plan and organize these events, that their companies have been willing to donate meeting space (and often food & goodies), and that participants have frequently been willing to pay their own way (sometimes taking vacation time) to attend makes 17 events, one every 6 months since WOPR 1, a significant achievement – whether or not my “founder’s pride” is justified.  :)

As is the tradition of WOPR, 20-25 folks, selected or invited by the “content owner” (a.k.a. the person or team who chose the theme to be explored this time), brought their personal experiences related to “Finding Bottlenecks” to share and explore with one another. Also as is the tradition, certain patterns and commonalities emerged as these experiences were described and discussed. Everyone has their own take, there are no official findings, and I’m not even going to pretend that I can attribute all the contributing experiences and/or conversations to my takeaways below.

  • Finding bottlenecks can be technically challenging; examples include:
    • Analyzing the test & the data is far from straightforward
    • The “most useful” tools to narrow down the bottleneck may not be available – forcing us to be technically “creative” to work around those roadblocks
  • Finding bottlenecks can be *very* socio-politically challenging; examples include:
    • Lack of trust (e.g. “That’s not a bottleneck, that’s the tool!”)
    • Denial (e.g. “It’s not possible that’s related to my code!”)
    • Lack of cross-team collaboration (e.g. “No, you can’t install that monitor on *our* system!”)
  • Sometimes human bottlenecks need to be resolved before technical bottlenecks can be found (e.g. the perf team being redirected, resources being re-allocated, excessive micromanagement, etc.)
Some other relevant and interesting topics came up (such as the frequent discrepancy between tester/technical goals and business goals), but since these weren’t “on theme” we didn’t discuss them deeply enough for me to draw any conclusions other than “the points and positions that did come up were consistent with what I would have anticipated if I’d thought about it in advance”, which, for me, is a nice confirmation.

My point in sharing these thoughts on finding bottlenecks is to let all the folks out there who feel like theirs is the only organization thwarted by socio-political challenges even more than technical ones know that they really aren’t alone.

The findings of WOPR17 are the result of the collective effort of the workshop participants: AJ Alhait, Scott Barber, Goranka Bjedov, Jeremy Brown, Dan Downing, Craig Fuget, Dawn Haynes, Doug Hoffman, Paul Holland, Pam Holt, Ed King, Richard Leeke, Yury Makedonov, Emily Maslyn, Greg McNelly, John Meza, Blaine Morgan, Mimi Niemiller, Eric Proegler, Raymond Rivest, Bob Sklar, Roland Stens, and Nishi Uppal.
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Monday, October 17, 2011

Having lunch with a giant...

I "officially" started my career in software performance in Feb of 2000, only much later to realize I'd started down that path years prior.  In the fall of 2001 (10 years ago), I felt I was stagnating in my self-guided education and went on a hunt for books, articles, training, and/or people to learn from.  I found some peers (and eventually co-founded WOPR with Ross Collard to maximize peer learning) and I found 3 "giants" on whose shoulders I've stood since then (meaning, all of my work was and has remained consistent, complimentary, and/or extended from their work in the field).  Those "giants" are Connie Smith, Ph.D. (Software Performance Engineering), Daniel Menasce, Ph,D. (Capacity and Scalability Planning) and Alberto Savoia (Performance Testing).

Last fall, I had the honor of being on a panel with Connie and spending some time talking to Daniel during the CMG conference in Orlando.  I'd never spoken or corresponded with them before that, but it was nice to meet them and we had some great conversations.

Over the years, however, I have corresponded regularly with Alberto Savoia.  As it turns out, he was moving on from software performance to what he would now call his next "it" just as I was becoming known in the industry, so we didn't converse regularly, but we did follow each other's careers.  During that time, I drew a lot of inspiration from Alberto.  Not just from the work he'd done in the software performance space, but also from his other accomplishments in technology, from the kind and complimentary recommendations he gave me, and from his graciously agreeing to write a foreword for Performance Testing Guidance for Web Applications when I asked.

So earlier this year when I had the chance, I dropped everything to review and comment on his new "it", Pretotyping. He said the review was helpful and that some of what I'd commented on would be included in the next version.

Today, I finally met Alberto face to face.  We had lunch.  We talked about projects & passions old and new; we recalled history and speculated about the future.  He gave me a signed copy of Pretotype It, and I gave him a signed copy of Web Load Testing for Dummies, both of which had been prepared in advance.  And while Alberto has accomplished far more in his technology career than I have, somehow I didn't feel like I was having lunch with the "giant" on whose shoulders most of the work I am known for stands; I felt like I was having lunch with an old friend I hadn't seen in too long.

To some of you, I suspect this seems a silly thing for me to be making a big deal about.  But for a guy who left a small town twenty-some-odd years ago never imagining he'd meet anyone "famous", let alone become a "celebrity" of sorts in his (admittedly very small) field, it means a lot that someone I've often credited as a luminary would not only take the time to have lunch with me, but also share thoughts and ideas with me like friends do.

So, thanks Alberto.  Thanks for the years of inspiration & thanks for the confirmation of friendship.  It means a lot to me, and know that you've provided me with another example I intend to follow with anyone I may inspire during my career and later have the opportunity to meet.


--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Wednesday, October 5, 2011

Web Load Testing for Dummies: Book Announcement

"More so now than ever before, your company’s website and web applications are critical to the success of your business initiatives. Think of all the business generated or sustained via the World Wide Web today compared to any other time in history — in today’s digital culture, a business with any sort of crucial web presence needs to make sure that its website is working hard for the business and not against it. That’s what web load testing is all about.

"Key to success on the web is customer experience, which means that web application performance is a priority. Not convinced? Spend a few moments thinking about the impact to your business (in other words, think about how angry the CEO and/or investors would be) if:
  ✓ Your new application launch is delayed due to performance problems
  ✓ Your site breaks under the load of your successful marketing promotion
  ✓ High-traffic volume causes such poor web performance on your busiest online shopping day that abandonment skyrockets and conversions plummet
  ✓ Your new infrastructure is configured improperly, grinding the website to a crawl

"Managers and executives of organizations that derive significant portions of their revenue from web applications realize that they need to focus more on protecting revenue, reducing risk, and ensuring that customers have great experiences. They see how web applications that perform well on release day and throughout their production lives strengthen the company’s brand and reputation, creating customer loyalty. In other words, web load testing is a critical component to any risk management plan for web applications."

Get the eBook version free here.

--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Monday, October 3, 2011

An overview of Performance Testing for Agile/Lean teams

I'm going to be giving a short webinar on Oct 20 titled "An overview of Performance Testing for Agile/Lean teams" as part of a really cool recurring online mini-conference/webinar series called "Bathtub Conferences".  Check out the website for more information.

http://bit.ly/nWIvzk
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Stop Cheating and Start Running Realistic Tests

I did a webinar with SOASTA on 9/29/2011; in case you missed it, I've copied the description and links from SOASTA's Info Center so you can have a look.  If the twitter-verse is to be believed, it didn't suck.  :)

--

Stop Cheating and Start Running Realistic Tests

Constrained by inflexible test hardware, poor tool scalability, exorbitant pricing models, and lack of real time performance information, performance testers have been forced to cheat for too long! Cloud Testing opens up elastic, full-scale load generation from global locations at affordable cost, rapid and accurate test building, and real time views of internal and external performance metrics.
  • Stop removing “think times” to work around technical or license issues
  • Build tests using real business workflow, not just a flood of page hits
  • Run tests that preserve session states and accurate timings, end-to-end
  • Inspect every component as tests run, not just from the outside-in
Watch the Webinar | Download the Webinar
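To illustrate the first two bullets in script form, here's a minimal sketch of mine (not from the webinar) using the open-source Locust load testing tool; the URLs and task weights are hypothetical.

    from locust import HttpUser, task, between

    class Shopper(HttpUser):
        # Real users pause between actions; don't strip this out just to
        # squeeze more load out of fewer virtual users or licenses.
        wait_time = between(3, 10)  # seconds of "think time" per step

        @task(8)
        def browse(self):
            self.client.get("/catalog")

        @task(2)
        def purchase(self):
            # A business workflow, not a flood of hits on one page.
            self.client.get("/catalog/item/42")
            self.client.post("/cart", json={"item": 42, "qty": 1})
            self.client.post("/checkout")

Run it with "locust -f shopper.py --host=<your site>": the 8:2 task weighting approximates a browse-to-buy ratio, and the wait_time keeps each virtual user's pacing human.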
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

Thursday, September 29, 2011

Making Every Test Count

This is from a while back, but I wouldn't call it dated.  It's a webinar; it runs for 48 min.  I like it, for whatever that's worth.  ;)

Abstract:

Do you ever find yourself wondering what the point is of executing this test... again!?!  Have you ever felt like the purpose of a test is to ensure there is a check mark in a particular check box?  Are you ever asked to get *more* information in even less time, with even fewer resources, than on the last test project you worked on?

In this presentation, Scott Barber will introduce you to a variety of tips and techniques you can apply to virtually any testing you do, as you strive to make every test you execute add value to the project.


--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."