Friday, April 6, 2012

Desperately Seeking "Performance Unit Testing" Examples

I've been talking about what I term "Performance Unit Testing" in classes and training courses for a long time. I've been teaching (or, more accurately, inspiring with hints toward implementation) client development teams about it for almost as long. Problem is, all I've got are stories that I can't attribute (NDAs and such), and that simply doesn't cut it when trying to make a point to someone who doesn't get it (or doesn't want to).

So I'm looking for attributable samples, examples, stories, and/or case studies related to "Performance Unit Testing" that I can use (clearly, with attribution) in talks, training classes, maybe even blogs & articles. If you have something, please email me.

If you're not sure whether you've got what I'm looking for, lemme share some desired attributes:

  • Stuff that developers do, in line with writing code (like writing unit tests, pairing, peer review, etc.), for the purpose of validating, tracking, and/or testing performance.
  • Some folks think of it as "code profiling plus"... but I'm only interested in the stuff that's done without "for-pay" tools and that isn't "button-click auto-magic" in the IDE or compiler (and don't call me old-school for talking about compilers; I work with some hard-core embedded folks who still work in C and Ada!)
  • It doesn't have to be "line of code" level, I'm interested in Object/Component-level stuff too.
  • I'm *not* talking about UI stuff.
  • I'm not interested in anything that accounts for more than *maybe* 2% of a developer's time (the testing part, that is; issue resolution and/or tuning don't count against that 2%).
  • When I say performance, I mean stuff related to things like responsiveness, resource utilization, size/volume, and concurrency.
  • The example I use in class is "Capture the execution time of the object in your unit testing framework. Save that time off to a log or .csv. Once a week, open up the file in Excel & draw a trend line. If that trend line does something unexpected, that's probably a fair indicator that you might want to figure out what change in that object caused it. The same logic can be applied to CPU, memory, I/O, etc." (There's a rough sketch of this right after the list.)
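
To make that last bullet a bit more concrete, here's a rough sketch of what I mean, in JUnit. The test class, the buildWidgetIndex() stand-in for the real object under test, and the perf-log.csv file name are all placeholders of my own choosing, so treat this as a sketch of the pattern rather than a recipe:

    import org.junit.Test;

    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;
    import java.time.Instant;
    import java.util.ArrayList;
    import java.util.List;

    public class WidgetPerfTest {

        // Stand-in for whatever object your existing unit tests already exercise.
        private List<Integer> buildWidgetIndex() {
            List<Integer> index = new ArrayList<>();
            for (int i = 0; i < 100_000; i++) {
                index.add(i * i);
            }
            return index;
        }

        @Test
        public void logExecutionTimeTrend() throws IOException {
            long start = System.nanoTime();

            buildWidgetIndex();   // the operation whose performance trend we want to watch

            long elapsedMillis = (System.nanoTime() - start) / 1_000_000;

            // Append one row per test run: timestamp, operation name, elapsed milliseconds.
            // Open perf-log.csv in Excel once a week and chart the trend line.
            try (PrintWriter out = new PrintWriter(new FileWriter("perf-log.csv", true))) {
                out.printf("%s,%s,%d%n", Instant.now(), "buildWidgetIndex", elapsedMillis);
            }
        }
    }

Appending to a CSV instead of asserting a hard threshold is deliberate: the point is to spot trends over time, not to fail the build on a single noisy measurement.
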
Please do forward this to anyone you think has something they'd be allowed to share with me. And please don't hesitate to ask questions if you're not sure what I'm on about (I started out planning to write only the first two paragraphs of this, then realized they might only make sense to the folks I've already approached and whose stories I can't share... so I added the bullets... and now I'm not entirely sure the concept comes across well in text...)

Thanks in advance!
 
--
Scott Barber
Chief Technologist, PerfTestPlus, Inc.
Director, Computer Measurement Group
About.me

Co-Author, Performance Testing Guidance for Web Applications
Author, Web Load Testing for Dummies
Contributing Author, Beautiful Testing, and How To Reduce the Cost of Testing

"If you can see it in your mind...
     you will find it in your life."

4 comments:

Alberto Savoia said...

Scott, in my chapter "Beautiful Tests" (in the book "Beautiful Code") I have a section on using JUnit for performance testing.

Here's the link: https://docs.google.com/open?id=0B0QztbuDlKs_ODQ2OWFkYTktZGFjYy00OTAyLWIxNzUtMGJhOWNkMDM1NTQw

Unknown said...

Brilliant! Thanks Alberto!

Santosh Arakere Marigowda said...

This is exactly what I am trying to achieve: automated unit-level performance tests that run every night and post the graphs to a centralised place. I use JMeter for the unit-level tests, and R, awk, plyr, reshape and ggplot2 to parse, analyse and plot the various graphs (all completely automated). I have implemented this framework successfully in a project. The graphs show both history and current test results.

QA Thought Leaders said...

Yet another excellent post on Performance Unit Testing. Thank you for sharing this post. How about sharing a post with your expertise on performance tuning or performance testing methodologies? Looking forward to your next post.