What Performance Requirements?


Next week at QUEST, I’m going to discuss many best practices to follow when performance testing. Today I’m going to talk about the importance of Performance Requirements.

“What performance requirements?” has become a running punchline at many, if not most, organizations. Admit it, you have probably said this yourself. Sadly, this has become so widespread that it is now expected and, worse yet, accepted at many organizations. Folks shrug their shoulders because they’re not surprised, and performance testers are left to test what they can and report the results. Whose fault is this? Is it the test manager’s fault? The development manager? The PM? The CIO? The answer is that it’s everyone’s fault, with performance testers most to blame. Why, you ask? Because we have allowed this to become acceptable behavior. It’s on us to educate the organization. We need to show them what a good performance requirement looks like and the value of good performance requirements. We need to be advocates. OK, stepping down from my soapbox.


So, what is a good performance requirement? A performance requirement should be quantifiable and define, at minimum, the context, expected throughput, response time, maximum error rate, and the amount of time the load must be sustained. Yes, that’s asking for a lot, but we need to set the standard.

Ex: Application logon must have a 90th percentile response time of no more than 3,000 milliseconds when executing five concurrent logons, must sustain that load for 15 minutes, and must keep average server resource utilization at or below 75 percent.
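
To show how directly a requirement like this translates into a pass/fail check, here is a minimal sketch in Python. The measured latencies, error count, and utilization figure are placeholders (real values would come from your test tool), and the 1 percent error-rate ceiling is an assumption since the example above doesn’t state one.

    import statistics

    # Placeholder results from a hypothetical 15-minute run with five concurrent logons.
    logon_times_ms = [2100, 2450, 1980, 2800, 3050, 2200, 2600, 2300, 2750, 2150]
    failed_logons = 0
    total_logons = len(logon_times_ms) + failed_logons
    avg_utilization_pct = 68.0  # averaged server resource utilization (placeholder)

    # The example requirement, expressed as hard numbers.
    P90_LIMIT_MS = 3000
    MAX_ERROR_RATE = 0.01          # assumed ceiling; not stated in the example text
    MAX_AVG_UTILIZATION_PCT = 75

    p90_ms = statistics.quantiles(logon_times_ms, n=10)[-1]   # 90th percentile
    error_rate = failed_logons / total_logons

    checks = {
        f"p90 logon time {p90_ms:.0f} ms <= {P90_LIMIT_MS} ms": p90_ms <= P90_LIMIT_MS,
        f"error rate {error_rate:.2%} <= {MAX_ERROR_RATE:.0%}": error_rate <= MAX_ERROR_RATE,
        f"avg utilization {avg_utilization_pct}% <= {MAX_AVG_UTILIZATION_PCT}%":
            avg_utilization_pct <= MAX_AVG_UTILIZATION_PCT,
    }

    for description, passed in checks.items():
        print(("PASS" if passed else "FAIL"), "-", description)

Because every threshold is a number, there is no debate after the test about whether the application “performed well enough”: either the checks pass or they don’t.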


While this may seem like a lot of detail, and may not be as concise as some of us are used to when defining software requirements, it is a good performance requirement because it gives the performance tester exactly what he or she needs to build a successful test scenario. Without that detail, the performance tester typically spends a lot of time with development and/or the business unit trying to work out what to test for. And when building scripts with automation tools, testers are far less likely to have to rework scenarios that were built on guesses about what the business needs. Here are some examples of poor performance requirements to avoid:

  • Unquantifiable: Must work faster than product x or must perform at least as well as the previous product.
  • Ambiguous: The application should load within an adequate timeframe.
  • Unrealistic: The application should load in less than 0.01 seconds at all times.
  • Unverifiable: Flexible, easy, sufficient, safe, ad hoc, adequate, user-friendly, usable, when required, if required, appropriate, fast, portable, lightweight, small, large, maximize, minimize, robust, quickly, easily, clearly... you get the idea.

By clearly defining the success criteria, you can complete your performance testing more quickly, more easily show the successes and challenges of application performance, have more useful and meaningful information for making key decisions, save the organization $$$$$, improve software quality, and end up with happier performance testers, business owners, and software development teams. You are also more likely to be able to predict future hardware needs, which helps build, justify, and deliver more accurate budgets. Now, if only we can do a better job of advocating and educating our organizations so they understand and realize these benefits, perhaps “What performance requirements?” will no longer be a punchline and will instead raise some red flags.
