5 Lessons Learned on Performance Testing

“Quality can be perceived as a simple repetitive task or as a deep inquiry into the main objective of the functionality.”

About me

A few years ago, I worked on a “Performance Tuning” project immediately following a re-platforming project. My first approach was to consider the technical challenges of scripting and of the cluster implementation needed to run my tests. Partway through the project, we realised that the customer had not considered the differences between frontend and backend performance; their real objective was simply to improve the user experience.

Frontend performance depends on CSS, images, HTML, JavaScript functions and frontend requests to third parties. Backend performance is related to the backend code, the infrastructure, the backend integrations, the database and the application logic. Both impact the user experience, but they commonly require different tools to measure and test, and they are frequently implemented by separate teams or specialists.

We divided the work into two phases, frontend improvements and backend enhancements, in order to improve the user's perception of performance in a heuristic way: tackling only the easiest-to-fix frontend issues, combined with the initial plan for backend tuning. This required a change of plans in the middle of the project, resulting in extra work that was not originally planned.

Here are some insights and key concepts to consider when planning your strategy and assessment.

1. Avoid working on technical tasks before defining your strategy

The most important element of a successful project is a common understanding of the requirements. Why? One of the four value statements of the Agile Manifesto is “Working software over comprehensive documentation”; the fact that the physical product (software) and the documentation appear in the same sentence shows how the quality of the requirements impacts the workability of the product. This understanding is commonly achieved in an incremental manner, and sometimes is not fully complete until the end of the project. Even with the collaboration of expert customers during the analysis phase, there are several situations where a requirement will pass through a refinement process: developer feedback based on consultancy experience, a customer review that finds differences in terminology, screen layout or interactions, or edge cases found by the development team during implementation and testing. The iterative agile practice, with its repeated cycles, keeps this requirement tuning simple, modular and straightforward.

Gather the reasons why the project was initiated and the objectives it is meant to achieve before worrying about the tools or the approach of your strategy.

2. Be realistic with your required skill set

The budget for performance projects should cover the modeling phase, scripting, documentation, configuration of the testing environment, a performance cluster and/or third-party service, monitoring of the system under test, execution, results analysis, system tuning and reporting. These tasks can be performed by differently skilled specialists: a performance tester who creates the performance scripts and designs the user journeys; a tester or QA who runs the tests and reports results to the team; a DevOps engineer who provides the required tooling, the service or HTTP mocking, and the infrastructure or server monitoring; and one or more developers who perform the required improvements related to code issues. Some examples of code performance enhancement are improving the algorithms used, reviewing whether better libraries exist, reducing memory allocation, moving to better data structures, increasing concurrency and analysing memory management for optimisation.
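As a hypothetical illustration of one such code enhancement (moving to a better data structure), consider replacing a repeated linear scan of a list with a hash-based set lookup. The function names and data are invented for this sketch:

```python
# Find orders that have not shipped yet.

def find_missing_list(orders, shipped):
    # O(n*m): every membership check scans the whole `shipped` list
    return [o for o in orders if o not in shipped]

def find_missing_set(orders, shipped):
    # O(n): build a set once, then each lookup is O(1) on average
    shipped_set = set(shipped)
    return [o for o in orders if o not in shipped_set]

orders = list(range(2000))
shipped = list(range(0, 2000, 2))  # even order ids have shipped

slow = find_missing_list(orders, shipped)
fast = find_missing_set(orders, shipped)
assert slow == fast  # same result, very different cost as data grows
```

The behaviour is identical; only the algorithmic cost changes, which is exactly the kind of improvement a profiler run tends to surface.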

The success of a team depends on having the right mix of skilled people, and not all developers or testers might have the required competencies.

A performance team should have a good understanding of performance testing concepts such as concurrency, latency, throughput, scalability, code profiling, scripting for performance tools, monitoring, system architecture, SQL and stored procedures.

3. Define your own requirements

Some requirements you might consider before starting are:

Inputs:

  • Analytics information and/or non-functional requirements.
  • A performance test environment.
  • A budget reference to evaluate a service or an in-house performance cluster.
  • Input data generation on the system: users, VPN accounts, products with stock, database access.
  • Mocked third-party services.
  • A monitoring system for your application under test to identify bottlenecks, errors, traffic issues and slow-running queries.
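A mocked third-party service can be as small as a few lines. This is a minimal sketch using only the standard library; the “stock” endpoint and its response shape are invented for illustration:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class MockStockHandler(BaseHTTPRequestHandler):
    """Always returns a canned stock response, so load tests neither
    hammer nor depend on the real third-party API."""

    def do_GET(self):
        body = json.dumps({"sku": self.path.strip("/"), "stock": 42}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the console quiet during load runs

server = HTTPServer(("127.0.0.1", 0), MockStockHandler)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
print(f"mock stock service on port {server.server_port}")
```

In practice you would add configurable latency or error rates to the mock, so the system under test can also be exercised against slow or failing dependencies.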

4. Refrain from experimenting with new tools on production projects

Besides the ethical dilemma of using an actual project to learn new skills instead of applying your experience (which is what the customer paid for), there is another drawback to learning a new tool on a live project: you might get stuck on an unknown gap in the technology, forcing you to redefine your work and costing you time.

Please note that this does not mean you should use the same tools forever. My recommendation is to take some time to spike your selected framework before making the decision. Spikes are not estimated in an agile project for a reason: they represent risk. They correspond to un-estimated, ambiguous work whose purpose is to explore new tools or new versions with less risk.

5. Third parties matter

The interactions and communication bottlenecks between your system and third parties cannot be excluded when running load tests. Commonly, when verifying the behaviour of a system under test, we assume we need not consider the response times or SLAs of third parties. That can be a mistake: if your process depends on a response, and you have set a large timeout or a resilience strategy that makes more than one attempt when a service is slow or unresponsive, the effect on a system under peak load can be huge.

Never presume that a system is going to respond as fast as it usually does.

Time spent waiting for a response in a background process blocks that worker, and, depending on the configuration of your thread pool, the same thread may be shared across multiple jobs. The same applies to any HTTP call you include in your tests to fetch dependent information: its latency will spread across your threads.
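The worst-case blocking time is easy to bound with back-of-the-envelope arithmetic. This sketch is illustrative (the function names and the example numbers are mine, not from any specific system):

```python
def worst_case_wait(timeout_s, attempts, backoff_s=0.0):
    """Upper bound on how long one worker can be blocked by a single
    unresponsive dependency: every attempt waits out the full timeout,
    plus any backoff between attempts."""
    return attempts * timeout_s + (attempts - 1) * backoff_s

# A 5 s timeout with 3 attempts and 2 s backoff can block a worker
# for up to 3*5 + 2*2 = 19 seconds per call, not 5.
blocked = worst_case_wait(timeout_s=5, attempts=3, backoff_s=2)

def busy_workers(arrival_rate_rps, blocked_seconds):
    """Little's law (L = lambda * W): the average number of workers tied
    up when requests arrive at `arrival_rate_rps` and each one blocks
    for `blocked_seconds`."""
    return arrival_rate_rps * blocked_seconds
```

At just 10 requests per second, 19 seconds of blocking ties up around 190 workers, which is how a slow third party quietly exhausts a thread pool under peak load.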

A good analysis of your test results in a monitoring system could lead to a circuit breaker implementation for specific processes.
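As a rough sketch of what such an implementation might look like (a minimal in-process version, not the API of any particular resilience library):

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: after `max_failures` consecutive
    failures the circuit opens and calls fail fast for `reset_after`
    seconds, instead of every request waiting out a full timeout."""

    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, func, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None  # half-open: allow one trial call
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0  # any success closes the circuit again
        return result
```

Production-grade implementations add half-open probing policies and per-endpoint state, but the monitoring data from your load tests is what tells you which processes deserve one.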

In conclusion, as with any other activity within the QA practice, there should be a standard process to follow, adapted to each new project. The tools, requirements and tasks should be known by the full team, from Sales and Analysis to the QA, Development and Support teams; based on such a performance project template, everyone will have a clear understanding of the required tasks and the expected results.

About Tacit Knowledge

Tacit Knowledge develops, integrates, and supports enterprise software for household-name brands and retailers. Together with our parent, Pitney Bowes, we offer solutions across all consumer touchpoints, from device to doorstep, connecting the entire ecosystem of our customers' digital commerce business. We build and support the ecommerce website and associated technologies, through fulfillment (Pick & Pack, Kitting) to Delivery, Returns, and Post-purchase customer engagement tools.
Minia Oseguera