Best Practices & Tips for Performance Testing Life Cycle
Performance testing is a form of software testing that focuses on how a system performs under a particular load. It is not about finding software bugs or defects; instead, performance is measured against benchmarks and standards. Performance testing should give developers the diagnostic information they need to eliminate bottlenecks.
Test Planning:
A test plan is one of the crucial elements of performance testing: it enables a smooth transition between all performance testing activities throughout the project life cycle. Deviation from the test plan can lead to conflicts over deadlines and deliverables, so it is important to have an effective test plan in place.
- Provide a test schedule for smoke tests and baseline/benchmark tests in the test document
- Clearly flag all statements that are derived from assumptions
- Get the test plan reviewed by senior management and approved by the client before proceeding to testing
- Set client expectations early on to avoid any confusion
Test Design & Scripts Development:
Want to reduce the scripting effort? Here are a few best practices:
- Acquire user account details with exactly the same permission level as the end users; testing with admin accounts, or accounts with additional features, may cause problems when validating the scripts against live accounts
- Always keep a copy of the initial/raw version of each script to refer back to whenever needed
- Correlate all values that appear to be dynamic, such as Unix timestamps
- Parameterize all user input data in the flow; a .csv file format is recommended
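The last two practices can be sketched in plain Python. This is a minimal illustration, not tool-specific code: the CSV content, the login fields and the `session_id` pattern are all hypothetical stand-ins for whatever your application actually returns.

```python
import csv
import io
import re

# Hypothetical users.csv content: one row of test data per virtual user.
USERS_CSV = "username,password\nuser1,pass1\nuser2,pass2\n"

def load_test_data(csv_text):
    """Parameterize user input: each virtual user reads its own CSV row."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def correlate_session_id(response_body):
    """Correlate a dynamic value: extract the server-issued session id
    from the live response instead of replaying the one recorded at
    script-creation time."""
    match = re.search(r"session_id=([A-Za-z0-9]+)", response_body)
    return match.group(1) if match else None

users = load_test_data(USERS_CSV)
payloads = [{"user": u["username"], "pwd": u["password"]} for u in users]

# Simulated server response containing a dynamic value (assumption).
session = correlate_session_id("ok; session_id=a1B2c3; ts=1712345678")
```

In JMeter the same ideas map to the CSV Data Set Config element and a post-processor extractor; in LoadRunner, to parameter files and correlation functions.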
Tests Execution:
Now that we have seen how to develop a test plan and design efficient scripts, it is time to execute them.
- Gather data requirements in advance and request the expected number of accounts from the client
- Ensure sufficient privileges for the validated user accounts
- Use random think time in the scripts to emulate realistic end-user behaviour
- Disable logging during load tests to limit disk writes on the load generators
- Generate load from load generator machines rather than the controller/master whenever possible, as the controller collects results from the load generators and renders run-time data during the test
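Random think time deserves a quick illustration: a fixed delay makes every virtual user hit the server in lockstep, while a randomized pause spreads requests out the way real users do. A minimal sketch (the 2–8 second bounds are an assumed example, not a recommendation):

```python
import random
import time

def think_time(min_s=2.0, max_s=8.0):
    """Pause for a random interval to emulate a real user reading a page,
    rather than a fixed delay that synchronizes all virtual users."""
    pause = random.uniform(min_s, max_s)
    time.sleep(pause)
    return pause

# Short bounds here only so the sketch runs quickly.
p = think_time(0.01, 0.02)
```

JMeter's Uniform Random Timer and LoadRunner's lr_think_time (with randomized runtime settings) serve the same purpose.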
Final Delivery and Report Submission
Now it is time to generate reports based on the tests you have run and present them to the client.
- Save the final scripts in a designated folder, with an additional back-up (VSS, SVN or Google Drive)
- Organize the folder structure for all project artefacts as follows:
  - TestDocs holds the test plan document, user flow documents and final test reports
  - RawResults holds the raw results files (for example, .lrr for LoadRunner and .jtl for JMeter)
  - Reports contains the test reports for that particular test
- In load test reports, add legends to graphs for better readability
- Ensure that all graphs show data points starting from zero and that the scale corresponds to the data collected
Types of performance testing for software
To understand how software will perform on users' systems, there are different types of performance tests that can be applied during software testing. This is non-functional testing, which is designed to determine the readiness of a system. (Functional testing focuses on the individual functions of software.)
- Load testing
- Stress testing
- Spike testing
- Endurance testing
- Scalability testing
- Volume testing
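The difference between these test types is largely the shape of the load over time. A minimal sketch contrasting two of them — the user counts, durations and ramp times below are arbitrary example values:

```python
def load_profile(duration_min, steady_users, ramp_min):
    """Load test: ramp up linearly to the target, then hold steady.
    Returns the concurrent-user count for each minute of the test."""
    users = []
    for t in range(duration_min):
        if t < ramp_min:
            users.append(round(steady_users * (t + 1) / ramp_min))
        else:
            users.append(steady_users)
    return users

def spike_profile(duration_min, base_users, spike_users, spike_at):
    """Spike test: baseline load with a sudden one-minute surge."""
    return [spike_users if t == spike_at else base_users
            for t in range(duration_min)]

load = load_profile(10, 100, 5)        # ramp to 100 users over 5 min, hold
spike = spike_profile(10, 50, 500, 5)  # surge to 500 users at minute 5
```

An endurance (soak) test would simply hold the steady profile for hours instead of minutes; a stress test keeps ramping past the expected peak until the system degrades.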
Most Common Problems Observed in Performance Testing
During performance testing of software, developers look for performance symptoms and issues.
- Bottlenecking: occurs when data flow is interrupted or halted because there is not enough capacity to handle the workload.
- Poor scalability: if software cannot handle the desired number of concurrent tasks, results can be delayed, errors can increase, or other unexpected behaviour can occur, affecting:
- Disk usage
- CPU usage
- Memory leaks
- Operating system limitations
- Poor network configuration
- Software configuration issues: settings are often not set at a level sufficient to handle the workload.
- Insufficient hardware resources: performance testing may reveal physical memory constraints or low-performing CPUs.
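One of these symptoms, a memory leak, shows up as traced memory that keeps growing under a constant workload. A minimal sketch using Python's tracemalloc, with a deliberately leaky handler as a hypothetical stand-in for the system under test:

```python
import tracemalloc

# Hypothetical leaky component: keeps references it never releases.
_cache = []

def handle_request(payload):
    _cache.append(payload * 100)  # simulated leak: grows on every request
    return len(payload)

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()
for _ in range(1000):
    handle_request("x" * 100)
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

# Steady growth under a constant request rate hints at a leak; a healthy
# service would plateau once caches and pools reach their working size.
growth = after - before
```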