By James Ohia
Introduction
Performance testing is a crucial step in the software development lifecycle, ensuring that the application or system can handle the expected load and meet the performance requirements. However, the success of performance testing largely depends on the ability to communicate the test results and their implications to stakeholders effectively. This is where a well-crafted performance test summary report comes in.
In this article, I will discuss the best practices for creating a performance test summary report and the key components that should be included in it. I will also include a full example summary report based on a performance test I performed. You can find demo performance tests here
Best Practices
- Keep it concise and easy to understand: The report should be readable even for non-technical stakeholders. Avoid technical jargon and provide clear explanations for any technical terms that are necessary.
- Provide a clear summary: Start the report with a brief summary of the test objectives, methodology, and key findings. This will give the stakeholders a quick overview of the report and help them understand the context of the test.
- Include details of the test environment: It is important to include details of the test environment, such as hardware and software configurations, network setup, and any other relevant details. This will help stakeholders understand the conditions under which the test was conducted and put the results in context.
- Present the test results effectively: The test results should be presented in a clear and concise manner, using tables, graphs, or charts where appropriate. The report should include both quantitative and qualitative results, such as response times, throughput, error rates, and system stability.
- Provide clear conclusions and recommendations: Based on the test results, provide clear conclusions and recommendations for the stakeholders. This will help them understand the implications of the test and make informed decisions about whether to deploy the application or system.
Key Components
- Test Objectives: Clearly state the test objectives and the performance requirements that the application or system needs to meet.
- Test Methodology: Describe the methodology used for the performance test, including the test scenarios, the load generation strategy, and the metrics used to measure performance.
- Test Environment: Provide details of the test environment, including hardware and software configurations, network setup, and any other relevant details.
- Test Results: Present the test results in a clear and concise manner, using tables, graphs, or charts where appropriate. Include both quantitative and qualitative results, such as response times, throughput, error rates, and system stability.
- Findings and Analysis: Describe the key findings and analysis of the test results, including any bottlenecks, performance issues, or areas for improvement.
- Conclusions and Recommendations: Based on the test results, provide clear conclusions and recommendations for the stakeholders, including whether the application or system is ready for deployment, and if not, what needs to be done to improve performance.
The rest of this article showcases a full performance test report to illustrate how one is written.
Performance Test Summary Report
Summary
Performance testing was done using the Apache JMeter performance tool. We focused on observing metrics such as the response time of each endpoint, the error percentage, the types of errors that occurred during testing, and CPU, memory, disk, and network utilization. The test was executed against https://jsonplaceholder.typicode.com/users using a test plan in which the minimum number of users served as the baseline, and this number was increased by 100% by the end of the test.
The test was executed in 5 cycles; each subsequent cycle added 25% more users, with the same 10-minute ramp-up time. The results were consistent across all cycles: every cycle passed with a 100% pass rate, and most average response times were in the range of 500 ms and below. Memory was unaffected by the load, staying at 38% in all cycles, while the CPU spiked to 78% on the highest cycle run.
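As a rough illustration of how figures like the average response time and error percentage in this summary can be derived, the Python sketch below aggregates a JMeter CSV results file (.jtl) per request label. The column names (label, elapsed, success) are JMeter's default CSV output fields, and the file name in the example is hypothetical; adjust both to your own setup.

```python
import csv
from collections import defaultdict

def summarize_jtl(path):
    """Summarize a JMeter CSV results file (.jtl) per request label.

    Assumes JMeter's default CSV columns: 'label', 'elapsed' (ms), and
    'success' ("true"/"false"). Adjust the column names if your
    results configuration differs.
    """
    stats = defaultdict(lambda: {"samples": 0, "errors": 0, "elapsed_total": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            s = stats[row["label"]]
            s["samples"] += 1
            s["elapsed_total"] += int(row["elapsed"])
            if row["success"].lower() != "true":
                s["errors"] += 1

    for label, s in stats.items():
        avg_ms = s["elapsed_total"] / s["samples"]
        error_pct = 100 * s["errors"] / s["samples"]
        print(f"{label}: {s['samples']} samples, "
              f"avg {avg_ms:.0f} ms, {error_pct:.2f}% errors")

# Example: summarize_jtl("cycle1_results.jtl")  # hypothetical file name
```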
Environment
The test was conducted on the following IP address and port:

| IP Address   | Port |
| ------------ | ---- |
| 172.10.10.10 | 4444 |
Services/Test Plan
Test Result Analysis
The following graph shows the application performance index (Apdex) along with statistics on executions, response times, and throughput for the 10-minute ramp-up time.
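The application performance index is a single score between 0 and 1 derived from response times. A minimal sketch of the calculation is below; the 500 ms and 1500 ms thresholds match what JMeter's dashboard report uses by default, but your setup may use different values, and the sample response times are hypothetical.

```python
def apdex(elapsed_ms, satisfied_ms=500, tolerated_ms=1500):
    """Application Performance Index for a list of response times (ms).

    Apdex = (satisfied + tolerating / 2) / total samples, where a sample is
    'satisfied' at or below satisfied_ms and 'tolerating' between
    satisfied_ms and tolerated_ms.
    """
    satisfied = sum(1 for t in elapsed_ms if t <= satisfied_ms)
    tolerating = sum(1 for t in elapsed_ms if satisfied_ms < t <= tolerated_ms)
    return (satisfied + tolerating / 2) / len(elapsed_ms)

# Example with a few hypothetical response times:
print(apdex([320, 480, 750, 1200, 2100]))  # -> 0.6
```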
Test Server Cycle 1 run
This run involves the minimum number of users acting as the baseline, which is 2000 concurrent users.
The table shows the number of concurrent virtual users sending requests, the response times, and the error percentage across all services called. The result shows that the run was successful with a 100% pass rate and a good average response time.
Test Server Cycle 2 run
This run involves a 25% increase in the number of concurrent users over the cycle 1 run, which is 2500 users.
The result shows that the run was successful with a 100% pass rate and a good average response time.
Test Server Cycle 3 run
This run involves a 50% increase in the number of concurrent users over the cycle 1 run, which is 3000 users.
The result shows that the run was successful with a 100% pass rate and a good average response time.
Test Server Cycle 4 run
This run involves a 75% increase in the number of concurrent users over the cycle 1 run, which is 3500 users.
The result shows that the run was successful with a 100% pass rate and a good average response time.
Test Server Cycle 5 run
This run involves a 100% increase in the number of concurrent users over the cycle 1 run, which is 4000 users.
The result shows that the run was successful with a 100% pass rate and a good average response time. The results are consistently good across all the cycles.
The graph shows that response times stayed stable as the number of concurrent users increased. The majority of transactions fell between 501 ms and 1500 ms.
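For reference, the cycle sizes above follow directly from the 25% increments on the 2000-user baseline, and the response-time buckets in the graph can be reproduced from raw sample times. The snippet below is a minimal sketch of both; the bucket boundaries simply mirror the 500 ms / 1500 ms ranges quoted in this report.

```python
# Cycle sizes: baseline plus 25% increments of the baseline per cycle.
baseline_users = 2000
cycles = [int(baseline_users * (1 + 0.25 * i)) for i in range(5)]
print(cycles)  # [2000, 2500, 3000, 3500, 4000]

def bucket_response_times(elapsed_ms):
    """Group response times (ms) into the buckets used in this report."""
    buckets = {"<= 500 ms": 0, "501-1500 ms": 0, "> 1500 ms": 0}
    for t in elapsed_ms:
        if t <= 500:
            buckets["<= 500 ms"] += 1
        elif t <= 1500:
            buckets["501-1500 ms"] += 1
        else:
            buckets["> 1500 ms"] += 1
    return buckets
```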
Errors Found
The test passed with no server errors.
CPU and Memory Utilization
Cycle 1
Here the memory stayed at 38%, unaffected by the load, while the CPU spiked to 55% during this run.
Cycle 2
Here the memory stayed at 38%, unaffected by the load, while the CPU spiked to 64% during this run.
Cycle 3
Here the memory stayed at 37%, unaffected by the load, while the CPU spiked to 53% during this run.
Cycle 4
Here the memory stayed at 39%, unaffected by the load, while the CPU spiked to 78% during this run. This is the highest CPU spike recorded during the performance test.
Cycle 5
Here the memory stayed at 38%, unaffected by the load, while the CPU spiked to 55% during this run.
Disk and Network Utilization
Cycle 1
Here the disk utilization is at 5%, below the 10% mark, which is good. The network utilization is at 7.2 Mb/s.
Cycle 2
Here the disk utilization is at 8%, below the 10% mark, which is good. The network utilization is at 7.5 Mb/s.
Cycle 3
Here the disk utilization is at 2%, below the 10% mark, which is good. The network utilization is at 11 Mb/s.
Cycle 4
Here the disk utilization is at 4%, below the 10% mark, which is good. The network utilization is at 11 Mb/s.
Cycle 5
Here the disk utilization is at 8.5%, below the 10% mark, which is good. The network utilization is at 12 Mb/s, the peak for the entire test.
Conclusion
The performance testing of the https://jsonplaceholder.typicode.com/users environment can be considered a success. Every cycle achieved a 100% pass rate, from the minimum number of concurrent users upward. Nearly all transactions completed in under 1501 milliseconds. Memory was unaffected by the load, staying constant at around 38%. The CPU spike did not exceed 80% even at double the minimum number of concurrent users, and disk usage remained below 10%. This is considered a good record.
Recommendation
The performance of server 172.27.14.21 for the HRMS application is considered a success, and the application can be recommended to go live for the planned number of concurrent users.
Finally (Not Part of the Report)
Creating a performance test summary report is a critical step in the performance testing process. A well-crafted report can help stakeholders understand the implications of the test results and make informed decisions about whether to deploy the application or system. By following the best practices outlined in this article and including the key components discussed, you can create a performance test summary report that effectively communicates the results and their implications.
James Ohia contributed this piece to TechBooky. Contact him on
LinkedIn – https://www.linkedin.com/in/james-ohia/
GitHub – https://github.com/JamesOhia/