A Proper Approach to Performance Testing
PERFORMANCE TESTING SERVICES
360Logica’s performance services can help you understand what ‘acceptable performance’ thresholds mean in the context of your business. The performance characteristics of the software may be a key business differentiator. However, do you know what these thresholds are? Can you derive metrics to determine that the software meets them? Can you cost-effectively measure your chosen performance metrics? If you rely on third parties to implement the complete solution, what performance quality gates can you deploy to retain some degree of control? Even at the earliest stages, 360Logica can help you formulate strategies to meet the challenge of delivering high-performance software systems.
BUSINESS IMPLICATION
No matter what stage you are at in the lifecycle of your IT systems, 360Logica can provide performance services that optimize your systems’ performance.
Performance Testing Objectives
The objective of a performance test is to demonstrate that the system meets requirements
for transaction throughput and response times simultaneously.
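As a minimal sketch of this objective, the check below only passes when the throughput and response-time targets hold at the same time. All names and threshold values are illustrative assumptions, not figures from any real requirement:

```python
# Hypothetical thresholds -- real values come from the business requirements.
MAX_P95_RESPONSE_MS = 2000   # 95th-percentile response time ceiling
MIN_THROUGHPUT_TPS = 50      # minimum sustained transactions per second

def meets_performance_requirements(p95_response_ms: float, throughput_tps: float) -> bool:
    """Both conditions must hold simultaneously: a fast system under
    trivial load, or a slow system pushing high volume, both fail."""
    return (p95_response_ms <= MAX_P95_RESPONSE_MS
            and throughput_tps >= MIN_THROUGHPUT_TPS)

print(meets_performance_requirements(1500.0, 75.0))  # both targets met -> True
print(meets_performance_requirements(1500.0, 20.0))  # fast but low volume -> False
```

The point of the conjunction is that neither metric is meaningful in isolation: response times are only acceptable at the required transaction volume.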
The main deliverables from such a test, prior to execution, are automated test scripts and an
infrastructure to be used to execute automated tests for extended periods. This infrastructure
is an asset and an expensive one too, so it pays to make as much use of this infrastructure
as possible. Fortunately, this infrastructure is a test bed, which can be re-used for other tests
with broader objectives. A comprehensive test strategy would define a test infrastructure to
enable all these objectives to be met.
The performance testing goals are:
• Measure end-to-end transaction response times.
• Measure application server component performance under various loads.
• Measure database component performance under various loads.
• Monitor system resources under various loads.
• Measure the network delay between the server and clients.
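The first goal, end-to-end response time measurement, can be sketched with nothing more than a timer around the transaction. The transaction function here is a stand-in for a real client operation (an HTTP request, an RPC call, and so on); everything else is an illustrative assumption:

```python
import statistics
import time

def timed(transaction) -> float:
    """Run one end-to-end transaction and return its response time in ms."""
    start = time.perf_counter()
    transaction()
    return (time.perf_counter() - start) * 1000.0

def sample_transaction():
    sum(range(1000))  # placeholder work standing in for a real request

# Collect a sample of response times and summarize them.
samples = sorted(timed(sample_transaction) for _ in range(200))
p95 = samples[int(0.95 * len(samples)) - 1]  # 95th-percentile response time
print(f"min={samples[0]:.3f}ms  mean={statistics.mean(samples):.3f}ms  p95={p95:.3f}ms")
```

Percentiles (rather than averages alone) are worth reporting because a handful of slow transactions can hide behind an acceptable mean.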
Pre-Requisites for Performance Testing
We can identify four pre-requisites for a performance test. Not all of these need be in place
prior to planning or preparing the test (although this might be helpful), but rather, the list
defines what is required before a test can be executed.
Stable system
A test team attempting to construct a performance test of a system whose software is of poor
quality is unlikely to be successful. If the software crashes regularly, it will probably not
withstand the relatively minor stress of repeated use. Testers will not be able to record
scripts in the first instance, or may not be able to execute a test for a reasonable length of
time before the software, middleware or operating systems crash.
Realistic test environment
The test environment should ideally be the production environment or a close simulation
and be dedicated to the performance test team for the duration of the test. Often this is not
possible. However, for the results of the test to be realistic, the test environment should be
comparable to the actual production environment. Even with an environment which is
somewhat different from the production environment, it should still be possible to interpret
the results obtained using a model of the system to predict, with some confidence, the
behavior of the target environment. A test environment which bears no similarity to the actual
production environment may be useful for finding obscure errors in the code, but is, however,
useless for a performance test.
Performance testing toolkit
The execution of a performance test must be, by its nature, completely automated. However,
there are requirements for tools throughout the test process. Test tools are considered in
more detail later, but the four main tool requirements for our ‘Performance Testing Toolkit’ are
summarized here:
· Test database creation/maintenance
· Load generation tools
· Resource monitoring
· Results analysis and reporting.
Performance Requirements
Performance requirements normally comprise three components:
· Response time requirements
· Transaction volumes detailed in ‘Load Profiles’
· Database volumes
Types of Testing to be Performed:
· Under normal Load - performance testing
· Under anticipated future load - scalability testing
· Under highly abnormal peak load - stress testing
· Under uninterrupted, sustained load - reliability testing
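The four test types above differ mainly in the load level applied and the duration sustained. A simple sketch of that mapping, with purely illustrative numbers (the baseline user count and multipliers are assumptions, not recommendations):

```python
# Illustrative load profiles, expressed as multiples of normal production load.
BASELINE_USERS = 100  # hypothetical normal concurrent-user count

LOAD_PROFILES = {
    "performance": 1.0,   # normal load
    "scalability": 2.0,   # anticipated future load
    "stress":      5.0,   # highly abnormal peak load
    "reliability": 1.0,   # normal load, but sustained for hours
}

def users_for(test_type: str) -> int:
    """Concurrent virtual users to simulate for a given test type."""
    return int(BASELINE_USERS * LOAD_PROFILES[test_type])

for test_type in LOAD_PROFILES:
    print(f"{test_type:12s} -> {users_for(test_type)} virtual users")
```

Note that reliability testing reuses the normal load level; what changes is the uninterrupted duration, not the intensity.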
Performance Test Approach
Phase 1: Test Requirements & Identification Study
This activity is carried out during the business and technical requirements identification
phase. The objective is to understand the performance test requirements, Hardware &
Software components and Usage Model. It is important to understand as accurately and as
objectively as possible the nature of load that must be generated.
The following are the important performance test requirements that need to be captured during
this phase:
• Response Time
• Transactions Per Second
• Hits Per Second
• Number of concurrent users
• Volume of data
• Data growth rate
• Resource usage
• Hardware and Software configurations
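Several of these requirements, such as transactions per second and hits per second, can be derived directly from access-log data gathered on the existing system. A minimal sketch over a hypothetical log extract (the log format and numbers are invented for illustration):

```python
from collections import Counter

# Hypothetical access-log extract: (epoch_second, event) pairs.
log = [
    (0, "hit"), (0, "hit"), (0, "txn"),
    (1, "hit"), (1, "txn"), (1, "txn"),
    (2, "hit"), (2, "hit"), (2, "hit"), (2, "txn"),
]

# Observation window in whole seconds, inclusive of both endpoints.
duration = max(t for t, _ in log) - min(t for t, _ in log) + 1
counts = Counter(event for _, event in log)

tps = counts["txn"] / duration              # transactions per second
hits_per_second = counts["hit"] / duration  # page/resource hits per second
print(f"TPS={tps:.2f}  hits/s={hits_per_second:.2f} over {duration}s")
```

Measuring these figures from real usage, rather than guessing them, is what makes the generated load representative.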
Phase 2: Tool Identification & Evaluation
The tool identification and evaluation process would be carried out based on the performance test requirements, protocols, and the software and hardware used in the application. A proof of concept (POC) would be carried out if required. The objective of this activity is to ensure that the identified tools support all the applications used in the solution and help in measuring the performance test goals.
Phase 3: Performance Test Strategy
Based on the test requirements and automated tools, a detailed test strategy would be
developed during this phase, which would indicate the test scenarios, load generation, types
of testing, etc.
Phase 4: Test Design
Based on the test strategy, detailed test scenarios would be prepared. During the test design
period the following activities will be carried out:
• Scenario design
• Detailed test execution plan
• Dedicated test environment setup
• Script Recording/ Programming
• Script customization (delays, checkpoints, synchronization points, parameterization, etc.)
• Data Generation
Phase 5: Test Execution
The test execution will follow the various types of tests identified in the test plan. All the
scenarios identified will be executed. Virtual user loads are simulated based on the usage
patterns and load levels stated in the performance test strategy.
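Virtual-user simulation can be sketched with a thread pool, where each thread replays a scripted transaction with randomized think time between iterations. The transaction body, user count, and think-time range are all illustrative assumptions:

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

def virtual_user(user_id: int, iterations: int) -> list:
    """One simulated user repeatedly executing a scripted transaction."""
    timings = []
    for _ in range(iterations):
        start = time.perf_counter()
        sum(range(500))  # stand-in for the real scripted transaction
        timings.append(time.perf_counter() - start)
        time.sleep(random.uniform(0.0, 0.005))  # randomized think time
    return timings

CONCURRENT_USERS = 10  # illustrative load level from the test strategy
with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    results = list(pool.map(lambda uid: virtual_user(uid, 5), range(CONCURRENT_USERS)))

all_timings = [t for user in results for t in user]
print(f"{len(all_timings)} transactions executed by {CONCURRENT_USERS} virtual users")
```

Commercial load tools work on the same principle at much larger scale, distributing virtual users across load-generator machines rather than threads in one process.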
Phase 6: Performance Analysis & Report
The test logs and results generated are analyzed based on performance under various
loads, transactions per second, database throughput, network throughput, think time, network
delay, resource usage, transaction distribution, and data handling. Both manual and automated
analysis methods can be used for performance results analysis.
The following performance test reports/ graphs can be generated as part of performance
testing:
• Transaction response time
• Transactions per second
• Transaction summary
• Transaction performance summary
• Transaction response time under load
• Virtual user summary
• Error statistics
• Hits per second
• Throughput
• Downloads per second
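Several of these reports reduce to simple aggregation over per-transaction results. A minimal sketch over invented sample data (the result tuples and field names are assumptions for illustration only):

```python
import statistics

# Hypothetical per-transaction results: (response_time_ms, succeeded).
results = [(120, True), (340, True), (95, True), (2100, False), (180, True),
           (410, True), (150, True), (3050, False), (220, True), (130, True)]

times = [t for t, _ in results]
errors = sum(1 for _, ok in results if not ok)

# Figures behind the transaction-summary and error-statistics reports.
summary = {
    "transactions": len(results),
    "error_rate_pct": 100.0 * errors / len(results),
    "mean_ms": statistics.mean(times),
    "max_ms": max(times),
}
for name, value in summary.items():
    print(f"{name}: {value}")
```

Plotting these aggregates against the applied load level over time is what turns raw logs into the graphs listed above.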
Monitoring of the servers is done, capturing the following:
· CPU Utilization
· Threads Usage
· Connections Usage
· Memory Usage (e.g. JVM usage)
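A rough sketch of capturing such a snapshot with the Python standard library is below. This uses the Unix-only `resource` and `os.getloadavg` interfaces and monitors the measuring process itself; a real harness would poll the servers under test with a proper monitoring agent:

```python
import os
import resource  # Unix-only; unavailable on Windows

def snapshot() -> dict:
    """Capture a coarse resource snapshot for the test-run log."""
    usage = resource.getrusage(resource.RUSAGE_SELF)
    load1, load5, load15 = os.getloadavg()  # system load averages
    return {
        "cpu_user_s": usage.ru_utime,    # CPU time spent in user mode
        "cpu_system_s": usage.ru_stime,  # CPU time spent in kernel mode
        "max_rss_kb": usage.ru_maxrss,   # peak resident memory (KB on Linux)
        "load_1min": load1,              # 1-minute system load average
    }

print(snapshot())
```

Taking such snapshots at regular intervals during the run is what allows resource usage to be correlated with the load levels applied at the same time.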
Based on the performance report analysis, suggestions for improvement or tuning will be
provided to the design team:
• Performance improvements to application software, middleware, and database organization.
• Changes to server system parameters.
• Upgrades to client or server hardware, network capacity or routing.
For more details, please email aman@360logica.com.