Wednesday, June 17, 2009

Analyze Performance Results, Report, and Retest

Managers and stakeholders need more than just the results from various tests; they need conclusions, as well as consolidated data that supports those conclusions. Technical team members also need more than just results; they need analysis, comparisons, and details behind how the results were obtained. Team members of all types get value from performance results that are shared early and often. Before results can be reported, the data must be analyzed. Consider the following important points when analyzing the data returned by your performance test:
Analyze the data both individually and as part of a collaborative, cross-functional technical team.
Analyze the captured data and compare the results against each metric’s acceptable or expected level to determine whether the application’s performance is trending toward or away from the performance objectives (see the sketch after this list).
If the test fails, a diagnosis and tuning step is generally warranted.
If you fix any bottlenecks, repeat the test to validate the fix.
With proper test design and usage analysis, performance-testing results often enable the team to analyze components at a deep level and correlate the findings back to real-world usage.
Performance test results should enable informed architecture and business decisions.
The analysis frequently reveals that, to fully understand the results of a particular test, additional metrics will need to be captured during subsequent test-execution cycles.
Immediately share test results and make raw data available to your entire team.
Talk to the consumers of the data to validate that the test achieved the desired results and that the data means what you think it means.
Modify the test to get new, better, or different information if the results do not represent what the test was defined to determine.
Use current results to set priorities for the next test.
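To make the comparison step above concrete, here is a minimal Python sketch that checks captured response-time samples against acceptance criteria. Everything in it is an assumption for illustration: the transaction names, the 90th-percentile thresholds, and the inline sample data all stand in for your own objectives and your load-test tool's output.

import math
import statistics

# Hypothetical acceptance criteria: 90th-percentile response time
# (seconds) per transaction. Substitute the objectives from your plan.
OBJECTIVES = {"login": 2.0, "search": 3.0, "checkout": 5.0}

def percentile(samples, pct):
    """Nearest-rank percentile: smallest sample >= pct% of the data."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

def evaluate(results):
    """results maps transaction name -> list of response times (s)."""
    for name, samples in results.items():
        observed = percentile(samples, 90)
        target = OBJECTIVES.get(name)
        if target is None:
            print(f"{name}: no objective defined; review manually")
            continue
        verdict = "PASS" if observed <= target else "FAIL"
        print(f"{name}: p90={observed:.2f}s target={target:.2f}s "
              f"mean={statistics.mean(samples):.2f}s -> {verdict}")

# Made-up sample data standing in for real test output.
evaluate({
    "login": [1.2, 1.4, 1.9, 2.3, 1.1],
    "search": [2.1, 2.4, 2.8, 2.2, 2.6],
})

A FAIL verdict here corresponds to the diagnosis-and-tuning step above; after any fix, running the same check against a fresh test execution validates the fix.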
Collecting metrics often produces very large volumes of data. Although it is tempting to reduce the amount of data, always exercise caution when applying data-reduction techniques, because valuable information can be lost. Most reports fall into one of the following two categories:
Technical Reports
Description of the test, including workload model and test environment.
Easily digestible data with minimal pre-processing.
Access to the complete data set and test conditions.
Short statements of observations, concerns, questions, and requests for collaboration.
Stakeholder Reports
Criteria to which the results relate.
Intuitive, visual representations of the most relevant data.
Brief verbal summaries of the chart or graph in terms of criteria.
Intuitive, visual representations of the workload model and test environment.
Access to associated technical reports, complete data sets, and test conditions.
Summaries of observations, concerns, and recommendations.

The key to effective reporting is to present information of interest to the intended audience in a manner that is quick, simple, and intuitive. The following are some underlying principles for achieving effective reports:
Report early, report often.
Report visually.
Report intuitively.
Use the right statistics.
Consolidate data correctly (both illustrated in the sketch after this list).
Summarize data effectively.
Customize for the intended audience.
Use concise verbal summaries with strong but factual language.
Make the data available to stakeholders.
Filter out any unnecessary data.
If reporting intermediate results, include the priorities, concerns, and blocks for the next several test-execution cycles.
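The "right statistics" and "consolidate data correctly" principles deserve a short illustration. The sketch below shows one hypothetical way to condense raw samples without losing the information that matters: it keeps tail percentiles alongside the mean, because a mean-only summary hides exactly the outliers stakeholders care about. The transaction name and sample data are made up.

import math
import statistics

def percentile(samples, pct):
    """Nearest-rank percentile: smallest sample >= pct% of the data."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

def summarize(name, samples):
    """Consolidate raw samples into a compact summary that still
    shows the long tail, not just the central tendency."""
    return {
        "transaction": name,
        "count": len(samples),
        "mean": round(statistics.mean(samples), 2),
        "median": round(statistics.median(samples), 2),
        "p90": round(percentile(samples, 90), 2),
        "p95": round(percentile(samples, 95), 2),
        "max": round(max(samples), 2),
    }

# Made-up data: the median looks healthy, but the tail percentiles
# expose requests near 10 s that a mean-only summary would blur.
print(summarize("checkout", [1.0, 1.1, 1.2, 1.3, 1.1, 9.8, 1.2, 10.5]))

Reported this way, the same consolidated row can serve a technical report (with the raw data set attached) and, charted across test cycles, a stakeholder report.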


http://perftesting.codeplex.com/Wiki/View.aspx?title=How%20To%3A%20Conduct%20Performance%20Testing%20Core%20Steps%20for%20a%20Web%20Application

Source: Microsoft patterns & practices, Performance Testing Guidance for Web Applications
