Performance Queries: Choosing a Use Case Option

The first step in creating a customized performance query is defining the query's purpose. The use case option that is chosen determines the type of charts that are generated later.

Performance Analysis

Identify, validate and compare application, desktop and End User performance trends. Use this option to:

  • Validate application performance
  • Identify performance trends
  • Determine the source of a performance problem

Performance Analysis query results can be viewed as a grid, a column chart (simple or grouped), a line chart, or a pie chart.
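One of the trends a Performance Analysis query can surface is whether response times are drifting upward over time. As a rough illustration of the underlying idea (this is not the product's own calculation, and all names and data are hypothetical), a least-squares slope over daily mean response times flags a degrading trend when positive:

```python
# Hypothetical sketch: detect a performance trend as the least-squares
# slope of daily mean response times against day index. A positive
# slope means response times are rising (performance is degrading).
def trend_slope(values):
    """Least-squares slope of values against their index (0, 1, ...)."""
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

daily_means = [1.0, 1.1, 1.3, 1.2, 1.5]   # illustrative seconds per day
slope = trend_slope(daily_means)          # positive: times trending up
```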

Comparative Analysis

This use case option compares the performance of any two groups of End Users. A group can consist of any defined set of primary filters. Use this option to:

  • Compare performance of End Users using different application versions
  • Compare performance of single End Users vs. their departments
  • Analyze the impact of location or virtual environment on performance
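The comparison in each of these cases reduces to contrasting a performance metric across two groups. A minimal sketch of that idea, assuming illustrative response-time data (the field names and numbers are hypothetical, not the product's schema):

```python
# Hypothetical sketch: compare mean response times of two End User
# groups, e.g. users on two different application versions.
from statistics import mean

group_a = [1.2, 1.5, 1.1, 1.8]   # e.g. End Users on version 1.0 (seconds)
group_b = [0.9, 1.0, 1.2, 0.8]   # e.g. End Users on version 2.0 (seconds)

def compare_groups(a, b):
    """Return both group means and the relative difference of b vs. a."""
    mean_a, mean_b = mean(a), mean(b)
    return mean_a, mean_b, (mean_b - mean_a) / mean_a

mean_a, mean_b, delta = compare_groups(group_a, group_b)
# A negative delta means group B responded faster than group A.
```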

Quality of Service Analysis

This option provides a statistical comparison of selected activities against a single value that represents average performance. The results can be viewed as a grid or as a bubble chart. The bubble chart generated displays performance results relative to average overall performance.

To accurately compare the performance of users across different applications (as opposed to a specific monitored attribute), it is necessary to create a grade of performance for applications and users. This statistical representation of performance is called a Universal Performance Indicator (UPI). The UPI provides the ability to represent user performance across different tiers as a percentile. In the figure below, locations above the 50th percentile show worse-than-average performance, and locations below the 50th percentile show better-than-average performance.
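The percentile idea behind an indicator like the UPI can be illustrated with a simple percentile rank: each location's measurement is graded against all others, so the 50th percentile marks average performance. This is only a sketch of the concept, with hypothetical names and data, not the product's actual UPI formula:

```python
# Hypothetical sketch: grade each location's response time as a
# percentile of all locations. Higher times rank higher, i.e. worse.
def percentile_rank(value, population):
    """Percentage of the population strictly below `value` (0-100)."""
    below = sum(1 for v in population if v < value)
    return 100.0 * below / len(population)

# Illustrative average response times (seconds) per location.
times = {"London": 2.4, "New York": 1.1, "Tokyo": 1.8, "Sydney": 0.9}

ranks = {loc: percentile_rank(t, list(times.values()))
         for loc, t in times.items()}
# Locations ranked above the 50th percentile performed worse than average.
```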

A Quality of Service Analysis query can be generated using one of the following methods:

  • The user generating the query defines a reference group. This group represents the average quality of service (50th percentile) against which all other activities are compared.
  • No reference group is defined. In this case, the average measurements of all selected activities are compared against each other, and the mean result is designated as the 50th percentile, against which all other activities are compared.
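The two methods above differ only in where the 50th-percentile baseline comes from. A minimal sketch of that distinction, assuming illustrative activity names and timings (none of which come from the product):

```python
# Hypothetical sketch: derive the 50th-percentile baseline either from
# a user-defined reference group (method 1) or from the mean across
# all selected activities (method 2).
from statistics import mean

activities = {                     # illustrative mean response times (s)
    "login": 1.6, "search": 2.2, "checkout": 3.0, "logout": 0.8,
}

def baseline(activities, reference_group=None):
    """Return the value treated as the 50th percentile."""
    if reference_group:                       # method 1: reference group
        return mean(activities[a] for a in reference_group)
    return mean(activities.values())          # method 2: overall mean

b1 = baseline(activities, reference_group=["login", "logout"])
b2 = baseline(activities)   # no reference group defined
```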

Use Quality of Service Analysis to:

  • Identify application/desktop performance outliers
  • Isolate the worst-performing users or servers

Influencing Performance Factors

This option provides a statistical analysis of selected activities against all comparison categories. The bar chart generated identifies the attributes that most influence performance, such as OS version or location.

Influencing Performance Factors results can be viewed as a grid or as a bubble chart. The Grid view includes columns that show the number and percentage of End Points with poor performance metrics.
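One way to picture how an attribute "influences" performance is to group End Points by each attribute's values and score the attribute by the spread between its best and worst group means. This is only an illustrative sketch of the concept with hypothetical records, not the product's actual statistical method:

```python
# Hypothetical sketch: rank comparison categories by influence, scored
# as the spread between the worst and best group means per attribute.
from collections import defaultdict
from statistics import mean

end_points = [   # illustrative records: OS version, location, time (s)
    {"os": "10", "location": "NY",  "time": 1.0},
    {"os": "10", "location": "LDN", "time": 1.2},
    {"os": "11", "location": "NY",  "time": 1.1},
    {"os": "11", "location": "LDN", "time": 2.9},
]

def influence(records, attribute):
    """Spread between worst and best group means for one attribute."""
    groups = defaultdict(list)
    for r in records:
        groups[r[attribute]].append(r["time"])
    means = [mean(v) for v in groups.values()]
    return max(means) - min(means)

scores = {attr: influence(end_points, attr) for attr in ("os", "location")}
# The attribute with the largest spread most influences performance.
```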