Performance report

The performance report summarizes the validity of the run and the most significant data from the test run. You can view a collection of performance reports. The report also shows the response trend of the pages in the test run and a graph of the response trend of each page for a specified interval.

List of terms

The following table defines the terms that are used in the reports. An illustrative sketch follows the table.

Term Description
Attempt The request sent from the client application to the server until the requested page is displayed successfully.
Hit The number of requests sent by the virtual users to the server.
Response time The time taken by the server to respond to a request sent from the client application. It is measured from when the request is sent until the page is successfully downloaded.
Standard deviation The standard deviation indicates whether the page was loaded consistently across all virtual users in the test. A smaller standard deviation indicates that the page loaded consistently, while a larger standard deviation indicates that the page loaded at different times with a larger degree of variance.
Throughput The number of requests responded to successfully per unit of time.
Page health The number of pages that loaded successfully from the attempts made. A page that loads successfully is considered healthy.
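
For illustration only, the following Python sketch (not part of the product) shows how these metrics could be derived from a hypothetical list of page load attempts; the sample data, interval length, and variable names are assumptions.

    from statistics import mean, stdev

    # Hypothetical sample: response time (ms) of each page attempt, or None if the page failed to load.
    attempts = [820, 910, None, 760, 1150, 890, None, 940]
    interval_s = 60  # hypothetical length of the sample interval, in seconds

    hits = [t for t in attempts if t is not None]        # Hit: requests that returned the page successfully
    response_time_avg = mean(hits)                       # Response time: average time to download the page
    response_time_std = stdev(hits)                      # Standard deviation: consistency of page load times
    throughput = len(hits) / interval_s                  # Throughput: successful responses per unit of time
    page_health = 100 * len(hits) / len(attempts)        # Page health: successful loads as a percent of attempts

    print(f"Attempts={len(attempts)} Hits={len(hits)} Avg={response_time_avg:.0f} ms "
          f"StdDev={response_time_std:.0f} ms Throughput={throughput:.2f}/s Health={page_health:.0f}%")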

Overall page

The Overall page provides an overview of the overall performance of the pages. It displays the following information:

  • A progress indicator that displays the status during the playback.
  • A pie chart that displays information about the overall verification points that passed and failed in the playback, if they are set. For a schedule run, it displays the overall requirements that passed and failed, including the success rate as a percentage for:
    • Page Verification Points (VPs) Status
    • Page Element VPs
    • Page Status Code
    • Page Element Status Code
    • Page Health

Option Description Verdict
Page VPs Displays the verdict of the page title verification points if the verification points are set.
  • Passed
  • Failed
  • Inconclusive
  • Error
Page Element VPs Displays the verdict of the response code or response size verification points if the verification points are set.
  • Passed
  • Failed
  • Inconclusive
  • Error
Page Primary Request Status Codes Displays the success and failure status in the playback.

If a primary request includes verification points, the Page Status Code Successes value indicates that the verification point for the response code is passed.

If a primary request has no verification points, the Page Status Code Successes value indicates that the primary request is received, and a response is returned with a status code.

  • Passed
  • Failed
Page Element Status Codes Displays the success and failure rate in the playback.

If a page element request includes verification points, the Page Element Successes value indicates that the response code verification point passed for that request.

If there are no verification points included in a request, the Page Element Successes indicates that the request is received and returned a response with a status code.

  • Passed
  • Failed
Page Health Displays the total health of the pages, transactions, and loops in the playback or schedule run.
  • Healthy
  • Unhealthy
  • If you click any individual chart, you can go to that specific report to analyze the status in detail.
  • If you click any legend, for example, Passed, the chart is updated to show only the other verdicts of a test or a schedule run. For example, the Page Status Codes chart has the legends Passed and Failed. If you click Passed, the chart is updated to show only the failures in the test run or schedule run.
  • Similarly, if you double-click any legend, the chart is updated to display only the selected verdict by removing all other verdicts from the chart. Thus, you can focus on the one counter that you want to investigate in detail.

Summary page

The Summary page displays a summary of the information about the pages that were played back during the test. You can view the details of the test run and quickly analyze the final or intermediate results of a test.
The Summary page displays the following Run Summary information:

Run Summary
Parameter Description of the value that is displayed
Active The number of virtual users that are active.
Completed The number of virtual users that completed the test run.
Executed Test The name of the test that was played back during the test run.
Elapsed Time The total time taken to complete a test run. The elapsed time is displayed in hours, minutes, and seconds.
Run Status The status of the test that was played back. The run status can be Initializing Computers, Adding Users, Running, Transferring data to test log, Stopped, or Complete.
Requirement - Status The status of the requirement. The requirement status indicates Passed or Failed in the playback.
Requirement - Percent Passed The percentage of requirements that passed in the playback.
Total Users The total number of users in the test run.
Total Error Behaviors The number of error behaviors that occurred in the test run.
Total Error Conditions The number of error conditions that occurred in the test run.

The Summary page displays the following Page Summary information:

Page Summary
Parameter Description
Percent Page VPs Passed The percentage of verification points that passed for a page in the playback.
Response Time For All Pages - Average The average time that was taken for the pages to load successfully in the playback.
Response Time For All Pages - Standard Deviation The standard deviation of the time taken for all the pages to load successfully in the playback. The standard deviation indicates whether the page was loaded consistently across all virtual users in the test. A smaller standard deviation indicates that the page loaded consistently, while a larger standard deviation indicates that the page loaded at different times with a larger degree of variance.
Response Time For All Pages - Max The maximum time that was taken for all pages to load successfully in the playback.
Response Time For All Pages - Min The minimum time that was taken for all pages to load successfully in the playback.
Total Page VPs Passed The number of verification points that passed for the page in the playback.
Total Page VPs Failed The number of verification points that failed for the page in the playback.
Total Page VPs Inconclusive The number of verification points that were inconclusive for the page in the playback.
Total Page VPs Error The number of page verification points that failed with an error in the playback.
Total Page Attempts The number of requests sent from the client application to the server until the requested page is displayed successfully.
Total Page Hits The number of pages that were loaded successfully.
Total Percent Healthy For All Pages The number of pages that loaded successfully from the number of attempts made, expressed as a percentage.
Total Count Healthy For All Pages The count of all pages that loaded successfully from the number of attempts made.
Total Count Unhealthy For All Pages The count of all pages that were not loaded successfully from the number of attempts made.

The Summary page displays the following Page Element Summary information:

Page Element Summary
Parameter Description
Response Time For All Page Elements - Average The average time that was taken to load the elements on all the pages in the playback.
Response Time For All Page Elements - Standard Deviation The standard deviation of the response time for all page elements. The standard deviation informs whether the request sent is consistent. If the standard deviation is smaller, then the response time is more consistent, and if the standard deviation is greater, the response time is more varied.
Total Page Element VPs Passed The number of verification points that passed for the elements on the page in the playback.
Total Page Element VPs Failed The number of verification points that failed for the elements on the page in the playback.
Percent Page Element VPs Passed The percentage of verification points that passed for the elements on the page in the playback.
Total Page Element VPs Inconclusive The number of verification points that were inconclusive for the elements on the page in the playback.
Total Page Element VPs Error The number of verification points that failed with an error for the elements on the page in the playback.
Total Page Element Attempts The number of requests sent from the client application to the server until the requested page element is displayed successfully.
Total Page Element Hits The number of elements on the page that were loaded successfully.
Total Page Elements Fresh In Cache, Not Sent to Server The number of elements on the page where no request was sent as the page elements were fresh in the local cache.

If you have set transactions in your test, the Summary page displays the following Transaction information:

Transaction
Parameter Description of the value that is displayed
Elapsed Time For All Transactions - Average The average elapsed time that was taken for all transactions in the playback.
Elapsed Time For All Transactions - Max The maximum elapsed time that was taken for all transactions in the playback.
Elapsed Time For All Transactions - Min The minimum elapsed time that was taken for all transactions in the playback.
Elapsed Time For All Transactions - Standard Deviation The standard deviation of the elapsed time for all transactions in the playback.
Total Transactions Completed [for Run] The number of transactions completed in the playback.
Total Transactions Started [for Run] The number of transactions started in the playback.

Page Performance page

The Page Performance page displays the average response of the slowest primary pages in the test as the test progresses. It displays the response for each page and the name of each page with its corresponding response time. You can evaluate the response for each page request during and after the test.

The bar chart displays the average response time of the slowest pages. Each bar represents a page that you attempted during playback. During playback, the bar chart changes as the slowest pages are updated dynamically. For example, the login page might be one of the slowest pages at the start of the run, but then, as the test progresses, the Shopping Cart page might replace it as one of the slowest. After the run, the page displays the slowest pages for the entire run.

The table that follows the bar chart provides the following additional information:

Parameter Description of the value that is displayed
Attempts - Rate The number of attempts that resulted in a successfully displayed page out of the total number of attempts, expressed as a percentage.

For example, if the requested page was displayed for 5 of the 10 virtual users, the rate of page attempts completed is displayed as 50%.

Attempts The number of requests sent from the client application to the server until the requested page is displayed successfully.

For example, if the requested page was displayed after the second request, the number of attempts displayed is 2.

Page The name of the page in the recording that was attempted in the playback.
Page Response Time - Average The average time that was taken for each page to load successfully in the playback.
Percent Healthy Page The number of pages that loaded successfully from the number of attempts made, expressed as a percentage.
Page Response Time - Min The minimum time that was taken for each page to load successfully in the playback.
Page Response Time - Max The maximum time that was taken for each page to load successfully in the playback.
Page Response Time - Standard Deviation The standard deviation of the time taken for each page to load successfully in the playback. The standard deviation indicates whether the page was loaded consistently across all virtual users in the test. A smaller standard deviation indicates that the page loaded consistently, while a larger standard deviation indicates that the page loaded at different times with a larger degree of variance.

Page Details

The Page Details report displays a summary of the statistics of the pages that were played back during the test. You can view the details of the performance of each page.

You can copy the table data in HTML format from the report and use the data in other reports or other applications by clicking the Copy icon.

The Page Details table displays the following information:

Parameter Description of the value that is displayed
Attempts The number of requests sent from the client application to the server until the requested page is displayed successfully.

For example, if the requested page was displayed after the second request, the number of attempts displayed is 2.

Attempts Completed - Rate The number of attempts that resulted in a successfully displayed page out of the total number of attempts, expressed as a percentage.

For example, if the requested page was displayed for 5 of the 10 virtual users, the rate of page attempts completed is displayed as 50%.

Connection Time - Average The average time that was taken to successfully display the page when requests were sent by all virtual users. The connection time indicates whether the requests are being responded to and is generally used to identify delays.

For example, if the times to successfully display the page for 5 virtual users were 100 ms, 100 ms, 150 ms, 150 ms, and 200 ms, the average connection time is displayed as 140 ms.

Client Delay Time - Average The average time taken by the client application during the test to process the data before sending out the next request.
Fail The number of verification points that failed.
Hits The number of pages that were loaded successfully.
Page The name of the page in the recording that was attempted in the playback.
Page Response Time Average The average time that was taken for each page to load successfully in the playback.
Page Element Response Time - Average The average time that was taken for all elements to be loaded successfully on the page in the playback.
Page Response Time - Min The minimum time that was taken for each page to load successfully in the playback.
Page Response Time - Max The maximum time that was taken for each page to load successfully in the playback.
Page Response Time - Standard Deviation The standard deviation of the time taken for each page to load successfully in the playback. The standard deviation indicates whether the page was loaded consistently across all virtual users in the test. A smaller standard deviation indicates that the page loaded consistently, while a larger standard deviation indicates that the page loaded at different times with a larger degree of variance.
Percent Healthy Page The number of pages that loaded successfully from the number of attempts made, expressed as a percentage.
Page Response Time - Percentile/90 The time taken to load the page successfully in the 90th percentile.
Page Response Time - Percentile/95 The time taken to load the page successfully in the 95th percentile.
Page Response Time - Percentile/99 The time taken to load the page successfully in the 99th percentile.
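
The percentile rows summarize the distribution of page response times: for example, 90% of the sampled page loads complete within the 90th percentile value. As a rough, non-authoritative illustration (the product may use a different percentile method), the following sketch computes these values from a hypothetical list of samples.

    from statistics import quantiles

    # Hypothetical page response times, in milliseconds, collected during a run.
    samples_ms = [320, 450, 380, 900, 410, 470, 1250, 390, 430, 520, 610, 480]

    # quantiles(..., n=100) returns the 1st through 99th percentile cut points.
    cuts = quantiles(samples_ms, n=100)
    p90, p95, p99 = cuts[89], cuts[94], cuts[98]
    print(f"90th={p90:.0f} ms  95th={p95:.0f} ms  99th={p99:.0f} ms")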

Page Element Details

The Page Element Details report displays a summary of the statistics of the page elements that were played back during the test. You can view the details of the performance of each page element.

You can copy the table data in HTML format from the report and use the data in other reports or other applications by clicking the Copy icon.

The Page Element Details table displays the following information:

Parameter Description of the value that is displayed
Attempts The number of requests sent from the client application to the server until the requested page element is displayed successfully.

For example, if the requested page element was displayed after the second request, the number of attempts displayed is 2.

Fail The number of verification points that failed.
Hits The number of elements on the page that were loaded successfully.
Page The name of the page in the recording that was attempted in the playback.
Page Element The name of the element on the page that was attempted to be loaded in the playback.
Page Element Response Time - Average The average time that was taken to load the element on the page during the playback.
Page Element Response Time - Min The minimum time that was taken to load the element on the page during the playback.
Page Element Response Time - Max The maximum time that was taken to load the element on the page during the playback.
Page Element Response Time - Standard Deviation The standard deviation of the time taken for each page element to load successfully in the playback.

The standard deviation indicates whether the element on the page was loaded consistently across all virtual users in the test. A smaller standard deviation indicates that the element on the page loaded consistently, while a larger standard deviation indicates that the page element loaded at different times with a larger degree of variance.

Page Element Response Size - Average The average size of the element content that is sent by the server to the client application. The unit of the size is displayed in bytes.
Page Element Response Time - Percentile/90 The time taken to load the element on the page successfully in the 90th percentile.
Page Element Response Time - Percentile/95 The time taken to load the element on the page successfully in the 95th percentile.
Page Element Response Time - Percentile/99 The time taken to load the element on the page successfully in the 99th percentile.

Response vs. Time Summary page

The Response vs. Time Summary page displays the average response trend for a specified interval in the form of a graph. The page displays two line graphs with corresponding summary tables.
Graph Description
Page Response vs. Time The graph displays the average time that was taken for the pages to load successfully in the playback. Each point on the graph is an average of what occurred during that interval.
Page Element Response vs. Time The graph displays the average time that was taken for the page elements to load successfully in the playback. Each point on the graph is an average of what occurred during that interval.
The table after the graph provides the following additional information:
Summary Description
Performance Summary The summary of average response time and the standard deviation for all pages in the playback.

The summary of average response time for all page elements and the standard deviation in the playback. The table also lists the total number of page elements where no request was sent to the server because the client determined that the page elements were fresh in the local cache.
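
Each point on these graphs is the average of the samples that fall within one statistics sample interval. As a minimal sketch, assuming a hypothetical 10-second interval and made-up samples, the averaging could look like this:

    from collections import defaultdict
    from statistics import mean

    interval_s = 10  # hypothetical statistics sample interval, in seconds

    # Hypothetical (elapsed_time_s, page_response_time_ms) samples from a run.
    samples = [(2, 420), (7, 480), (12, 510), (15, 610), (23, 700), (27, 660)]

    buckets = defaultdict(list)
    for elapsed_s, response_ms in samples:
        buckets[elapsed_s // interval_s].append(response_ms)

    # One point per interval: the average response time of the samples in that interval.
    for index in sorted(buckets):
        start, end = index * interval_s, (index + 1) * interval_s
        print(f"interval {start}-{end}s: avg {mean(buckets[index]):.0f} ms")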

Response vs. Time Detail page

The Response vs. Time Detail page displays the response trend in the form of a graph for a specified interval.
Graph Description
Average Page Response Time [for Interval] The graph displays, for each sample interval, the average time that was taken for each page to load successfully in the playback. When a schedule includes staged loads, colored time-range markers in the graph display the stages.
The table after the graph provides the following additional information:
Parameter Description of the value that is displayed
Attempts The number of requests per interval from the client application to the server until the requested page is displayed successfully.
Attempts - Rate The number of attempts that resulted in a successfully displayed page out of the total number of attempts.
Page Response Time - Average The average time that was taken for each page to load successfully in the playback.
Page Response Time - Min The minimum time that was taken for each page to load successfully in the playback.
Page Response Time - Max The maximum time that was taken for each page to load successfully in the playback.
Page Response Time - Standard Deviation The standard deviation of the time taken for each page to load successfully in the playback. The standard deviation indicates whether the page was loaded consistently across all virtual users in the test. A smaller standard deviation indicates that the page loaded consistently, while a larger standard deviation indicates that the page loaded at different times with a larger degree of variance.

Page Throughput page

The Page Throughput page displays the frequency of requests that are transferred per sample interval. The line graph displays the Page Hit Rate and the User Load.
Graph Description
Page Hit Rate The graph displays the number of attempts that were made per interval for all pages that were loaded successfully.
User Load The graph displays active users and completed users in the playback.
The table after the graph provides the following additional information:
Summary Description
Performance Summary The Page Hit Rate summary table lists the total hit rates and counts for each page in the run.

The User load summary table lists the active, completed, and total users in the playback. As the run nears completion, the number of active users decreases and the number of completed users increases.

If the number of requests and hits are not close, the server might be having trouble keeping up with the workload.

If you add virtual users during a run and view the two graphs, you can monitor the ability of your computer to manage the workload. If the page hit rate stabilizes even though the active user count continues to rise, and the computer is well-tuned, the average response time naturally slows down. This slowdown occurs because the computer is running at its maximum effective throughput level, which effectively limits the rate of page hits.
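
As a rough, hypothetical way to apply the "requests versus hits" comparison described above, the following sketch flags sample intervals where the number of hits falls noticeably below the number of attempts; the data and the 90% threshold are assumptions, not product behavior.

    # Hypothetical per-interval counters: page attempts made and pages returned successfully (hits).
    intervals = [
        {"attempts": 50, "hits": 49},
        {"attempts": 80, "hits": 78},
        {"attempts": 120, "hits": 96},  # hits lag behind attempts in this interval
    ]

    THRESHOLD = 0.90  # hypothetical cutoff: flag intervals where fewer than 90% of attempts become hits

    for i, counters in enumerate(intervals):
        ratio = counters["hits"] / counters["attempts"]
        if ratio < THRESHOLD:
            print(f"interval {i}: hit/attempt ratio {ratio:.0%} - "
                  "the server might be struggling to keep up with the workload")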

Server Throughput page

The Server Throughput page lists the rate and number of bytes that are transferred per interval and for the entire run. The page also lists the status of the virtual users for each interval and for the entire run.
Graph Description
Byte Transfer Rates The graph displays the rate of bytes sent and received per interval for all intervals in the playback.
User Load The graph displays active users and users that completed testing, per sample interval, over the course of a run. You set the Statistics sample interval value as a schedule property. As the run nears completion, the number of active users decreases and the number of completed users increases.
The table after the graph provides the following additional information:
Summary Description
Performance Summary The Byte Transfer Rates summary table lists the number of bytes that were sent and received in the playback.

The User Load summary table lists the active, completed, and total users in the playback.

The bytes sent and bytes received throughput rate, which is computed from the client perspective, displays how much data Rational® Performance Tester is pushing through your server. Typically, you analyze this data with other metrics, such as the page throughput and resource monitoring data, to understand how network throughput demand affects server performance.

Server Health Summary page

The Server Health Summary page gives an overall indication of how well the server is responding to the load.
Graph Description
Page Health The graph displays the number of pages that loaded successfully from the number of attempts made.
Page Element Health The graph displays the number of page elements that loaded successfully from the number of attempts made.
The table after the graph provides the following additional information:
Summary Description
Performance Summary The Performance Summary displays the summary of the pages and the elements on the pages that were loaded successfully.

Server Health Detail page

The Server Health Detail page provides specific details about the percent status code success of each page for the test run. A successful status code indicates that the HTTP response code verification point passed, if a verification point is set for that page.

Caching Details page

The Caching Details page provides specific details on caching behavior during a test run.
Graph Description
Caching Activity The graph displays the total number of page element cache attempts, page element cache hits, and page element cache misses for the run. The values correspond to responses that indicate whether the content is modified. Additionally, the bar chart displays the total number of page elements in the cache that are skipped for the run. This value indicates the cache hits that are still in the local cache, where no communication with the server is necessary.
Page Element Cache Hit Ratios The graph displays the percentage of cache attempts that indicate confirmed success for the run. Server-confirmed cache hits occur when the server returns a 304 (Not Modified) response code. Client-confirmed cache hits occur when the content is still in the local cache and no communication is required.
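
For clarity, the following sketch (a hypothetical illustration, not the product's implementation) classifies page element cache attempts into server-confirmed hits (an HTTP 304 response), client-confirmed hits (content still fresh in the local cache, so nothing is sent), and cache misses, and then computes the hit ratio.

    # Hypothetical outcomes for page element cache attempts during a run.
    # "304"   -> server-confirmed hit (server reports that the content is not modified)
    # "fresh" -> client-confirmed hit (element still fresh in the local cache; no request is sent)
    # "200"   -> cache miss (the full content is downloaded again)
    outcomes = ["304", "fresh", "200", "304", "fresh", "fresh", "200", "304"]

    server_hits = outcomes.count("304")
    client_hits = outcomes.count("fresh")
    misses = outcomes.count("200")
    attempts = len(outcomes)

    hit_ratio = 100 * (server_hits + client_hits) / attempts
    print(f"attempts={attempts} server-confirmed={server_hits} "
          f"client-confirmed={client_hits} misses={misses} hit ratio={hit_ratio:.0f}%")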

Resources page

The Resources page displays information about all the resource counters that are monitored during the schedule run. You can view the following information on the Resources page, as described in the table:
If... Then the Resources page displays...
If you did not add any Resource Monitoring source to a performance schedule
  A message that states that you must set up the Resource Monitoring sources to view the resource details.
If you added Resource Monitoring sources to a performance schedule
  • The Resource Monitoring sources that are monitored during the schedule run.
  • All resource counters for those Resource Monitoring sources that are monitored during the schedule run.
  • The Unavailable sources section lists the Resource Monitoring sources that are unavailable or unreachable during the schedule run.
    Note: The Unavailable sources section is displayed only if any of the Resource Monitoring sources are unreachable or unavailable during the schedule run.
If you added Resource Monitoring sources by using labels to a performance schedule
  • The following information in the Server sources matching the labels set in the schedule (*Source defined in team space) section:
    • Labels and the Resource Monitoring sources associated with those labels that are monitored during the schedule run.
    • Resource Monitoring sources that are unavailable or unreachable during the schedule run.
    • An empty array ([]) when you used labels that are not tagged to any Resource Monitoring source in Rational® Test Automation Server.
    • The asterisk (*) symbol is displayed after the name of the Resource Monitoring source if you add the Resource Monitoring source at the team space level in Rational® Test Automation Server.
  • All resource counters for the Resource Monitoring sources that are monitored during the schedule run.

If you ran a performance schedule by using the -overridermlabels option from the Rational® Performance Tester command line
  • The following information in the Server sources matching the labels set with the command-line flag -overridermlabels (*Source defined in team space) section:
    • Labels that you used to add the Resource Monitoring sources to the schedule for the schedule run.
    • Resource Monitoring sources associated with those labels that are monitored during the schedule run.
    • Resource Monitoring sources that are unavailable or unreachable during the schedule run.
    • An empty array ([]) when you used labels that are not tagged to any Resource Monitoring source in Rational® Test Automation Server.
    • The asterisk (*) symbol is displayed after the name of the Resource Monitoring source if you added the Resource Monitoring source at the team space level in Rational® Test Automation Server.
  • All resource counters for the Resource Monitoring sources that are monitored during the schedule run.

The Legend displays the Resource Monitoring type and its resource counters. When there are multiple Resource Monitoring sources, the resource counters for the respective sources are displayed in front of their Resource Monitoring source name. You can customize the resource counter information displayed in a graph by clicking any individual resource counter or type of source. You can click or double-click any individual resource counter for the following results:
  • A single click on the resource counter hides the data displayed on the graph. Click the resource counter again to display the data in the graph.
  • A double-click on the resource counter removes information about all other resource counters from the graph and displays only the information about the selected resource counter.
    Tip: You can click the Select All option to restore all the resource counter information on the graph.

When you click any of the sources, the graph removes all the resource counters of the other sources and displays only the resource counters of the selected source.

For example, suppose that you have an Apache httpd server and a Windows Performance Host as Resource Monitoring sources. When the schedule completes, the Resources page displays the resource counter information of both sources. If you want to analyze the resource counters for either of the sources, you can click the Apache httpd server or the Windows Performance Host. Based on your selection, the graph is updated to display the resource counter information of the selected source.

The Performance Summary table that follows the graph lists the most recent values of the resource counters that are monitored during the schedule run. The first two columns display the Type of the source and the Name of the resource counter. This table also lists the minimum, maximum, and average values of the resource counters that are monitored during the schedule run.

Page Element Responses

The Page Element Responses page displays the slowest page element responses for the selected page.

Page Response Time Contributions

The Page Response Time Contributions page displays how much time each page element contributes to the overall page response time, along with the request delay time and connection time.

Page Size

This page lists the size of each page of your application in the test. The size of a page contributes to its response time. If part of a page or an entire page is cached, the requests that are served from the cache do not contribute to the total page size.
Column Description
Average Page Size For Run Each bar in the graph represents a page. To view the Page Element Sizes report, click a bar and select Page Element Sizes. All the elements that are on the page are displayed with their sizes. The size of a page is mostly determined by the size of its elements.
Page Size Summary The summary table lists the average response size of the page element, the maximum and minimum response size of the page element, and the number of attempts.

Errors

The Errors page lists the number of errors and the corresponding actions that occurred in the test or schedule. You can view the following graphs on the Errors page:

  • Error Conditions: This graph displays the number of errors for each error condition that was met.

  • Error Behaviors: This graph displays how each error condition is managed.

  • Error Conditions over Time: This graph displays the errors against the time at which they occurred during the playback of the test or schedule.

Note: You must define how to manage errors in the Advanced tab of the Test Details, VU Schedule Details, or Compound Test Details pane to log errors when a specific condition occurs.

Page Health

Use this page or report to determine if the pages of your application contain errors. If a page contains any error, the report displays that the page is not 100% healthy. If there are pages that are not 100% healthy, the report displays another section that lists such pages and the reported errors.