GUI execution

In the IBM® Rational® Performance Test Server application, tests are run primarily from the Test Lab perspective and the progress of the test is displayed in the Console view.

More console messages are displayed for performance tests than for standard tests because additional tasks are required, such as starting agents and probes.

Test progress

The following example console output was generated while running a performance test of 10 transactions per second using two agents, gh-debian1 and ghc-pc048. (Log line time stamps are omitted to save space.)


Load Generating Test: Added agent http://ghc-pc048:4476
Load Generating Test: Added agent http://gh-debian1:4476
Initialising: Tests/ Message Responses
Created API for agent: http://gh-debian1:4476 (communicating on: 
http://gh-debian1:53455)
Created API for agent: http://ghc-pc048:4476 (communicating on: 
http://ghc-pc048:4663)

The target transaction rate is listed:


Preparing Load Generating Test with target of 10

Transactions are divided equally between the agents and the agents are started:


Preparing Load Generating Test for execution on 
http://ghc-pc048:4476 target of 5
Preparing Load Generating Test for execution on 
http://gh-debian1:4476 target of 5
Task 1 prepared on agent http://ghc-pc048:4476
Task 2 prepared on agent http://gh-debian1:4476
Successfully prepared Load Generating Test
Load Generating Test: starting agent task: 1
Load Generating Test: starting agent task: 2
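The even split shown above (a target of 5 on each of the two agents) can be sketched as follows. This is an illustration only, not the product's internal distribution logic; when the target does not divide evenly, this sketch assumes the remainder is spread one-per-agent:

```python
def divide_target(total_target, agents):
    """Divide a target transaction rate as evenly as possible among agents.

    Illustrative sketch only: the actual distribution algorithm used by
    Rational Performance Test Server is internal to the product.
    """
    base, remainder = divmod(total_target, len(agents))
    # The first `remainder` agents each take one extra transaction per second.
    return {agent: base + (1 if i < remainder else 0)
            for i, agent in enumerate(agents)}

targets = divide_target(10, ["http://ghc-pc048:4476", "http://gh-debian1:4476"])
print(targets)  # each of the two agents receives a target of 5
```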

The test run starts, with regular progress updates (described in detail in the next topic):


http://ghc-pc048:4476: started (25), passed (135), timed out (0), 
failed (0), pending db writes (0)
http://gh-debian1:4476: started (25), passed (114), timed out (0), 
failed (0), pending db writes (0)

The test comes to a close:


http://ghc-pc048:4476: started (1), passed (1797), timed out (0), 
failed (0), pending db writes (0)
http://gh-debian1:4476: started (0), passed (1773), timed out (0), 
failed (0), pending db writes (6)
Load Generating Test: all agent tasks completed
Summarising performance test data...
Summarising performance test data completed
[Passed] Tests/Message Responses: completed.

The test has completed, and its results can now be viewed in the Results Gallery.

The following table describes test progress counters.

Started
    Iterations started in the report interval, which is controlled by the Collect statistics every field on the Performance Test Statistics tab. The default is 5 seconds, so if the test is set for 10 transactions per second, this counter shows approximately 50 in each interval.

Passed
    Iterations passed so far during the test.

Timed Out
    Iterations in which message receivers did not receive a response within the configured timeout period.

Failed
    Iterations failed so far during the test.

Pending DB Writes
    Database writes queued on the results database. Large numbers indicate that database access is slower than required, possibly because of a slow network connection. However, the writes are buffered and usually do not slow down the test rate. The buffer holds at most 2048 writes. If this figure repeatedly reaches that limit, the rate of load injection is being slowed by the database. Investigate the processing capability of the results database and, if necessary, move the computer that hosts the database to a faster network that is more accessible to the agent.
Note: A performance test might run longer than the specified time while the remaining test instances complete and queued database writes are flushed. In this case, the Started figure is zero for those intervals because all iterations have already been started.
Note: The Passed, Failed, and Timed Out figures are only populated when the test includes configured timed sections. With no timed sections, these values remain at zero.
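The counter arithmetic above can be illustrated as follows. The rate, interval, and buffer-limit values come from the text; the `injection_slowed` helper and its repeat threshold are hypothetical, not part of the product:

```python
# Expected Started count per report interval: the configured transaction rate
# multiplied by the "Collect statistics every" interval (default 5 seconds).
transactions_per_second = 10
collect_interval_seconds = 5

expected_started = transactions_per_second * collect_interval_seconds
print(expected_started)  # 50

# The results database write buffer holds at most 2048 queued writes.
DB_WRITE_BUFFER_LIMIT = 2048

def injection_slowed(pending_write_samples, repeats=3):
    """Hypothetical heuristic: if Pending DB Writes repeatedly reports at the
    buffer limit, the database is throttling the rate of load injection."""
    return sum(1 for s in pending_write_samples
               if s >= DB_WRITE_BUFFER_LIMIT) >= repeats

print(injection_slowed([0, 6, 2048, 2048, 2048]))  # True
```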