Testing with Apache Kafka

In IBM® Rational® Integration Tester 10.0.2 and later, you can create Kafka transports to test Kafka services.

Kafka is a distributed streaming platform with the following capabilities:
  • Publish and subscribe to streams of records, like a message queue or enterprise messaging system.
  • Store streams of records in a fault-tolerant durable way.
  • Process streams of records as they occur.
Kafka is generally used for two broad classes of applications:
  • Building real-time streaming data pipelines that reliably get data between systems or applications.
  • Building real-time streaming applications that transform or react to the streams of data.
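For example, the following Java sketch, which uses the standard Apache Kafka client library, shows an application that publishes a record to a topic and then subscribes to the same topic as part of a consumer group. The broker address (localhost:9092), the topic name (orders), and the consumer group name are placeholder values for illustration only.

  import org.apache.kafka.clients.consumer.ConsumerRecord;
  import org.apache.kafka.clients.consumer.ConsumerRecords;
  import org.apache.kafka.clients.consumer.KafkaConsumer;
  import org.apache.kafka.clients.producer.KafkaProducer;
  import org.apache.kafka.clients.producer.ProducerRecord;
  import java.time.Duration;
  import java.util.Collections;
  import java.util.Properties;

  public class KafkaQuickCheck {
      public static void main(String[] args) {
          // Producer: publish a single record to the "orders" topic.
          Properties producerProps = new Properties();
          producerProps.put("bootstrap.servers", "localhost:9092");
          producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
              producer.send(new ProducerRecord<>("orders", "order-1", "{\"id\":1}"));
          }

          // Consumer: subscribe to the same topic as part of a consumer group and poll for records.
          Properties consumerProps = new Properties();
          consumerProps.put("bootstrap.servers", "localhost:9092");
          consumerProps.put("group.id", "demo-group");
          consumerProps.put("auto.offset.reset", "earliest");
          consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
          consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
          try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
              consumer.subscribe(Collections.singletonList("orders"));
              ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
              for (ConsumerRecord<String, String> record : records) {
                  System.out.printf("partition=%d offset=%d key=%s value=%s%n",
                          record.partition(), record.offset(), record.key(), record.value());
              }
          }
      }
  }

Rational® Integration Tester performs the equivalent publish and subscribe actions through its Kafka transport, so you do not write client code to test a Kafka service; the sketch only illustrates the messaging model.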

Software requirements

To use Kafka messaging with Rational® Integration Tester, you must have the connection details of the Kafka cluster.

For more information about Kafka software, see the Apache Kafka website.

Note: Rational® Integration Tester ships with the JAR files that are required to communicate with Kafka, so no Library Manager configuration is necessary.

Prerequisites for testing with Kafka

You must have the following details to configure the Kafka transport in Rational® Integration Tester:
  • Connection details of one of the Kafka brokers.
  • Topic and partition information for publishing messages.
  • Topic and consumer group information for subscribing to messages.
  • If you want to use SSL for communication, the Kafka server must be configured to use SSL, and you must set up the required identity stores and key stores in Rational® Integration Tester (see the sketch after this list).
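The following sketch, again using the standard Kafka Java client, shows the kind of connection properties that these prerequisites map to. The broker host, consumer group name, and store paths are placeholders; the real values come from your Kafka cluster, and in Rational® Integration Tester the identity stores and key stores are configured in the project rather than in code.

  import org.apache.kafka.clients.consumer.KafkaConsumer;
  import java.util.Properties;

  public class KafkaConnectionCheck {
      public static void main(String[] args) {
          Properties props = new Properties();
          // Connection details of one broker in the cluster (placeholder host and port).
          props.put("bootstrap.servers", "kafka-broker-1.example.com:9093");
          // Consumer group that is used when subscribing to messages (placeholder name).
          props.put("group.id", "rit-subscribers");
          props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
          props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
          // SSL settings, required only if the Kafka brokers are configured for SSL.
          props.put("security.protocol", "SSL");
          props.put("ssl.truststore.location", "/path/to/client.truststore.jks");
          props.put("ssl.truststore.password", "changeit");
          props.put("ssl.keystore.location", "/path/to/client.keystore.jks");
          props.put("ssl.keystore.password", "changeit");

          // Listing the topics and their partition counts confirms that the broker
          // address and the SSL stores are valid.
          try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
              consumer.listTopics().forEach((topic, partitions) ->
                      System.out.printf("topic=%s partitions=%d%n", topic, partitions.size()));
          }
      }
  }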

Overview of tasks

The following list describes the tasks that you can perform to test with Kafka, along with the topic that provides more information about each task.

1. You must create a logical transport connection for the Kafka broker in the Architectural School perspective.
   More information: Creating logical Apache Kafka transport connections
2. You must configure the physical resource for the logical transport connection.
   More information: Creating physical Apache Kafka transport connections
3. If you are testing a Kafka application, you can set up the Message Exchange Pattern (MEP) and define the publish and subscribe actions of your application.
   More information: MEP settings for Kafka transport
4. You must configure the transport and formatter for the Kafka messages before you can use the publish or subscribe actions.
   More information: Working with Kafka messages
4.1. You can publish Kafka messages.
     More information: Publishing messages
4.2. You can subscribe to Kafka messages.
     More information: Subscribing to messages
5. You must set up a Service Component that uses Kafka as the physical transport before you can create tests or stubs.
   More information: Creating a service component
6. You must set up an Operation that uses Kafka as the physical transport before you can create tests or stubs.
   More information: Creating an operation
7. You can specify the stubbing settings based on the MEP for the Kafka transport.
   More information: Stubbing settings for Kafka transport
8. You can record events for the Kafka transport in direct mode, where messages are routed through the configured port and the topics or pattern that are set for Kafka. You can also record events in sift-and-pass-through mode, where the Kafka messages are sifted against the set criteria and the messages that pass are routed through the configured proxy port.
   More information: Recording Kafka transport traffic