Kafka
The standard Kafka consumer transport ingests data from subscribed cluster topics.
Overview
This page provides an overview of configuring the Kafka consumer in Joule to ingest data from specified Kafka topics, allowing for seamless integration with data pipelines.
Joule's Kafka consumer deserialises Kafka streams into Joule StreamEvents, enabling them to be processed within the Joule environment. This deserialisation can be automated or customised with a transformer, depending on configuration settings.
Client library: org.apache.kafka:kafka-clients:2.7.0
Examples & DSL attributes
The example provided illustrates how to set up a consumer that processes messages from the quotes topic on a specified Kafka broker (KAFKA_BROKER:9092), using the StickyAssignor strategy for partition assignment.

Add the Kafka host's address to the /etc/hosts file so that the symbolic name can be used rather than a specific IP address, e.g. 127.0.0.1 KAFKA_BROKER
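A minimal consumer definition along these lines might look as follows. This is a sketch only: the kafkaConsumer key and the name and consumerGroupId values are assumptions based on the attribute schema below, not confirmed Joule DSL; the StickyAssignor class name is the standard kafka-clients one.

```yaml
kafkaConsumer:                      # top-level key is an assumption
  name: quotes_stream               # assumed stream name
  cluster address: KAFKA_BROKER:9092
  consumerGroupId: joule-quotes     # assumed consumer group
  topics:
    - quotes
  properties:
    # Standard kafka-clients property selecting the partition assignment strategy
    partition.assignment.strategy: org.apache.kafka.clients.consumer.StickyAssignor
```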
Attributes schema
- name (String): Name of the source stream.
- cluster address (http://<ip-address>:port): Kafka cluster details; maps to bootstrap.servers.
- consumerGroupId (String): Consumer group that ensures an event is read only once; maps to group.id.
- topics (List of strings): Message topics to subscribe to.
- pollingTimeout (Long, default 250ms): Upper bound on the amount of time the consumer can be idle before fetching more records; see the Kafka consumer documentation for more information.
- properties (Properties map): Additional consumer properties applied to the Kafka client API; see the official Kafka documentation for available properties. The default properties applied are listed below.
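To illustrate how pollingTimeout and the properties map combine, a consumer definition could pass extra kafka-clients settings through untouched. Again a sketch: the kafkaConsumer key and example names are assumptions, while session.timeout.ms and max.poll.records are standard kafka-clients consumer properties.

```yaml
kafkaConsumer:                      # top-level key is an assumption
  name: trades_stream               # assumed stream name
  cluster address: KAFKA_BROKER:9092
  consumerGroupId: joule-trades     # assumed consumer group
  topics:
    - trades
  pollingTimeout: 500               # overrides the 250ms default
  properties:
    # Passed straight through to the Kafka client API
    session.timeout.ms: 30000
    max.poll.records: 250
```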
Ingestion deserialisation in Joule
The Kafka consumer configuration in Joule is designed to ingest and process data from Kafka topics by deserialising streams into Joule-compatible formats. Using the consumer attributes and custom configurations, messages can be transformed into Joule StreamEvents for further processing in data pipelines. This setup supports a range of deserialisation options for efficient, flexible integration with existing data infrastructure; the following documentation describes how this is done.
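Purely as an illustrative sketch, a customised deserialisation step could be declared alongside the consumer. The deserializer block and its sub-fields below are hypothetical, not confirmed Joule DSL; consult the deserialisation pages for the actual syntax.

```yaml
kafkaConsumer:                      # top-level key is an assumption
  name: quotes_stream               # assumed stream name
  topics:
    - quotes
  deserializer:                     # hypothetical block name
    # hypothetical custom transformer class converting payloads to StreamEvents
    transformer: com.example.QuoteToStreamEventTransformer
```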
Default consumer properties