# Kafka

## Overview

This page provides an overview of configuring the Kafka consumer in Joule to ingest data from specified Kafka topics, allowing for seamless integration with data pipelines.

Joule's Kafka consumer deserialises Kafka streams into Joule `StreamEvents`, enabling them to be processed within the Joule environment. This deserialisation can be automated or customised with a transformer based on configuration settings.

{% hint style="info" %}
**Client library:** [org.apache.kafka:kafka-clients:2.7.0](https://kafka.apache.org/documentation/)
{% endhint %}

## Examples & DSL attributes

The example below configures a consumer that reads messages from the `quotes` topic on a broker reachable at `KAFKA_BROKER:9092`, using the `StickyAssignor` strategy for partition assignment.

The hint that follows shows how to map the broker's symbolic name in the `/etc/hosts` file.

```yaml
kafkaConsumer:
  name: nasdaq_quotes_stream
  cluster address: KAFKA_BROKER:9092
  consumerGroupId: nasdaq
  topics:
    - quotes
    
  properties:
    partition.assignment.strategy: org.apache.kafka.clients.consumer.StickyAssignor
```

{% hint style="danger" %}
Add the Kafka host's address to the `/etc/hosts` file so that the symbolic name can be used rather than a specific IP address, e.g. `127.0.0.1 KAFKA_BROKER`
{% endhint %}

### Attributes schema

<table><thead><tr><th width="185">Attribute</th><th width="253">Description</th><th width="219">Data Type</th><th data-type="checkbox">Required</th></tr></thead><tbody><tr><td>name</td><td>Name of the source stream</td><td>String</td><td>true</td></tr><tr><td>cluster address</td><td>Kafka cluster details. Maps to <code>bootstrap.servers</code></td><td>&#x3C;host>:&#x3C;port></td><td>true</td></tr><tr><td>consumerGroupId</td><td>Consumer group to ensure an event is read only once. Maps to <code>group.id</code></td><td>String</td><td>true</td></tr><tr><td>topics</td><td>Message topics to subscribe to</td><td>List of strings</td><td>true</td></tr><tr><td>pollingTimeout</td><td>Upper bound on the amount of time the consumer can be idle before fetching more records.<br>See this <a href="https://learn.conduktor.io/kafka/kafka-consumer-important-settings-poll-and-internal-threads-behavior/">documentation</a> for more information</td><td><p>Long</p><p>Default: 250ms</p></td><td>false</td></tr><tr><td>deserializer</td><td>Deserialisation configuration</td><td><a href="kafka/ingestion">See Ingestion documentation</a></td><td>false</td></tr><tr><td>properties</td><td>Additional consumer properties to be applied to the Kafka client API. See the official <a href="https://kafka.apache.org/27/documentation.html#consumerconfigs">documentation</a> for available properties. Default properties applied are listed <a href="#default-consumer-properties">below</a></td><td>Properties map</td><td>false</td></tr></tbody></table>
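
As a sketch, a fuller configuration combining the optional attributes above might look like the following. The `trades` topic, the `pollingTimeout` value, and the `max.poll.records` property are illustrative, not defaults:

```yaml
kafkaConsumer:
  name: nasdaq_quotes_stream
  cluster address: KAFKA_BROKER:9092
  consumerGroupId: nasdaq
  topics:
    - quotes
    - trades

  # Optional: wait up to 500ms for records before returning (default 250ms)
  pollingTimeout: 500

  # Optional: passed straight through to the Kafka client API
  properties:
    partition.assignment.strategy: org.apache.kafka.clients.consumer.StickyAssignor
    max.poll.records: 1000
```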

## Ingestion deserialisation in Joule

The Kafka consumer configuration in Joule is designed to seamlessly ingest and process data from Kafka topics by deserialising streams into Joule-compatible formats.

Using Kafka consumer attributes and custom configurations, it is possible to transform messages into Joule `StreamEvents` that can be processed further in data pipelines.

This setup supports various deserialisation options for efficient and flexible integration with data infrastructure. See the [ingestion documentation](https://docs.fractalworks.io/joule/components/connectors/sources/kafka/ingestion) for details on how this is done.

### Default consumer properties

```properties
enable.auto.commit=true
auto.offset.reset=latest
auto.commit.interval.ms=50
session.timeout.ms=30000
key.deserializer=org.apache.kafka.common.serialization.IntegerDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
```
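
Any of these defaults can be overridden by supplying the same key in the consumer's `properties` map. For example, a hypothetical configuration that replays a topic from the earliest retained offset (assuming, as the schema above suggests, that `properties` entries take precedence over the defaults):

```yaml
kafkaConsumer:
  name: nasdaq_quotes_stream
  cluster address: KAFKA_BROKER:9092
  consumerGroupId: nasdaq
  topics:
    - quotes

  properties:
    # Override the default of 'latest' to re-read the full retained topic
    auto.offset.reset: earliest
```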
