Ingestion

Joule enables flexible Kafka data deserialisation for processing.

Overview

When working with external data, deserialisation is often needed to convert raw data into formats compatible with Joule’s processing pipelines.

Joule provides a versatile deserialisation framework that extends the native Kafka deserialisation capabilities, supporting multiple formats to make ingestion and processing easier.

Available deserialisation options include:

  1. Domain object deserialisation: custom parsers transform domain-specific objects into Joule-compatible formats.

  2. AVRO deserialisation: AVRO schemas convert structured data into Joule's internal StreamEvents.

  3. Native Joule StreamEvent deserialisation: data already in Joule's internal StreamEvent format can be ingested directly.

Examples & DSL attributes

This example shows deserialisation in action: a custom parser (QuoteToStreamEventParser) transforms Quote domain objects into StreamEvents that Joule can process.

kafkaConsumer:
  ...
  deserializer:
    parser: com.fractalworks.examples.banking.data.QuoteToStreamEventParser
    key deserializer: org.apache.kafka.common.serialization.IntegerDeserializer
    value deserializer: com.fractalworks.streams.transport.kafka.serializers.object.ObjectDeserializer
  ...

Processing steps

  1. Consumption: the Kafka client receives the event data as a generic object via the specified value deserialiser.

  2. Parsing: the object is passed to QuoteToStreamEventParser, which converts the Quote object into a StreamEvent (a sketch of such a parser follows this list).

  3. Streaming pipeline: the StreamEvent is added to Joule’s streaming pipeline queue.

  4. Acknowledgment: the Kafka infrastructure acknowledges the event asynchronously to manage processing flow.
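
Step 2 is where a custom parser does its work. The sketch below is illustrative only: the CustomTransformer interface shape, the transform method, and the StreamEvent accessors shown are assumptions for this example, not the Builder SDK's actual signatures (see the Builder SDK Connector API pages for those).

// Illustrative sketch only: interface and method names are assumed,
// not taken from the Builder SDK. Quote is the example domain object.
public class QuoteToStreamEventParser implements CustomTransformer<Quote, StreamEvent> {

    @Override
    public StreamEvent transform(Quote quote) {
        // Copy each Quote field onto a named StreamEvent attribute
        StreamEvent event = new StreamEvent("quote");
        event.addValue("symbol", quote.getSymbol());
        event.addValue("bid", quote.getBid());
        event.addValue("ask", quote.getAsk());
        event.addValue("volume", quote.getVolume());
        return event;
    }
}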

Deserialisation attributes schema

| Attribute | Description | Data Type | Required |
| --- | --- | --- | --- |
| parser | User provided parser implementation | Implementation of CustomTransformer | |
| key deserializer | Domain class that maps to the partition key type. Maps to the Kafka key.deserializer property | String | Default: IntegerDeserializer |
| value deserializer | Domain class that deserialises the incoming object to a StreamEvent. Maps to the Kafka value.deserializer property | String | Default: StringDeserializer |
| avro setting | AVRO ingestion specification | Avro specification | |
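
As the table notes, the key and value deserialiser attributes map onto the standard Kafka consumer properties. Purely for orientation, the equivalent raw Kafka client configuration would resemble the following sketch; Joule derives these properties from the DSL, so you never write this yourself:

import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;

public class ConsumerPropsSketch {
    // Illustrative only: the raw Kafka consumer properties that the
    // deserializer DSL attributes map onto.
    static Properties consumerProps() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:19092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "quotesConsumerGroup");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.IntegerDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "com.fractalworks.streams.transport.kafka.serializers.object.ObjectDeserializer");
        return props;
    }
}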

Provided value deserialisers

The following deserialisation options enable Joule to ingest external data structures seamlessly:

  1. StreamEvent (Binary): When connecting Joule processes in a Directed Acyclic Graph (DAG), events are passed as internal StreamEvent objects in binary format. To deserialise these events correctly, configure the system with com.fractalworks.streams.transport.kafka.serializers.avro.AvroStreamEventDeserializer.

  2. StreamEvent (JSON): For testing purposes, you may have saved StreamEvent objects as JSON strings. These events can be consumed and converted into concrete StreamEvent objects by configuring the system with com.fractalworks.streams.transport.kafka.serializers.json.StreamEventJsonDeserializer.

  3. Custom data types: To handle custom data types without using an AVRO IDL schema, configure the system with com.fractalworks.streams.transport.kafka.serializers.object.ObjectDeserializer. This requires implementing a custom parser that can interpret and transform the incoming data into a format Joule can process.

Avro Support

Joule includes AVRO support through locally stored schemas (AVRO IDL files) to map domain objects to StreamEvents. AVRO schema registries are not supported yet; this feature is planned for 2025.

Example AVRO deserialisation

Here’s a configuration that uses an AVRO schema to map Quote events into StreamEvents. The schema used in this example can also be reviewed on the Kafka sources page.

kafkaConsumer:
  cluster address: localhost:19092
  consumerGroupId: quotesConsumerGroup
  topics:
    - quotes

  deserializer:
    key deserializer: org.apache.kafka.common.serialization.IntegerDeserializer
    avro setting:
      local schema: avro/quote.avsc

Sample AVRO IDL for quote schema

This AVRO schema defines a Quote object with attributes like symbol, mid, bid, and ask, allowing Joule to map incoming Quote data into StreamEvents automatically for streamlined ingestion.

{
  "type" : "record",
  "name" : "Quote",
  "namespace" : "com.fractalworks.examples.banking.data",
  "fields" : [
    {"name" : "symbol", "type" : "string"},
    {"name" : "mid",  "type" : "double"},
    {"name" : "bid", "type" : "double"},
    {"name" : "ask", "type" : "double"},
    {"name" : "volume", "type" : "long"},
    {"name" : "volatility", "type" : "double"},
    {"name" : "time", "type" : "long"},
    {"name": "date",
      "type" : {
        "type" : "int",
        "logicalType": "date"
      }
    }]
}
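
To see how this schema shapes a record, the sketch below uses the standard Apache Avro Java library (not Joule's API) to parse the schema file and build a matching Quote record; the field values are hypothetical:

import java.io.File;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class QuoteRecordSketch {
    public static void main(String[] args) throws Exception {
        // Parse the locally stored schema referenced by the 'local schema' setting
        Schema schema = new Schema.Parser().parse(new File("avro/quote.avsc"));

        // Build a record matching the schema; values are hypothetical
        GenericRecord quote = new GenericData.Record(schema);
        quote.put("symbol", "ABC");
        quote.put("mid", 100.5);
        quote.put("bid", 100.0);
        quote.put("ask", 101.0);
        quote.put("volume", 2000L);
        quote.put("volatility", 0.12);
        quote.put("time", System.currentTimeMillis());
        // logicalType 'date' is stored as an int of days since the epoch
        quote.put("date", (int) java.time.LocalDate.now().toEpochDay());
    }
}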