Processors
Processors are the core of the Joule platform, each performing a specific task. When linked together, they create use cases.
Joule harnesses processors as the foundational units of its processing functionality, linking them together to form event stream pipelines. These pipelines are central to Joule's architecture, facilitating the implementation of various use cases with ease and efficiency.
A pipeline can represent either an entire use case or a portion of one, offering flexible implementation options tailored to specific needs.
Pipelines consist of one or more interconnected processors, each designed to execute a distinct task. Events are processed sequentially, with the final output either routed to subsequent linked pipelines or delivered to designated data consumers.
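Conceptually, a pipeline applies its processors to each event in order. The sketch below is illustrative only: the Processor and Event interfaces are assumptions made for this example, not the actual Joule API.

```java
// Illustrative sketch only: these interfaces and names are assumptions,
// not the actual Joule API. It shows the general idea of a pipeline as
// a sequence of processors applied to each event in order.
import java.util.List;
import java.util.Optional;

interface Event {                                   // hypothetical event abstraction
    Object get(String attribute);
    void set(String attribute, Object value);
}

interface Processor {                               // hypothetical processor contract
    // Returns an empty Optional when the event is filtered out.
    Optional<Event> process(Event event);
}

final class Pipeline {
    private final List<Processor> processors;

    Pipeline(List<Processor> processors) {
        this.processors = processors;
    }

    // Events flow through each processor sequentially; the surviving
    // result is handed to downstream pipelines or data consumers.
    Optional<Event> onEvent(Event event) {
        Optional<Event> current = Optional.of(event);
        for (Processor processor : processors) {
            if (current.isEmpty()) {
                break;                              // event was filtered out upstream
            }
            current = processor.process(current.get());
        }
        return current;
    }
}
```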
All processors maintain real-time observability metrics, accessible through any JMX monitoring tool. For more in-depth information, please refer to the observability documentation.
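As a minimal sketch of how those metrics could be read programmatically, the example below uses the standard JMX API. The "joule" MBean domain and the "EventsProcessed" attribute name are assumptions made for illustration; consult the observability documentation for the actual object names exposed by the platform.

```java
// Minimal sketch: query processor metrics from the local JMX MBean server.
// The "joule:*" object name pattern and "EventsProcessed" attribute are
// assumptions, not confirmed Joule names.
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class ProcessorMetricsReader {
    public static void main(String[] args) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();

        // List every MBean registered under the assumed "joule" domain
        // and print one illustrative counter attribute for each.
        for (ObjectName name : server.queryNames(new ObjectName("joule:*"), null)) {
            Object eventsProcessed = server.getAttribute(name, "EventsProcessed");
            System.out.println(name + " events processed: " + eventsProcessed);
        }
    }
}
```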
Modular design
Processors can be combined flexibly into pipelines, allowing custom configurations for specific use cases.
Real-time processing
Events are processed sequentially and in real time, supporting high-speed, low-latency applications.
Out-of-the-box processors
Joule offers a wide range of ready-to-use processors for common tasks, accelerating deployment.
Extensibility via SDK
The Processor SDK allows developers to create custom processors, extending functionality to meet unique business needs.
Built-in observability
All processors provide real-time metrics accessible through JMX, enhancing monitoring and troubleshooting.
Joule offers many out-of-the-box processor implementations to support common use cases.
Processors are categorised by function, based on their role within the platform.
Joule provides a Processor SDK to enable business developers to extend the platform's processing capabilities.
See Processor API documentation for further information.
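To give a feel for what an SDK-built processor might look like, the sketch below reuses the hypothetical Processor and Event interfaces from the pipeline sketch above; the real SDK types and lifecycle are described in the Processor API documentation.

```java
// Hypothetical example: tag events whose "amount" attribute exceeds a
// threshold. The Processor and Event interfaces are the illustrative ones
// defined earlier on this page, not the actual SDK contracts.
import java.util.Optional;

final class HighValueTagger implements Processor {
    private final double threshold;

    HighValueTagger(double threshold) {
        this.threshold = threshold;
    }

    @Override
    public Optional<Event> process(Event event) {
        Object amount = event.get("amount");
        if (amount instanceof Number && ((Number) amount).doubleValue() > threshold) {
            event.set("highValue", true);           // enrich the event in place
        }
        return Optional.of(event);                  // pass the event downstream
    }
}
```

A processor like this could then be placed alongside out-of-the-box filters, enrichers and triggers within a pipeline.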
Filters
Reduce stream processing overhead by filtering irrelevant events
Enrichment
Enrich events using linked reference data, metrics and analytics
Transformation
Transform event attribute values based upon a desired target state
Triggers
Apply rule-based triggers for downstream business actions
Stream Join
Join independent stream events to trigger advanced analytical insights and dynamic business rules
Event Tap
Tap events directly into an in-memory database to enable online and offline processing