Key features
Build, deploy and scale analytic use cases fast
As a user of Joule, your main focus will be building use cases that address point-in-time or strategic business needs. There is no need to continually rebuild technology assets: reuse existing processing and data templates to kick-start the process, and add new assets only when needed. Joule supports turning your needs into deployable runtime assets.
Joule has been developed to leverage mature and emerging technologies through clean interfaces exposed by the Joule SDK.
Key features include:
Low-code development platform Effortlessly create use cases with the Joule DSL, enabling swift development of versatile stream processing pipelines.
Analytics Enablement Harness ML model support, auditing, geospatial capabilities, streaming window analytics, SQL metrics engine, and triggers for enhanced functionality.
Stream Processors Enrich, encrypt, and mask data effortlessly with out-of-the-box processors and supported scripting languages (Node.js, JavaScript, Python).
Data Connectors Effortlessly consume and stream consistent data with out-of-the-box connectors, including Kafka, MQTT, data lakes, BigQuery, NoSQL, WebSockets, OpenAPI, and more.
Contextual Data Enrich streaming events with dynamic, static and slow-moving contextual data using embedded local caching.
APIs & SDK An extendable API is available to empower developers in building custom components.
Observability All components have processing metric counters which can be monitored using external solutions.
Flexible Deployment Joule has been designed to be platform agnostic, offering seamless deployment options whether you choose a local, on-premise, or cloud-based environment.
At its core, Joule adheres to the design principle of delivering a low-code use case platform that fosters rapid development iterations for impactful business outcomes. Packaged with a dedicated use case language (the Joule DSL) and a suite of reusable assets, Joule empowers developers to commence building immediately after installation.
Flexible event subscriptions and publishing
Stream event processing pipeline
Custom SQL Metrics definition
Extendability through custom components
Mainstream product integrations
Joule provides three flexible methods to build analytical insights. Each method is described below.
The integration of streaming analytics serves as a pivotal feature, empowering the evolution of sophisticated use case development, including applications like geospatial analytics for marketing, business analytics, and feature preparation for machine learning predictions.
Tumbling and sliding windows
Standard statistical functions
Custom analytic functions
Geospatial analytics (Geo Tracker, Geofence occupancy, spatial index)
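The windowing concepts listed above can be illustrated with a minimal tumbling-window aggregation. This is a generic Python sketch of the technique, not the Joule DSL or its analytics engine; the event shape (timestamp, value pairs) and the statistics chosen are assumptions for illustration.

```python
from collections import defaultdict

def tumbling_window(events, window_ms):
    """Group (timestamp_ms, value) events into fixed, non-overlapping
    windows of window_ms width and compute simple statistics per window."""
    buckets = defaultdict(list)
    for ts, value in events:
        buckets[ts // window_ms].append(value)
    return {
        bucket * window_ms: {
            "count": len(vals),
            "min": min(vals),
            "max": max(vals),
            "avg": sum(vals) / len(vals),
        }
        for bucket, vals in sorted(buckets.items())
    }

events = [(0, 10.0), (400, 12.0), (1100, 8.0), (1900, 9.0), (2500, 11.0)]
print(tumbling_window(events, window_ms=1000))
```

A sliding window differs only in that windows overlap, so each event may contribute to more than one bucket.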
Event processing is executed through the definition of a processor pipeline. Events undergo sequential processing utilising a micro-batch methodology, a technique employed to boost processing throughput while optimising the utilisation of underlying hardware capabilities.
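As a rough illustration of the micro-batch idea (not Joule's internal implementation), events can be drained from a stream in fixed-size batches so that per-event overheads are amortised across the batch:

```python
from itertools import islice

def micro_batches(stream, batch_size):
    """Yield lists of up to batch_size events drawn from an iterator.

    Batching amortises per-call overhead (I/O, locking, dispatch) across
    many events, improving throughput over one-at-a-time processing."""
    it = iter(stream)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

for batch in micro_batches(range(7), batch_size=3):
    print(batch)  # [0, 1, 2] then [3, 4, 5] then [6]
```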
Filtering
Filter events using configurable criteria or an expression. Example use cases include customer opt-outs, missing data elements, out-of-range values, etc.
Enrichment
Enrichment of streaming events with an embedded low-latency data caching solution
Transformation
Event field tokenisation, encryption, masking, bucketing and redaction
Triggers
Real-time alerts and event triggers using rule-based processing and delta CDC processing
Event Tap
Tap events directly into an in-memory database to enable online and offline processing
Scripting
Execute external scripts or expressions defined within the use case DSL, using supported scripting languages (Node.js, JavaScript, Python)
Metrics
A SQL-compliant metrics engine which computes scheduled metrics
Machine Learning
Leverage streaming online predictions to drive insights to action
Analytics
Streaming analytics using event windows, expressions, scripts and much more
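Two of the processors above, filtering and masking, can be sketched as composable functions over a batch of events. This is an illustrative Python sketch of the pattern, not Joule's processor API; the field names (`opt_out`, `card`) are hypothetical.

```python
def filter_opted_in(events):
    """Drop events where the customer has opted out (hypothetical field)."""
    return [e for e in events if not e.get("opt_out", False)]

def mask_field(field):
    """Return a processor that masks all but the last four characters of a field."""
    def processor(events):
        return [
            {**e, field: "*" * (len(e[field]) - 4) + e[field][-4:]}
            if field in e else e
            for e in events
        ]
    return processor

def run_pipeline(events, processors):
    """Apply each processor to the batch in sequence."""
    for p in processors:
        events = p(events)
    return events

events = [
    {"card": "4111111111111111", "opt_out": False},
    {"card": "5500005555555559", "opt_out": True},
]
print(run_pipeline(events, [filter_opted_in, mask_field("card")]))
# [{'card': '************1111', 'opt_out': False}]
```

Chaining processors this way mirrors the sequential pipeline model described earlier: each stage receives the previous stage's output batch.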
New processors are constantly added to the platform. Please contact fractalworks for an updated list.
Data connectors, sources and sinks are a key component within the Joule low-code ecosystem that consume and distribute data.
For Joule applications, contextual data is crucial for enabling advanced and insightful stream processing. By seamlessly integrating contextual data with real-time events, the system delivers enriched processing outcomes and better informed insights.
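The enrichment step can be pictured as a join between each streaming event and a local cache of reference data. Below is a minimal Python sketch of that pattern; the dict stands in for an embedded low-latency cache, and the field names (`device_id`, `site`, `model`) are illustrative, not part of Joule's DSL.

```python
# Reference data keyed by a join field, held in-process so lookups
# avoid a network hop (a dict standing in for an embedded cache).
reference_cache = {
    "DEV-1": {"site": "London", "model": "TX-9"},
    "DEV-2": {"site": "Berlin", "model": "TX-7"},
}

def enrich(event, cache):
    """Merge cached contextual fields into the event; pass through unmatched events."""
    context = cache.get(event["device_id"], {})
    return {**event, **context}

event = {"device_id": "DEV-1", "temp": 21.5}
print(enrich(event, reference_cache))
# {'device_id': 'DEV-1', 'temp': 21.5, 'site': 'London', 'model': 'TX-9'}
```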
A Java SDK for developers is supplied to extend platform capabilities, enabling the customisation and enhancement of processors and data transports.
APIs
REST-based APIs to access key Joule functions
Data access APIs
Deployment management APIs
SDK
Flexible SDK to enable platform extensibility
User defined analytics
Processors
Data connectors
Each processing component in Joule furnishes a standard set of metrics, offering users insights into the number of events received, processed, discarded, and failed. Furthermore, with the SQL engine enabled, both raw and processed events are stored, making them queryable and exportable for enhanced analytical capabilities.
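The standard counter set can be sketched as a small per-component structure. This is a hypothetical Python illustration of the received/processed/discarded/failed accounting, not Joule's observability API:

```python
from dataclasses import dataclass

@dataclass
class ProcessorMetrics:
    """Standard counters a processing component might expose:
    events received, processed, discarded, and failed."""
    received: int = 0
    processed: int = 0
    discarded: int = 0
    failed: int = 0

    def record(self, event, predicate, handler):
        """Route one event, updating exactly one outcome counter."""
        self.received += 1
        try:
            if predicate(event):
                handler(event)
                self.processed += 1
            else:
                self.discarded += 1
        except Exception:
            self.failed += 1

metrics = ProcessorMetrics()
for value in [5, -1, 10, "bad"]:  # "bad" raises inside the predicate -> failed
    metrics.record(value, predicate=lambda v: v > 0, handler=lambda v: None)
print(metrics)
# ProcessorMetrics(received=4, processed=2, discarded=1, failed=1)
```

Counters like these are what an external monitoring solution would scrape or poll from each component.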
Joule has been designed to be platform agnostic, offering seamless deployment options whether you choose a local, on-premise, or cloud-based environment. Joule is packaged as a Docker container for simplified deployment configurations or as a standalone binary, providing flexibility to meet diverse deployment needs.